Apr 23 13:29:08.025537 ip-10-0-137-187 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 13:29:08.025549 ip-10-0-137-187 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 13:29:08.025556 ip-10-0-137-187 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 13:29:08.025762 ip-10-0-137-187 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 13:29:18.035366 ip-10-0-137-187 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 13:29:18.035389 ip-10-0-137-187 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 70571428c68a489381b374d26d58ce48 --
Apr 23 13:31:29.141103 ip-10-0-137-187 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 13:31:29.597256 ip-10-0-137-187 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:31:29.597256 ip-10-0-137-187 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 13:31:29.597256 ip-10-0-137-187 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:31:29.597256 ip-10-0-137-187 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 13:31:29.597256 ip-10-0-137-187 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:31:29.598385 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.598235    2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 13:31:29.604915 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604892    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:29.604915 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604910    2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:29.604915 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604914    2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:29.604915 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604918    2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:29.604915 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604922    2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604925    2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604929    2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604931    2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604934    2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604938    2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604941    2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604944    2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604946    2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604949    2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604952    2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604955    2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604958    2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604961    2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604963    2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604966    2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604968    2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604971    2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604974    2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604976    2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:29.605120 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604980    2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604982    2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604985    2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604987    2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604990    2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604992    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604995    2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.604997    2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605000    2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605003    2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605005    2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605008    2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605010    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605013    2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605016    2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605019    2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605022    2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605024    2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605027    2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605029    2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:29.605590 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605032    2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605034    2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605037    2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605040    2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605043    2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605046    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605049    2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605051    2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605054    2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605057    2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605059    2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605063    2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605066    2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605069    2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605071    2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605074    2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605076    2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605079    2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605082    2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:29.606097 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605084    2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605087    2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605089    2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605092    2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605095    2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605097    2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605100    2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605103    2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605106    2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605112    2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605116    2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605118    2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605121    2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605124    2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605128    2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605133    2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605137    2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605140    2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605142    2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:29.606553 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605145    2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605147    2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605150    2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605153    2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605528    2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605532    2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605535    2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605538    2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605540    2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605543    2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605546    2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605549    2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605551    2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605554    2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605556    2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605559    2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605562    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605564    2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605567    2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605570    2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:29.607039 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605573    2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605576    2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605579    2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605581    2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605584    2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605588    2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605591    2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605595    2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605599    2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605601    2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605604    2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605606    2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605609    2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605612    2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605615    2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605619    2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605622    2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605624    2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605627    2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:29.607516 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605629    2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605632    2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605634    2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605637    2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605639    2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605642    2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605645    2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605647    2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605650    2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605652    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605655    2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605657    2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605660    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605664    2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605667    2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605669    2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605672    2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605674    2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605677    2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:29.608002 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605679    2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605682    2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605685    2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605688    2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605690    2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605693    2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605695    2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605698    2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605701    2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605703    2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605706    2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605708    2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605711    2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605713    2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605716    2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605718    2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605721    2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605724    2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605727    2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605730    2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:29.608458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605733    2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605735    2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605738    2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605741    2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605743    2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605745    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605749    2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605751    2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605754    2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605772    2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605775    2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.605777    2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605851    2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605859    2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605866    2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605871    2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605875    2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605879    2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605883    2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605888    2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605891    2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 13:31:29.608954 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605894    2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605897    2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605901    2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605904    2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605908    2565 flags.go:64] FLAG: --cgroup-root=""
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605910    2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605913    2565 flags.go:64] FLAG: --client-ca-file=""
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605916    2565 flags.go:64] FLAG: --cloud-config=""
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605919    2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605922    2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605927    2565 flags.go:64] FLAG: --cluster-domain=""
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605930    2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605933    2565 flags.go:64] FLAG: --config-dir=""
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605936    2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605939    2565 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605943    2565 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605946    2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605950    2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605953    2565 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605956    2565 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605960    2565 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605963    2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605966    2565 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605969    2565 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605973    2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 13:31:29.609471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605976    2565 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605979    2565 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605982    2565 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605986    2565 flags.go:64] FLAG: --enable-server="true"
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605989    2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605994    2565 flags.go:64] FLAG: --event-burst="100"
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.605997    2565 flags.go:64] FLAG: --event-qps="50"
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606000    2565 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606003    2565 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606005    2565 flags.go:64] FLAG: --eviction-hard=""
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606009    2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606012    2565 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606015    2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606019    2565 flags.go:64] FLAG: --eviction-soft=""
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606022    2565 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606025    2565 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606028    2565 flags.go:64]
FLAG: --experimental-allocatable-ignore-eviction="false" Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606031 2565 flags.go:64] FLAG: --experimental-mounter-path="" Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606034 2565 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606037 2565 flags.go:64] FLAG: --fail-swap-on="true" Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606040 2565 flags.go:64] FLAG: --feature-gates="" Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606044 2565 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606047 2565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606050 2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606053 2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606057 2565 flags.go:64] FLAG: --healthz-port="10248" Apr 23 13:31:29.610079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606060 2565 flags.go:64] FLAG: --help="false" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606063 2565 flags.go:64] FLAG: --hostname-override="ip-10-0-137-187.ec2.internal" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606066 2565 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606069 2565 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606072 2565 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606076 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606079 2565 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606082 2565 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606085 2565 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606087 2565 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606091 2565 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606094 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606097 2565 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606099 2565 flags.go:64] FLAG: --kube-reserved="" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606103 2565 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606106 2565 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606109 2565 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606112 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 13:31:29.610713 ip-10-0-137-187 
kubenswrapper[2565]: I0423 13:31:29.606115 2565 flags.go:64] FLAG: --lock-file="" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606117 2565 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606120 2565 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606123 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606129 2565 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 13:31:29.610713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606132 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606135 2565 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606137 2565 flags.go:64] FLAG: --logging-format="text" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606140 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606144 2565 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606146 2565 flags.go:64] FLAG: --manifest-url="" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606149 2565 flags.go:64] FLAG: --manifest-url-header="" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606154 2565 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606158 2565 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606162 2565 flags.go:64] FLAG: --max-pods="110" Apr 23 
13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606165 2565 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606168 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606171 2565 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606174 2565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606177 2565 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606180 2565 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606183 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606190 2565 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606193 2565 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606196 2565 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606200 2565 flags.go:64] FLAG: --pod-cidr="" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606203 2565 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606208 2565 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 
13:31:29.606213 2565 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 13:31:29.611311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606217 2565 flags.go:64] FLAG: --pods-per-core="0" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606220 2565 flags.go:64] FLAG: --port="10250" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606223 2565 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606226 2565 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d53398e922e4aca9" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606229 2565 flags.go:64] FLAG: --qos-reserved="" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606232 2565 flags.go:64] FLAG: --read-only-port="10255" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606235 2565 flags.go:64] FLAG: --register-node="true" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606237 2565 flags.go:64] FLAG: --register-schedulable="true" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606240 2565 flags.go:64] FLAG: --register-with-taints="" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606244 2565 flags.go:64] FLAG: --registry-burst="10" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606247 2565 flags.go:64] FLAG: --registry-qps="5" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606249 2565 flags.go:64] FLAG: --reserved-cpus="" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606252 2565 flags.go:64] FLAG: --reserved-memory="" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606256 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606259 2565 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606262 2565 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606265 2565 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606268 2565 flags.go:64] FLAG: --runonce="false" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606271 2565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606274 2565 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606277 2565 flags.go:64] FLAG: --seccomp-default="false" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606280 2565 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606283 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606286 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606289 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606292 2565 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 13:31:29.611899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606294 2565 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606297 2565 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606300 2565 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 
13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606303 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606306 2565 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606311 2565 flags.go:64] FLAG: --system-cgroups="" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606314 2565 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606319 2565 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606322 2565 flags.go:64] FLAG: --tls-cert-file="" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606324 2565 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606328 2565 flags.go:64] FLAG: --tls-min-version="" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606331 2565 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606334 2565 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606337 2565 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606340 2565 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606343 2565 flags.go:64] FLAG: --v="2" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606347 2565 flags.go:64] FLAG: --version="false" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606351 2565 flags.go:64] FLAG: --vmodule="" 
Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606355 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.606359 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606447 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606452 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606457 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606460 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:29.612615 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606463 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606466 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606468 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606471 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606474 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606477 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606479 2565 
feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606482 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606485 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606487 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606490 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606493 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606496 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606499 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606502 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606504 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606507 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606510 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606512 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks 
Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606515 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:29.613204 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606518 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606520 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606523 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606525 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606528 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606531 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606534 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606536 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606539 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606542 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606545 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606548 2565 feature_gate.go:328] 
unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606551 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606553 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606556 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606559 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606562 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606566 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606570 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 13:31:29.613712 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606575 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606586 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606590 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606593 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606596 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606598 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606602 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606605 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606608 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606610 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606613 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606615 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606618 2565 feature_gate.go:328] unrecognized feature 
gate: AdditionalRoutingCapabilities Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606620 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606623 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606626 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606628 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606631 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606633 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606636 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:29.614202 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606638 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606641 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606643 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606647 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606650 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 
13:31:29.606652 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606654 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606657 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606660 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606663 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606665 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606668 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606670 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606673 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606676 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606679 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606681 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606684 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606691 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606694 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:29.614691 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606697 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606700 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.606703 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.607513 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.614107 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.614214 2565 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614262 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614267 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614271 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614274 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614278 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614281 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614284 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614287 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614291 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614294 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:29.615209 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614297 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614300 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614303 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614305 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614308 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614310 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614313 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614316 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614318 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614321 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614323 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614326 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614329 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614331 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614333 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614337 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614339 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614342 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614344 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614347 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:29.615618 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614349 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614353 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614355 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614358 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614361 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614363 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614366 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614368 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614371 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614373 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614376 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614378 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614381 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614384 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614386 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614389 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614391 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614394 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614396 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614399 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:29.616144 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614401 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614404 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614407 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614409 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614412 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614414 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614417 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614421 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614427 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614430 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614433 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614436 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614439 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614444 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614447 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614450 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614452 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614455 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614458 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:31:29.616632 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614460 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614463 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614466 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614468 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614471 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614474 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614477 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614479 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614482 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614484 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614488 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614492 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614495 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614497 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614500 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614502 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:29.617109 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614505 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.614510 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614613 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614619 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614622 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614625 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614629 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614632 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614635 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614637 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614641 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614646 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614649 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614652 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614655 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:29.617503 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614658 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614661 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614664 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614667 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614670 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614673 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614676 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614678 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614681 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614683 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614686 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614689 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614691 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614694 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614697 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614700 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614704 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614707 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614710 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:29.617897 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614712 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614715 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614718 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614720 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614723 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614730 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614733 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614735 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614738 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614741 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614744 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614746 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614749 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614751 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614754 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614774 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614777 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614780 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614782 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614785 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:29.618362 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614788 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614790 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614793 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614796 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614798 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614801 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614803 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614806 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614808 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614811 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614813 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614816 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614819 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614821 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614824 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614826 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614829 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614831 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614834 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614837 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:29.618858 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614839 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614842 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614845 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614847 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614850 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614852 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614855 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614857 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614860 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614862 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614865 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614867 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614870 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:29.614872 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.614877 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 13:31:29.619337 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.614986 2565 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 13:31:29.619698 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.616962 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 13:31:29.619698 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.617932 2565 server.go:1019] "Starting client certificate rotation"
Apr 23 13:31:29.619698 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.618030 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 13:31:29.619698 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.618063 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 13:31:29.644645 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.644625 2565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 13:31:29.649265 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.649248 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 13:31:29.661885 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.661861 2565 log.go:25] "Validated CRI v1 runtime API"
Apr 23 13:31:29.668033 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.668015 2565 log.go:25] "Validated CRI v1 image API"
Apr 23 13:31:29.671562 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.671544 2565 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 13:31:29.675394 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.675373 2565 fs.go:135] Filesystem UUIDs: map[0a172e6e-b95b-4435-aeeb-876a5f9dd6d8:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9f33ae08-d31b-40ad-b755-9c2b8530c762:/dev/nvme0n1p4]
Apr 23 13:31:29.675466 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.675393 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 13:31:29.678131 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.678112 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 13:31:29.681453 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.681336 2565 manager.go:217] Machine: {Timestamp:2026-04-23 13:31:29.679322934 +0000 UTC m=+0.417866890 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098905 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec268a3bee21db59355c76163d012fd8 SystemUUID:ec268a3b-ee21-db59-355c-76163d012fd8 BootID:70571428-c68a-4893-81b3-74d26d58ce48 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7e:1c:e4:63:db Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7e:1c:e4:63:db Speed:0 Mtu:9001} {Name:ovs-system MacAddress:22:6e:4d:53:7a:59 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 13:31:29.681453 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.681447 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 13:31:29.681583 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.681571 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 13:31:29.682675 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.682649 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 13:31:29.682839 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.682677 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-187.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 13:31:29.682889 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.682848 2565 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 13:31:29.682889 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.682857 2565 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 13:31:29.682889 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.682870 2565 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 13:31:29.683715 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.683705 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 13:31:29.685196 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.685187 2565 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:31:29.685312 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.685303 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 13:31:29.688100 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.688091 2565 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 13:31:29.688139 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.688103 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 13:31:29.688139 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.688114 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 13:31:29.688139 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.688124 2565 kubelet.go:397] "Adding apiserver pod source"
Apr 23 13:31:29.688139 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.688133 2565 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 13:31:29.689303 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.689291 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 13:31:29.689340 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.689310 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 13:31:29.692228 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.692212 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 13:31:29.694173 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.694161 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 13:31:29.695543 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.695530 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 13:31:29.695578 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.695550 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 13:31:29.695578 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.695561 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 13:31:29.695578 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.695568 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 13:31:29.695660 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.695583 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 13:31:29.695660 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.695589 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 13:31:29.695660 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.695595 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 13:31:29.695660 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.695601 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 13:31:29.695660 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.695611 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 13:31:29.695660 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.695617 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 13:31:29.695660 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.695630 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 13:31:29.695660 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.695638 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 13:31:29.696571 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.696558 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 13:31:29.696602 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.696573 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 13:31:29.700395 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.700381 2565 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 13:31:29.700454 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.700423 2565 server.go:1295] "Started kubelet"
Apr 23 13:31:29.700562 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.700522 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 13:31:29.700598 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.700585 2565 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 13:31:29.701284 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.701246 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 13:31:29.701298 ip-10-0-137-187 systemd[1]: Started Kubernetes Kubelet.
Apr 23 13:31:29.702217 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.702094 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 13:31:29.703211 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.703190 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-187.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 13:31:29.703309 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.703239 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 13:31:29.703309 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.703250 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-187.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 13:31:29.708010 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.707992 2565 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 13:31:29.711266 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.711249 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 13:31:29.711433 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.711253 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 13:31:29.711815 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.709732 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-187.ec2.internal.18a8ff99f4d5d4bd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-187.ec2.internal,UID:ip-10-0-137-187.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-187.ec2.internal,},FirstTimestamp:2026-04-23 13:31:29.700394173 +0000 UTC m=+0.438938131,LastTimestamp:2026-04-23 13:31:29.700394173 +0000 UTC m=+0.438938131,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-187.ec2.internal,}"
Apr 23 13:31:29.713166 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.712697 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 13:31:29.713166 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.712950 2565 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 13:31:29.713166 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.712969 2565 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 13:31:29.713166 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.713079 2565 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 13:31:29.713166 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.713087 2565 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 13:31:29.713166 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.713151 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-187.ec2.internal\" not found"
Apr 23 13:31:29.713494 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.713478 2565 factory.go:55] Registering systemd factory
Apr 23 13:31:29.713540 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.713509 2565 factory.go:223] Registration of the systemd container factory successfully
Apr 23 13:31:29.714002 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.713865 2565 factory.go:153] Registering CRI-O factory
Apr 23 13:31:29.714002 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.713929 2565 factory.go:223] Registration of the crio container factory successfully
Apr 23 13:31:29.714002 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.713985 2565 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 13:31:29.714185 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.714014 2565 factory.go:103] Registering Raw factory
Apr 23 13:31:29.714185 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.714030 2565 manager.go:1196] Started watching for new ooms in manager
Apr 23 13:31:29.714773 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.714737 2565 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 13:31:29.714847 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.714811 2565 manager.go:319] Starting recovery of all containers
Apr 23 13:31:29.721119 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.721091 2565 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-187.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 13:31:29.721119 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.721102 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 13:31:29.723958 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.723838 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lfqsp"
Apr 23 13:31:29.725129 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.725116 2565 manager.go:324] Recovery completed
Apr 23 13:31:29.729524 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.729504 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:29.731855 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.731837 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lfqsp"
Apr 23 13:31:29.732203 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.732189 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:29.732277 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.732220 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:29.732277 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.732236 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:29.732825 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.732803 2565 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 13:31:29.732825 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.732824 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 13:31:29.732975 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.732843 2565 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:31:29.735428 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.735415 2565 policy_none.go:49] "None policy: Start"
Apr 23 13:31:29.735470 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.735432 2565 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 13:31:29.735470 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.735442 2565 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 13:31:29.768343 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.768322 2565 manager.go:341] "Starting Device Plugin manager"
Apr 23 13:31:29.784328 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.768360 2565 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 13:31:29.784328 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.768374 2565 server.go:85] "Starting device plugin registration server"
Apr 23 13:31:29.784328 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.768648 2565 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 13:31:29.784328 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.768673 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 13:31:29.784328 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.768746 2565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 13:31:29.784328 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.768837 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 13:31:29.784328 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.768843 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 13:31:29.784328 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.769501 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 13:31:29.784328 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.769528 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-187.ec2.internal\" not found"
Apr 23 13:31:29.833888 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.833850 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 13:31:29.835034 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.835017 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 13:31:29.835131 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.835040 2565 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 13:31:29.835131 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.835056 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 13:31:29.835131 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.835063 2565 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 13:31:29.835269 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.835133 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 13:31:29.837885 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.837865 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:29.869872 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.869822 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:29.870893 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.870875 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:29.870984 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.870905 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:29.870984 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.870917 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:29.870984 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.870947 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-187.ec2.internal"
Apr 23 13:31:29.881435 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.881416 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-187.ec2.internal"
Apr 23 13:31:29.881488 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.881439 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-187.ec2.internal\": node \"ip-10-0-137-187.ec2.internal\" not found"
Apr 23 13:31:29.899435 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.899415 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-187.ec2.internal\" not found"
Apr 23 13:31:29.935272 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.935216 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-187.ec2.internal"]
Apr 23 13:31:29.935416 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.935342 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:29.936410 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.936387 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:29.936511 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.936419 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:29.936511 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.936429 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:29.937870 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.937857 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:29.938007 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.937991 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal"
Apr 23 13:31:29.938048 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.938022 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:29.939287 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.939268 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:29.939380 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.939276 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:29.939380 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.939300 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:29.939380 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.939314 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:29.939380 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.939318 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:29.939380 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.939330 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:29.940462 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.940446 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-187.ec2.internal"
Apr 23 13:31:29.940549 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.940473 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:29.941380 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.941362 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:29.941458 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.941395 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:29.941458 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:29.941410 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:29.969325 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.969306 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-187.ec2.internal\" not found" node="ip-10-0-137-187.ec2.internal"
Apr 23 13:31:29.973065 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.973049 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-187.ec2.internal\" not found" node="ip-10-0-137-187.ec2.internal"
Apr 23 13:31:29.999859 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:29.999836 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-187.ec2.internal\" not found"
Apr 23 13:31:30.015330 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.015309 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/08bb44b3b8944f3166abf4dcee6b9b11-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal\" (UID: \"08bb44b3b8944f3166abf4dcee6b9b11\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal"
Apr 23 13:31:30.015401 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.015335 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/08bb44b3b8944f3166abf4dcee6b9b11-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal\" (UID: \"08bb44b3b8944f3166abf4dcee6b9b11\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal"
Apr 23 13:31:30.015401 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.015352 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/43d3b8ea7119a14ffb4ca124c24a14eb-config\") pod \"kube-apiserver-proxy-ip-10-0-137-187.ec2.internal\" (UID: \"43d3b8ea7119a14ffb4ca124c24a14eb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-187.ec2.internal"
Apr 23 13:31:30.100550 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:30.100517 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-187.ec2.internal\" not found"
Apr 23 13:31:30.115971 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.115937 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/08bb44b3b8944f3166abf4dcee6b9b11-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal\" (UID: \"08bb44b3b8944f3166abf4dcee6b9b11\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal"
Apr 23 13:31:30.115971 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.115975 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/08bb44b3b8944f3166abf4dcee6b9b11-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal\" (UID: \"08bb44b3b8944f3166abf4dcee6b9b11\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal"
Apr 23 13:31:30.116090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.115993 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/43d3b8ea7119a14ffb4ca124c24a14eb-config\") pod \"kube-apiserver-proxy-ip-10-0-137-187.ec2.internal\" (UID: \"43d3b8ea7119a14ffb4ca124c24a14eb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-187.ec2.internal"
Apr 23 13:31:30.116090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.116034 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/43d3b8ea7119a14ffb4ca124c24a14eb-config\") pod \"kube-apiserver-proxy-ip-10-0-137-187.ec2.internal\" (UID: \"43d3b8ea7119a14ffb4ca124c24a14eb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-187.ec2.internal"
Apr 23 13:31:30.116090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.116043 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/08bb44b3b8944f3166abf4dcee6b9b11-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal\" (UID: \"08bb44b3b8944f3166abf4dcee6b9b11\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal"
Apr 23 13:31:30.116090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.116035 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/08bb44b3b8944f3166abf4dcee6b9b11-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal\" (UID: \"08bb44b3b8944f3166abf4dcee6b9b11\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal"
Apr 23 13:31:30.201437 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:30.201373 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-187.ec2.internal\" not found"
Apr 23 13:31:30.270962 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.270926 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal"
Apr 23 13:31:30.274804 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.274788 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-187.ec2.internal"
Apr 23 13:31:30.302342 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:30.302307 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-187.ec2.internal\" not found"
Apr 23 13:31:30.402887 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:30.402851 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-187.ec2.internal\" not found"
Apr 23 13:31:30.503466 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:30.503385 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-187.ec2.internal\" not found"
Apr 23 13:31:30.603925 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:30.603892 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-187.ec2.internal\" not found"
Apr 23 13:31:30.618282 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.618257 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 13:31:30.618423 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.618408 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 13:31:30.654162 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.654137 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:30.679489 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.679465 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:30.689054 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.689032 2565 apiserver.go:52] "Watching apiserver"
Apr 23 13:31:30.696301 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.696282 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 13:31:30.697468 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.697448 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-mvwgw","openshift-multus/multus-b6gjz","openshift-network-operator/iptables-alerter-97wbq","openshift-cluster-node-tuning-operator/tuned-b789s","openshift-image-registry/node-ca-rjv7k","openshift-multus/network-metrics-daemon-gdstf","openshift-network-diagnostics/network-check-target-hclwj","openshift-ovn-kubernetes/ovnkube-node-vxhp2","kube-system/konnectivity-agent-6fv8j","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn"]
Apr 23 13:31:30.699843 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.699826 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-97wbq"
Apr 23 13:31:30.701237 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.701217 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rjv7k"
Apr 23 13:31:30.702292 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.702271 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 13:31:30.702414 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.702359 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xc524\""
Apr 23 13:31:30.702478 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.702460 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:31:30.702531 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.702512 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 13:31:30.702594 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.702517 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:31:30.702644 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.702612 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:31:30.702688 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:30.702607 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d" Apr 23 13:31:30.702688 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:30.702678 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3" Apr 23 13:31:30.703545 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.703527 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 13:31:30.703735 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.703716 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 13:31:30.703858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.703780 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5h8nj\"" Apr 23 13:31:30.703858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.703812 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.704584 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.704565 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 13:31:30.705023 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.705005 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.706289 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.706267 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-pzgzn\"" Apr 23 13:31:30.706459 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.706438 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 13:31:30.706640 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.706551 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 13:31:30.706718 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.706686 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 13:31:30.706785 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.706686 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 13:31:30.706902 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.706884 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 13:31:30.707811 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.707300 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 13:31:30.707811 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.707715 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9whf8\"" Apr 23 13:31:30.708535 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.708519 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.709728 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.709712 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.710863 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.710847 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6fv8j" Apr 23 13:31:30.710982 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.710959 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 13:31:30.711066 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.711012 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-prq5s\"" Apr 23 13:31:30.711066 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.711030 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:31:30.711984 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.711969 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 13:31:30.712044 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.711985 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal" Apr 23 13:31:30.712184 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.712170 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.712411 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.712395 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 13:31:30.712521 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.712505 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 13:31:30.712617 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.712564 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 13:31:30.712675 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.712634 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 13:31:30.712675 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.712654 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-5h8nb\"" Apr 23 13:31:30.712792 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.712683 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 13:31:30.713079 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.713061 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 13:31:30.713238 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.713222 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-2frzk\"" Apr 23 13:31:30.713297 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.713268 2565 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 13:31:30.713391 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.713375 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 13:31:30.713754 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.713737 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 13:31:30.714511 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.714496 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 13:31:30.715029 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.715014 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 13:31:30.715122 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.715099 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-d7plh\"" Apr 23 13:31:30.715283 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.715270 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 13:31:30.719584 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719565 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-run-netns\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.719655 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719589 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-run-multus-certs\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.719655 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719607 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-sysctl-d\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.719655 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719623 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tt86\" (UniqueName: \"kubernetes.io/projected/a47ff253-1704-447a-b1cd-4a1b12019c92-kube-api-access-9tt86\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.719655 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719645 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-run\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.719798 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719666 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-var-lib-openvswitch\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.719798 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719693 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-cni-netd\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.719798 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719727 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/676c8632-4468-4e42-b6fb-2a866baddda7-iptables-alerter-script\") pod \"iptables-alerter-97wbq\" (UID: \"676c8632-4468-4e42-b6fb-2a866baddda7\") " pod="openshift-network-operator/iptables-alerter-97wbq" Apr 23 13:31:30.719798 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719751 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-os-release\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.719910 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719802 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/967ed5b3-0337-40d9-872d-aa7a02b7c552-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.719910 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719829 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-cni-bin\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.719910 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719857 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a47ff253-1704-447a-b1cd-4a1b12019c92-ovn-node-metrics-cert\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.719910 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719879 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-socket-dir\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.719910 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719897 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwhvb\" (UniqueName: \"kubernetes.io/projected/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-kube-api-access-hwhvb\") pod \"network-metrics-daemon-gdstf\" (UID: \"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d\") " pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:31:30.720103 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719913 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-kubernetes\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.720103 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719928 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/d604becf-afb4-4b3f-aaec-3618178f4dfe-tmp\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.720103 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719949 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z2gz\" (UniqueName: \"kubernetes.io/projected/d604becf-afb4-4b3f-aaec-3618178f4dfe-kube-api-access-6z2gz\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.720103 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719972 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zcj5\" (UniqueName: \"kubernetes.io/projected/3a6f5afc-ae97-4be4-ad1c-c3af1a35a586-kube-api-access-4zcj5\") pod \"node-ca-rjv7k\" (UID: \"3a6f5afc-ae97-4be4-ad1c-c3af1a35a586\") " pod="openshift-image-registry/node-ca-rjv7k" Apr 23 13:31:30.720103 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.719988 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/967ed5b3-0337-40d9-872d-aa7a02b7c552-system-cni-dir\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.720103 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720002 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-var-lib-cni-multus\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.720103 ip-10-0-137-187 kubenswrapper[2565]: I0423 
13:31:30.720016 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-etc-openvswitch\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.720103 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720029 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-host\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.720103 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720051 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7wgg\" (UniqueName: \"kubernetes.io/projected/967ed5b3-0337-40d9-872d-aa7a02b7c552-kube-api-access-s7wgg\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.720103 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720066 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a47ff253-1704-447a-b1cd-4a1b12019c92-env-overrides\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.720103 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720079 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-run-openvswitch\") pod \"ovnkube-node-vxhp2\" (UID: 
\"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.720103 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720092 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-lib-modules\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.720103 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720107 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/967ed5b3-0337-40d9-872d-aa7a02b7c552-os-release\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720121 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-sysctl-conf\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720151 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-etc-selinux\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720186 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-cnibin\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720209 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-systemd\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720225 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/967ed5b3-0337-40d9-872d-aa7a02b7c552-cnibin\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720239 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-run-ovn\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720254 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a47ff253-1704-447a-b1cd-4a1b12019c92-ovnkube-config\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720283 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x8g8\" (UniqueName: \"kubernetes.io/projected/676c8632-4468-4e42-b6fb-2a866baddda7-kube-api-access-2x8g8\") pod \"iptables-alerter-97wbq\" (UID: \"676c8632-4468-4e42-b6fb-2a866baddda7\") " pod="openshift-network-operator/iptables-alerter-97wbq" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720302 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1c553f28-0c89-4983-b30b-c0bdd06b63e6-multus-daemon-config\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720315 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3a6f5afc-ae97-4be4-ad1c-c3af1a35a586-serviceca\") pod \"node-ca-rjv7k\" (UID: \"3a6f5afc-ae97-4be4-ad1c-c3af1a35a586\") " pod="openshift-image-registry/node-ca-rjv7k" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720330 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a47ff253-1704-447a-b1cd-4a1b12019c92-ovnkube-script-lib\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720359 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-registration-dir\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720395 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs\") pod \"network-metrics-daemon-gdstf\" (UID: \"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d\") " pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720429 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-var-lib-kubelet\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720454 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-hostroot\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720478 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-multus-conf-dir\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.720506 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720500 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-systemd-units\") pod 
\"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720526 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-slash\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720549 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-log-socket\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720587 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxqlf\" (UniqueName: \"kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf\") pod \"network-check-target-hclwj\" (UID: \"d6413ec2-e315-417e-9b7d-ce057e4f10a3\") " pod="openshift-network-diagnostics/network-check-target-hclwj" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720622 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-var-lib-cni-bin\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720646 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bf4eb16e-4919-47aa-9bb2-0f615778f26d-konnectivity-ca\") pod \"konnectivity-agent-6fv8j\" (UID: \"bf4eb16e-4919-47aa-9bb2-0f615778f26d\") " pod="kube-system/konnectivity-agent-6fv8j" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720668 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/967ed5b3-0337-40d9-872d-aa7a02b7c552-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720700 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720724 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-device-dir\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720774 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/676c8632-4468-4e42-b6fb-2a866baddda7-host-slash\") pod \"iptables-alerter-97wbq\" (UID: \"676c8632-4468-4e42-b6fb-2a866baddda7\") " 
pod="openshift-network-operator/iptables-alerter-97wbq" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720802 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-system-cni-dir\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720839 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-run-k8s-cni-cncf-io\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720862 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-etc-kubernetes\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720888 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s8gn\" (UniqueName: \"kubernetes.io/projected/1c553f28-0c89-4983-b30b-c0bdd06b63e6-kube-api-access-6s8gn\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720912 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-sys\") pod \"tuned-b789s\" (UID: 
\"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720934 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-var-lib-kubelet\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.720999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720956 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-tuned\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.720980 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-run-systemd\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721008 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-node-log\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721031 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721057 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-sys-fs\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721082 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hglfs\" (UniqueName: \"kubernetes.io/projected/43f57458-7ecc-4c9f-8890-521f1a9776af-kube-api-access-hglfs\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721134 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-modprobe-d\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721165 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-sysconfig\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.721555 
ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721190 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bf4eb16e-4919-47aa-9bb2-0f615778f26d-agent-certs\") pod \"konnectivity-agent-6fv8j\" (UID: \"bf4eb16e-4919-47aa-9bb2-0f615778f26d\") " pod="kube-system/konnectivity-agent-6fv8j" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721230 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-kubelet\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721278 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-run-netns\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721299 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1c553f28-0c89-4983-b30b-c0bdd06b63e6-cni-binary-copy\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721314 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a6f5afc-ae97-4be4-ad1c-c3af1a35a586-host\") pod \"node-ca-rjv7k\" (UID: \"3a6f5afc-ae97-4be4-ad1c-c3af1a35a586\") " 
pod="openshift-image-registry/node-ca-rjv7k" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721332 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/967ed5b3-0337-40d9-872d-aa7a02b7c552-cni-binary-copy\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721348 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/967ed5b3-0337-40d9-872d-aa7a02b7c552-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721363 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-run-ovn-kubernetes\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.721555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721376 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-multus-cni-dir\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.722174 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.721390 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-multus-socket-dir-parent\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.728453 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.728432 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 13:31:30.728630 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.728616 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 13:31:30.728708 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.728695 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-187.ec2.internal" Apr 23 13:31:30.728804 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.728787 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal"] Apr 23 13:31:30.733394 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.733367 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 13:26:29 +0000 UTC" deadline="2027-10-11 12:28:11.056527167 +0000 UTC" Apr 23 13:31:30.733394 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.733394 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12862h56m40.323136301s" Apr 23 13:31:30.734124 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:30.734097 2565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43d3b8ea7119a14ffb4ca124c24a14eb.slice/crio-5806c57b8487b46a703098dbcc7fc0154ba0268a1392905fd1e8b79573fe5fe9 WatchSource:0}: Error finding container 5806c57b8487b46a703098dbcc7fc0154ba0268a1392905fd1e8b79573fe5fe9: Status 404 returned error can't find the container with id 5806c57b8487b46a703098dbcc7fc0154ba0268a1392905fd1e8b79573fe5fe9 Apr 23 13:31:30.734458 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:30.734434 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08bb44b3b8944f3166abf4dcee6b9b11.slice/crio-80961102b11c56eaeb36bcfaa6f43c7125716e069651c745efccc32ac80d7f33 WatchSource:0}: Error finding container 80961102b11c56eaeb36bcfaa6f43c7125716e069651c745efccc32ac80d7f33: Status 404 returned error can't find the container with id 80961102b11c56eaeb36bcfaa6f43c7125716e069651c745efccc32ac80d7f33 Apr 23 13:31:30.735250 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.735227 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-137-187.ec2.internal"] Apr 23 13:31:30.735334 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.735284 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 13:31:30.738601 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.738586 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:31:30.746176 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.746155 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jncrc" Apr 23 13:31:30.754436 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.754376 2565 csr.go:270] "Certificate signing request is issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-jncrc" Apr 23 13:31:30.821895 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.821862 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-run-ovn\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.821895 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.821891 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a47ff253-1704-447a-b1cd-4a1b12019c92-ovnkube-config\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.822129 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.821910 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2x8g8\" (UniqueName: \"kubernetes.io/projected/676c8632-4468-4e42-b6fb-2a866baddda7-kube-api-access-2x8g8\") pod \"iptables-alerter-97wbq\" (UID: \"676c8632-4468-4e42-b6fb-2a866baddda7\") " pod="openshift-network-operator/iptables-alerter-97wbq" Apr 23 13:31:30.822129 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.821954 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-run-ovn\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.822129 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.821991 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1c553f28-0c89-4983-b30b-c0bdd06b63e6-multus-daemon-config\") pod \"multus-b6gjz\" (UID: 
\"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.822129 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822014 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3a6f5afc-ae97-4be4-ad1c-c3af1a35a586-serviceca\") pod \"node-ca-rjv7k\" (UID: \"3a6f5afc-ae97-4be4-ad1c-c3af1a35a586\") " pod="openshift-image-registry/node-ca-rjv7k" Apr 23 13:31:30.822129 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822030 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a47ff253-1704-447a-b1cd-4a1b12019c92-ovnkube-script-lib\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.822129 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822045 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-registration-dir\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.822129 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822061 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs\") pod \"network-metrics-daemon-gdstf\" (UID: \"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d\") " pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:31:30.822129 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822090 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-var-lib-kubelet\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.822129 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822114 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-hostroot\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822138 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-multus-conf-dir\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822169 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-hostroot\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822160 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-systemd-units\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822212 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-slash\") pod 
\"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822221 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-systemd-units\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822240 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-log-socket\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822263 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-registration-dir\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822280 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqlf\" (UniqueName: \"kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf\") pod \"network-check-target-hclwj\" (UID: \"d6413ec2-e315-417e-9b7d-ce057e4f10a3\") " pod="openshift-network-diagnostics/network-check-target-hclwj" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822292 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-slash\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822322 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-log-socket\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822323 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-var-lib-cni-bin\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822249 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-multus-conf-dir\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822357 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bf4eb16e-4919-47aa-9bb2-0f615778f26d-konnectivity-ca\") pod \"konnectivity-agent-6fv8j\" (UID: \"bf4eb16e-4919-47aa-9bb2-0f615778f26d\") " pod="kube-system/konnectivity-agent-6fv8j" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822404 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/967ed5b3-0337-40d9-872d-aa7a02b7c552-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:30.822421 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822432 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-var-lib-kubelet\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.822529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822357 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-var-lib-cni-bin\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:30.822536 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs podName:6d6b50d4-32de-4031-b4e3-a88d3ce08d4d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:31.32251604 +0000 UTC m=+2.061059986 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs") pod "network-metrics-daemon-gdstf" (UID: "6d6b50d4-32de-4031-b4e3-a88d3ce08d4d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822561 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822581 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3a6f5afc-ae97-4be4-ad1c-c3af1a35a586-serviceca\") pod \"node-ca-rjv7k\" (UID: \"3a6f5afc-ae97-4be4-ad1c-c3af1a35a586\") " pod="openshift-image-registry/node-ca-rjv7k" Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822614 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-device-dir\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822651 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 
23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822678 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-device-dir\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822682 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/676c8632-4468-4e42-b6fb-2a866baddda7-host-slash\") pod \"iptables-alerter-97wbq\" (UID: \"676c8632-4468-4e42-b6fb-2a866baddda7\") " pod="openshift-network-operator/iptables-alerter-97wbq" Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822713 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-system-cni-dir\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822738 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-run-k8s-cni-cncf-io\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822793 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/676c8632-4468-4e42-b6fb-2a866baddda7-host-slash\") pod \"iptables-alerter-97wbq\" (UID: \"676c8632-4468-4e42-b6fb-2a866baddda7\") " pod="openshift-network-operator/iptables-alerter-97wbq" 
Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822801 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-run-k8s-cni-cncf-io\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822827 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-system-cni-dir\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822827 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1c553f28-0c89-4983-b30b-c0bdd06b63e6-multus-daemon-config\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822850 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-etc-kubernetes\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822878 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6s8gn\" (UniqueName: \"kubernetes.io/projected/1c553f28-0c89-4983-b30b-c0bdd06b63e6-kube-api-access-6s8gn\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: 
I0423 13:31:30.822937 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-etc-kubernetes\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.823323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822975 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-sys\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.822991 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bf4eb16e-4919-47aa-9bb2-0f615778f26d-konnectivity-ca\") pod \"konnectivity-agent-6fv8j\" (UID: \"bf4eb16e-4919-47aa-9bb2-0f615778f26d\") " pod="kube-system/konnectivity-agent-6fv8j" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823003 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-var-lib-kubelet\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823028 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-tuned\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823050 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-run-systemd\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823053 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-sys\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823095 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-run-systemd\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823093 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/967ed5b3-0337-40d9-872d-aa7a02b7c552-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823104 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-var-lib-kubelet\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823128 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-node-log\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823157 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823163 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a47ff253-1704-447a-b1cd-4a1b12019c92-ovnkube-script-lib\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823183 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-sys-fs\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823206 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-node-log\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 
13:31:30.823208 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hglfs\" (UniqueName: \"kubernetes.io/projected/43f57458-7ecc-4c9f-8890-521f1a9776af-kube-api-access-hglfs\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823236 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-modprobe-d\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823226 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.824055 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823256 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-sys-fs\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823262 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-sysconfig\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " 
pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823303 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-sysconfig\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823360 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bf4eb16e-4919-47aa-9bb2-0f615778f26d-agent-certs\") pod \"konnectivity-agent-6fv8j\" (UID: \"bf4eb16e-4919-47aa-9bb2-0f615778f26d\") " pod="kube-system/konnectivity-agent-6fv8j" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823365 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-modprobe-d\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823356 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823385 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-kubelet\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823402 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a47ff253-1704-447a-b1cd-4a1b12019c92-ovnkube-config\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823430 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-run-netns\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823455 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1c553f28-0c89-4983-b30b-c0bdd06b63e6-cni-binary-copy\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823456 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-kubelet\") pod \"ovnkube-node-vxhp2\" 
(UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823476 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-run-netns\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823478 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a6f5afc-ae97-4be4-ad1c-c3af1a35a586-host\") pod \"node-ca-rjv7k\" (UID: \"3a6f5afc-ae97-4be4-ad1c-c3af1a35a586\") " pod="openshift-image-registry/node-ca-rjv7k" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823520 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a6f5afc-ae97-4be4-ad1c-c3af1a35a586-host\") pod \"node-ca-rjv7k\" (UID: \"3a6f5afc-ae97-4be4-ad1c-c3af1a35a586\") " pod="openshift-image-registry/node-ca-rjv7k" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823525 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/967ed5b3-0337-40d9-872d-aa7a02b7c552-cni-binary-copy\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823554 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/967ed5b3-0337-40d9-872d-aa7a02b7c552-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823582 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-run-ovn-kubernetes\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823607 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-multus-cni-dir\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.824858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823632 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-multus-socket-dir-parent\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823636 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-run-ovn-kubernetes\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823670 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-run-netns\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823699 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-multus-cni-dir\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823717 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-run-multus-certs\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823741 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-sysctl-d\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823742 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-multus-socket-dir-parent\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823782 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tt86\" (UniqueName: 
\"kubernetes.io/projected/a47ff253-1704-447a-b1cd-4a1b12019c92-kube-api-access-9tt86\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823807 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-run\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823810 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-run-multus-certs\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823837 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-run-netns\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823855 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-var-lib-openvswitch\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823881 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-cni-netd\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823907 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/676c8632-4468-4e42-b6fb-2a866baddda7-iptables-alerter-script\") pod \"iptables-alerter-97wbq\" (UID: \"676c8632-4468-4e42-b6fb-2a866baddda7\") " pod="openshift-network-operator/iptables-alerter-97wbq" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823945 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1c553f28-0c89-4983-b30b-c0bdd06b63e6-cni-binary-copy\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.823934 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-os-release\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824000 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/967ed5b3-0337-40d9-872d-aa7a02b7c552-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824033 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-cni-netd\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.825501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824038 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-os-release\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824038 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-cni-bin\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824074 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-host-cni-bin\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824072 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/967ed5b3-0337-40d9-872d-aa7a02b7c552-cni-binary-copy\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824086 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/a47ff253-1704-447a-b1cd-4a1b12019c92-ovn-node-metrics-cert\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824114 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-var-lib-openvswitch\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824117 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/967ed5b3-0337-40d9-872d-aa7a02b7c552-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824136 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-sysctl-d\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824160 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-socket-dir\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824351 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwhvb\" (UniqueName: \"kubernetes.io/projected/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-kube-api-access-hwhvb\") pod \"network-metrics-daemon-gdstf\" (UID: \"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d\") " pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824369 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-kubernetes\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824388 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d604becf-afb4-4b3f-aaec-3618178f4dfe-tmp\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824202 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/967ed5b3-0337-40d9-872d-aa7a02b7c552-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824203 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-run\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 
13:31:30.824412 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6z2gz\" (UniqueName: \"kubernetes.io/projected/d604becf-afb4-4b3f-aaec-3618178f4dfe-kube-api-access-6z2gz\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824456 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zcj5\" (UniqueName: \"kubernetes.io/projected/3a6f5afc-ae97-4be4-ad1c-c3af1a35a586-kube-api-access-4zcj5\") pod \"node-ca-rjv7k\" (UID: \"3a6f5afc-ae97-4be4-ad1c-c3af1a35a586\") " pod="openshift-image-registry/node-ca-rjv7k" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824486 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/967ed5b3-0337-40d9-872d-aa7a02b7c552-system-cni-dir\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.826001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824516 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-var-lib-cni-multus\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824544 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-etc-openvswitch\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 
13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824567 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-host\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824568 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/676c8632-4468-4e42-b6fb-2a866baddda7-iptables-alerter-script\") pod \"iptables-alerter-97wbq\" (UID: \"676c8632-4468-4e42-b6fb-2a866baddda7\") " pod="openshift-network-operator/iptables-alerter-97wbq" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824232 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-socket-dir\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824593 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7wgg\" (UniqueName: \"kubernetes.io/projected/967ed5b3-0337-40d9-872d-aa7a02b7c552-kube-api-access-s7wgg\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824619 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a47ff253-1704-447a-b1cd-4a1b12019c92-env-overrides\") pod \"ovnkube-node-vxhp2\" (UID: 
\"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824630 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-host-var-lib-cni-multus\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824639 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/967ed5b3-0337-40d9-872d-aa7a02b7c552-system-cni-dir\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824643 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-run-openvswitch\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824687 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-lib-modules\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824693 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-kubernetes\") pod \"tuned-b789s\" 
(UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824656 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-host\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824725 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/967ed5b3-0337-40d9-872d-aa7a02b7c552-os-release\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824784 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-sysctl-conf\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824790 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-etc-openvswitch\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824797 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-lib-modules\") pod \"tuned-b789s\" (UID: 
\"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.826509 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824825 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-etc-selinux\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824850 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/967ed5b3-0337-40d9-872d-aa7a02b7c552-os-release\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824853 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-cnibin\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824884 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-systemd\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824906 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1c553f28-0c89-4983-b30b-c0bdd06b63e6-cnibin\") pod \"multus-b6gjz\" (UID: 
\"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824911 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/967ed5b3-0337-40d9-872d-aa7a02b7c552-cnibin\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824972 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/967ed5b3-0337-40d9-872d-aa7a02b7c552-cnibin\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.824960 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a47ff253-1704-447a-b1cd-4a1b12019c92-run-openvswitch\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.825000 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a47ff253-1704-447a-b1cd-4a1b12019c92-env-overrides\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.825028 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/43f57458-7ecc-4c9f-8890-521f1a9776af-etc-selinux\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: 
\"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.825047 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-systemd\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.825065 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-sysctl-conf\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.826532 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d604becf-afb4-4b3f-aaec-3618178f4dfe-etc-tuned\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.826725 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a47ff253-1704-447a-b1cd-4a1b12019c92-ovn-node-metrics-cert\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.826856 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bf4eb16e-4919-47aa-9bb2-0f615778f26d-agent-certs\") pod \"konnectivity-agent-6fv8j\" (UID: 
\"bf4eb16e-4919-47aa-9bb2-0f615778f26d\") " pod="kube-system/konnectivity-agent-6fv8j" Apr 23 13:31:30.827090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.826869 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d604becf-afb4-4b3f-aaec-3618178f4dfe-tmp\") pod \"tuned-b789s\" (UID: \"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.830413 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:30.830396 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:30.830470 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:30.830418 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:30.830470 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:30.830431 2565 projected.go:194] Error preparing data for projected volume kube-api-access-hxqlf for pod openshift-network-diagnostics/network-check-target-hclwj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:30.830543 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:30.830500 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf podName:d6413ec2-e315-417e-9b7d-ce057e4f10a3 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:31.330484146 +0000 UTC m=+2.069028096 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hxqlf" (UniqueName: "kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf") pod "network-check-target-hclwj" (UID: "d6413ec2-e315-417e-9b7d-ce057e4f10a3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:30.832157 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.832141 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x8g8\" (UniqueName: \"kubernetes.io/projected/676c8632-4468-4e42-b6fb-2a866baddda7-kube-api-access-2x8g8\") pod \"iptables-alerter-97wbq\" (UID: \"676c8632-4468-4e42-b6fb-2a866baddda7\") " pod="openshift-network-operator/iptables-alerter-97wbq" Apr 23 13:31:30.837323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.837137 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zcj5\" (UniqueName: \"kubernetes.io/projected/3a6f5afc-ae97-4be4-ad1c-c3af1a35a586-kube-api-access-4zcj5\") pod \"node-ca-rjv7k\" (UID: \"3a6f5afc-ae97-4be4-ad1c-c3af1a35a586\") " pod="openshift-image-registry/node-ca-rjv7k" Apr 23 13:31:30.837323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.837283 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hglfs\" (UniqueName: \"kubernetes.io/projected/43f57458-7ecc-4c9f-8890-521f1a9776af-kube-api-access-hglfs\") pod \"aws-ebs-csi-driver-node-fjkrn\" (UID: \"43f57458-7ecc-4c9f-8890-521f1a9776af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:30.837323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.837292 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z2gz\" (UniqueName: \"kubernetes.io/projected/d604becf-afb4-4b3f-aaec-3618178f4dfe-kube-api-access-6z2gz\") pod \"tuned-b789s\" (UID: 
\"d604becf-afb4-4b3f-aaec-3618178f4dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:30.837538 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.837523 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwhvb\" (UniqueName: \"kubernetes.io/projected/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-kube-api-access-hwhvb\") pod \"network-metrics-daemon-gdstf\" (UID: \"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d\") " pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:31:30.837812 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.837791 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tt86\" (UniqueName: \"kubernetes.io/projected/a47ff253-1704-447a-b1cd-4a1b12019c92-kube-api-access-9tt86\") pod \"ovnkube-node-vxhp2\" (UID: \"a47ff253-1704-447a-b1cd-4a1b12019c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:30.837886 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.837813 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7wgg\" (UniqueName: \"kubernetes.io/projected/967ed5b3-0337-40d9-872d-aa7a02b7c552-kube-api-access-s7wgg\") pod \"multus-additional-cni-plugins-mvwgw\" (UID: \"967ed5b3-0337-40d9-872d-aa7a02b7c552\") " pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:30.838398 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.838378 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s8gn\" (UniqueName: \"kubernetes.io/projected/1c553f28-0c89-4983-b30b-c0bdd06b63e6-kube-api-access-6s8gn\") pod \"multus-b6gjz\" (UID: \"1c553f28-0c89-4983-b30b-c0bdd06b63e6\") " pod="openshift-multus/multus-b6gjz" Apr 23 13:31:30.838824 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.838779 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal" 
event={"ID":"08bb44b3b8944f3166abf4dcee6b9b11","Type":"ContainerStarted","Data":"80961102b11c56eaeb36bcfaa6f43c7125716e069651c745efccc32ac80d7f33"} Apr 23 13:31:30.839792 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:30.839774 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-187.ec2.internal" event={"ID":"43d3b8ea7119a14ffb4ca124c24a14eb","Type":"ContainerStarted","Data":"5806c57b8487b46a703098dbcc7fc0154ba0268a1392905fd1e8b79573fe5fe9"} Apr 23 13:31:31.032128 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.032040 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-97wbq" Apr 23 13:31:31.036657 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.036643 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rjv7k" Apr 23 13:31:31.038504 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:31.038475 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod676c8632_4468_4e42_b6fb_2a866baddda7.slice/crio-793ae33fdac7924277f1efbabe5c4cc657dd1ef6f8eb121a7e19c231c9e089ed WatchSource:0}: Error finding container 793ae33fdac7924277f1efbabe5c4cc657dd1ef6f8eb121a7e19c231c9e089ed: Status 404 returned error can't find the container with id 793ae33fdac7924277f1efbabe5c4cc657dd1ef6f8eb121a7e19c231c9e089ed Apr 23 13:31:31.040020 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.039983 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:31.043173 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:31.043150 2565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a6f5afc_ae97_4be4_ad1c_c3af1a35a586.slice/crio-2a8aebb7a9cd53852478f4f6766637d0f79e2bcaef457919c402c3dfc7c8a5d8 WatchSource:0}: Error finding container 2a8aebb7a9cd53852478f4f6766637d0f79e2bcaef457919c402c3dfc7c8a5d8: Status 404 returned error can't find the container with id 2a8aebb7a9cd53852478f4f6766637d0f79e2bcaef457919c402c3dfc7c8a5d8 Apr 23 13:31:31.044296 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.044243 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mvwgw" Apr 23 13:31:31.047867 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.047852 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-b6gjz" Apr 23 13:31:31.050626 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:31.050602 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod967ed5b3_0337_40d9_872d_aa7a02b7c552.slice/crio-972a960248c133c75cff21040d6c024edafee930667ad55709a98a9e7c82cbe8 WatchSource:0}: Error finding container 972a960248c133c75cff21040d6c024edafee930667ad55709a98a9e7c82cbe8: Status 404 returned error can't find the container with id 972a960248c133c75cff21040d6c024edafee930667ad55709a98a9e7c82cbe8 Apr 23 13:31:31.054213 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.054195 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b789s" Apr 23 13:31:31.054439 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:31.054413 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c553f28_0c89_4983_b30b_c0bdd06b63e6.slice/crio-6f3a4dcdf0d87badc384dc0cb64b0216bce0c7030649becaf2388bb2de9a3cb8 WatchSource:0}: Error finding container 6f3a4dcdf0d87badc384dc0cb64b0216bce0c7030649becaf2388bb2de9a3cb8: Status 404 returned error can't find the container with id 6f3a4dcdf0d87badc384dc0cb64b0216bce0c7030649becaf2388bb2de9a3cb8 Apr 23 13:31:31.060327 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.060296 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:31.060490 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:31.060473 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd604becf_afb4_4b3f_aaec_3618178f4dfe.slice/crio-efe847bd16a8be871747fb2ea9e02c1f9f0e01d7ecc09516d1d877d02d1e4208 WatchSource:0}: Error finding container efe847bd16a8be871747fb2ea9e02c1f9f0e01d7ecc09516d1d877d02d1e4208: Status 404 returned error can't find the container with id efe847bd16a8be871747fb2ea9e02c1f9f0e01d7ecc09516d1d877d02d1e4208 Apr 23 13:31:31.066386 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.066372 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-6fv8j" Apr 23 13:31:31.066598 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:31.066573 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda47ff253_1704_447a_b1cd_4a1b12019c92.slice/crio-0282381e60fece9778daab8789b117fd450228436e1a4e11dd3566c000916095 WatchSource:0}: Error finding container 0282381e60fece9778daab8789b117fd450228436e1a4e11dd3566c000916095: Status 404 returned error can't find the container with id 0282381e60fece9778daab8789b117fd450228436e1a4e11dd3566c000916095 Apr 23 13:31:31.072307 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:31.072286 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf4eb16e_4919_47aa_9bb2_0f615778f26d.slice/crio-de29fe7a4c00c616e3a921dcaf8edddf936f00dd0740930854a8df2fbcc5768a WatchSource:0}: Error finding container de29fe7a4c00c616e3a921dcaf8edddf936f00dd0740930854a8df2fbcc5768a: Status 404 returned error can't find the container with id de29fe7a4c00c616e3a921dcaf8edddf936f00dd0740930854a8df2fbcc5768a Apr 23 13:31:31.074933 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.074913 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" Apr 23 13:31:31.083630 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:31.083538 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43f57458_7ecc_4c9f_8890_521f1a9776af.slice/crio-8d7ba547a4cf397e768c901a562b09840e533e094020cf9ef7eb564321bd4ea8 WatchSource:0}: Error finding container 8d7ba547a4cf397e768c901a562b09840e533e094020cf9ef7eb564321bd4ea8: Status 404 returned error can't find the container with id 8d7ba547a4cf397e768c901a562b09840e533e094020cf9ef7eb564321bd4ea8 Apr 23 13:31:31.328083 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.327994 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs\") pod \"network-metrics-daemon-gdstf\" (UID: \"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d\") " pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:31:31.328239 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:31.328163 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:31.328239 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:31.328231 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs podName:6d6b50d4-32de-4031-b4e3-a88d3ce08d4d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:32.328211428 +0000 UTC m=+3.066755386 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs") pod "network-metrics-daemon-gdstf" (UID: "6d6b50d4-32de-4031-b4e3-a88d3ce08d4d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:31.428400 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.428363 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqlf\" (UniqueName: \"kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf\") pod \"network-check-target-hclwj\" (UID: \"d6413ec2-e315-417e-9b7d-ce057e4f10a3\") " pod="openshift-network-diagnostics/network-check-target-hclwj" Apr 23 13:31:31.428554 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:31.428534 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:31.428613 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:31.428560 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:31.428613 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:31.428573 2565 projected.go:194] Error preparing data for projected volume kube-api-access-hxqlf for pod openshift-network-diagnostics/network-check-target-hclwj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:31.428714 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:31.428629 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf podName:d6413ec2-e315-417e-9b7d-ce057e4f10a3 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:31:32.428611059 +0000 UTC m=+3.167155016 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-hxqlf" (UniqueName: "kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf") pod "network-check-target-hclwj" (UID: "d6413ec2-e315-417e-9b7d-ce057e4f10a3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:31.481291 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.481263 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:31.755560 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.755516 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:26:30 +0000 UTC" deadline="2027-10-10 23:09:22.512604814 +0000 UTC" Apr 23 13:31:31.755560 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.755556 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12849h37m50.757052126s" Apr 23 13:31:31.847816 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.847724 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" event={"ID":"a47ff253-1704-447a-b1cd-4a1b12019c92","Type":"ContainerStarted","Data":"0282381e60fece9778daab8789b117fd450228436e1a4e11dd3566c000916095"} Apr 23 13:31:31.853163 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.853129 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b789s" event={"ID":"d604becf-afb4-4b3f-aaec-3618178f4dfe","Type":"ContainerStarted","Data":"efe847bd16a8be871747fb2ea9e02c1f9f0e01d7ecc09516d1d877d02d1e4208"} Apr 23 13:31:31.864084 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.864012 
2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b6gjz" event={"ID":"1c553f28-0c89-4983-b30b-c0bdd06b63e6","Type":"ContainerStarted","Data":"6f3a4dcdf0d87badc384dc0cb64b0216bce0c7030649becaf2388bb2de9a3cb8"} Apr 23 13:31:31.867915 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.867884 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvwgw" event={"ID":"967ed5b3-0337-40d9-872d-aa7a02b7c552","Type":"ContainerStarted","Data":"972a960248c133c75cff21040d6c024edafee930667ad55709a98a9e7c82cbe8"} Apr 23 13:31:31.869986 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.869960 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-97wbq" event={"ID":"676c8632-4468-4e42-b6fb-2a866baddda7","Type":"ContainerStarted","Data":"793ae33fdac7924277f1efbabe5c4cc657dd1ef6f8eb121a7e19c231c9e089ed"} Apr 23 13:31:31.873724 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.873697 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" event={"ID":"43f57458-7ecc-4c9f-8890-521f1a9776af","Type":"ContainerStarted","Data":"8d7ba547a4cf397e768c901a562b09840e533e094020cf9ef7eb564321bd4ea8"} Apr 23 13:31:31.884973 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.884941 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6fv8j" event={"ID":"bf4eb16e-4919-47aa-9bb2-0f615778f26d","Type":"ContainerStarted","Data":"de29fe7a4c00c616e3a921dcaf8edddf936f00dd0740930854a8df2fbcc5768a"} Apr 23 13:31:31.887887 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:31.887845 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rjv7k" event={"ID":"3a6f5afc-ae97-4be4-ad1c-c3af1a35a586","Type":"ContainerStarted","Data":"2a8aebb7a9cd53852478f4f6766637d0f79e2bcaef457919c402c3dfc7c8a5d8"} Apr 23 13:31:32.335901 ip-10-0-137-187 
kubenswrapper[2565]: I0423 13:31:32.335246 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs\") pod \"network-metrics-daemon-gdstf\" (UID: \"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d\") " pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:31:32.335901 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:32.335416 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:32.335901 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:32.335478 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs podName:6d6b50d4-32de-4031-b4e3-a88d3ce08d4d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:34.335459541 +0000 UTC m=+5.074003504 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs") pod "network-metrics-daemon-gdstf" (UID: "6d6b50d4-32de-4031-b4e3-a88d3ce08d4d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:32.436544 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:32.436508 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqlf\" (UniqueName: \"kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf\") pod \"network-check-target-hclwj\" (UID: \"d6413ec2-e315-417e-9b7d-ce057e4f10a3\") " pod="openshift-network-diagnostics/network-check-target-hclwj" Apr 23 13:31:32.436703 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:32.436686 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:32.436800 
ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:32.436705 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:32.436800 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:32.436718 2565 projected.go:194] Error preparing data for projected volume kube-api-access-hxqlf for pod openshift-network-diagnostics/network-check-target-hclwj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:32.436800 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:32.436793 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf podName:d6413ec2-e315-417e-9b7d-ce057e4f10a3 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:34.436774686 +0000 UTC m=+5.175318645 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hxqlf" (UniqueName: "kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf") pod "network-check-target-hclwj" (UID: "d6413ec2-e315-417e-9b7d-ce057e4f10a3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:32.756809 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:32.756703 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:26:30 +0000 UTC" deadline="2027-11-30 08:25:58.766344573 +0000 UTC"
Apr 23 13:31:32.756809 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:32.756745 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14058h54m26.009603317s"
Apr 23 13:31:32.835961 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:32.835929 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:31:32.836129 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:32.836056 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3"
Apr 23 13:31:32.836200 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:32.836163 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:31:32.836323 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:32.836299 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d"
Apr 23 13:31:34.351204 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:34.351167 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs\") pod \"network-metrics-daemon-gdstf\" (UID: \"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d\") " pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:31:34.351644 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:34.351325 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:34.351644 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:34.351404 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs podName:6d6b50d4-32de-4031-b4e3-a88d3ce08d4d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:38.351384482 +0000 UTC m=+9.089928430 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs") pod "network-metrics-daemon-gdstf" (UID: "6d6b50d4-32de-4031-b4e3-a88d3ce08d4d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:34.452151 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:34.452113 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqlf\" (UniqueName: \"kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf\") pod \"network-check-target-hclwj\" (UID: \"d6413ec2-e315-417e-9b7d-ce057e4f10a3\") " pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:31:34.452323 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:34.452253 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:34.452323 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:34.452277 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:34.452323 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:34.452289 2565 projected.go:194] Error preparing data for projected volume kube-api-access-hxqlf for pod openshift-network-diagnostics/network-check-target-hclwj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:34.452476 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:34.452341 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf podName:d6413ec2-e315-417e-9b7d-ce057e4f10a3 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:38.452322511 +0000 UTC m=+9.190866455 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hxqlf" (UniqueName: "kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf") pod "network-check-target-hclwj" (UID: "d6413ec2-e315-417e-9b7d-ce057e4f10a3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:34.836289 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:34.836205 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:31:34.836453 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:34.836336 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3"
Apr 23 13:31:34.836453 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:34.836438 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:31:34.836551 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:34.836522 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d"
Apr 23 13:31:36.836263 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:36.836225 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:31:36.836674 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:36.836369 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d"
Apr 23 13:31:36.836674 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:36.836225 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:31:36.836674 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:36.836537 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3"
Apr 23 13:31:37.848198 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:37.848069 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ptldl"]
Apr 23 13:31:37.849933 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:37.849909 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:37.850068 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:37.849993 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ptldl" podUID="1344ba49-27f1-41a6-94d2-2e85595b528d"
Apr 23 13:31:37.882668 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:37.882634 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1344ba49-27f1-41a6-94d2-2e85595b528d-dbus\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:37.882849 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:37.882679 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:37.882849 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:37.882734 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1344ba49-27f1-41a6-94d2-2e85595b528d-kubelet-config\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:37.983124 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:37.983092 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1344ba49-27f1-41a6-94d2-2e85595b528d-kubelet-config\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:37.983281 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:37.983172 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1344ba49-27f1-41a6-94d2-2e85595b528d-dbus\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:37.983281 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:37.983198 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:37.983281 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:37.983223 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1344ba49-27f1-41a6-94d2-2e85595b528d-kubelet-config\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:37.983447 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:37.983299 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:37.983447 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:37.983352 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret podName:1344ba49-27f1-41a6-94d2-2e85595b528d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:38.483334354 +0000 UTC m=+9.221878298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret") pod "global-pull-secret-syncer-ptldl" (UID: "1344ba49-27f1-41a6-94d2-2e85595b528d") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:37.983447 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:37.983368 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1344ba49-27f1-41a6-94d2-2e85595b528d-dbus\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:38.386980 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:38.386942 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs\") pod \"network-metrics-daemon-gdstf\" (UID: \"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d\") " pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:31:38.387168 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:38.387102 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:38.387229 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:38.387169 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs podName:6d6b50d4-32de-4031-b4e3-a88d3ce08d4d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:46.387152689 +0000 UTC m=+17.125696637 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs") pod "network-metrics-daemon-gdstf" (UID: "6d6b50d4-32de-4031-b4e3-a88d3ce08d4d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:38.488346 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:38.488296 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:38.488515 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:38.488370 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqlf\" (UniqueName: \"kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf\") pod \"network-check-target-hclwj\" (UID: \"d6413ec2-e315-417e-9b7d-ce057e4f10a3\") " pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:31:38.488578 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:38.488516 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:38.488578 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:38.488532 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:38.488578 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:38.488545 2565 projected.go:194] Error preparing data for projected volume kube-api-access-hxqlf for pod openshift-network-diagnostics/network-check-target-hclwj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:38.488741 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:38.488600 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf podName:d6413ec2-e315-417e-9b7d-ce057e4f10a3 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:46.488583518 +0000 UTC m=+17.227127488 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-hxqlf" (UniqueName: "kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf") pod "network-check-target-hclwj" (UID: "d6413ec2-e315-417e-9b7d-ce057e4f10a3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:38.489113 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:38.488976 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:38.489113 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:38.489047 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret podName:1344ba49-27f1-41a6-94d2-2e85595b528d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:39.489030784 +0000 UTC m=+10.227574743 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret") pod "global-pull-secret-syncer-ptldl" (UID: "1344ba49-27f1-41a6-94d2-2e85595b528d") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:38.836422 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:38.835730 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:31:38.836422 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:38.835867 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3"
Apr 23 13:31:38.836422 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:38.835957 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:31:38.836422 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:38.836047 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d"
Apr 23 13:31:39.497030 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:39.496993 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:39.497454 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:39.497164 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:39.497454 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:39.497228 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret podName:1344ba49-27f1-41a6-94d2-2e85595b528d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:41.497210717 +0000 UTC m=+12.235754667 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret") pod "global-pull-secret-syncer-ptldl" (UID: "1344ba49-27f1-41a6-94d2-2e85595b528d") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:39.836712 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:39.836303 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:39.836712 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:39.836435 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ptldl" podUID="1344ba49-27f1-41a6-94d2-2e85595b528d"
Apr 23 13:31:40.835561 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:40.835478 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:31:40.835962 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:40.835585 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3"
Apr 23 13:31:40.835962 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:40.835652 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:31:40.835962 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:40.835790 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d"
Apr 23 13:31:41.514154 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:41.514113 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:41.514327 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:41.514265 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:41.514371 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:41.514336 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret podName:1344ba49-27f1-41a6-94d2-2e85595b528d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:45.514320775 +0000 UTC m=+16.252864722 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret") pod "global-pull-secret-syncer-ptldl" (UID: "1344ba49-27f1-41a6-94d2-2e85595b528d") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:41.835989 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:41.835914 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:41.836371 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:41.836049 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ptldl" podUID="1344ba49-27f1-41a6-94d2-2e85595b528d"
Apr 23 13:31:42.835232 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:42.835203 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:31:42.835398 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:42.835293 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:31:42.835452 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:42.835406 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3"
Apr 23 13:31:42.835581 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:42.835562 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d"
Apr 23 13:31:43.835578 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:43.835542 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:43.836046 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:43.835650 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ptldl" podUID="1344ba49-27f1-41a6-94d2-2e85595b528d"
Apr 23 13:31:44.835621 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:44.835587 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:31:44.836034 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:44.835632 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:31:44.836034 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:44.835743 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d"
Apr 23 13:31:44.836034 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:44.835878 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3"
Apr 23 13:31:45.548932 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:45.548890 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:45.549095 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:45.549049 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:45.549155 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:45.549115 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret podName:1344ba49-27f1-41a6-94d2-2e85595b528d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:53.549094942 +0000 UTC m=+24.287638886 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret") pod "global-pull-secret-syncer-ptldl" (UID: "1344ba49-27f1-41a6-94d2-2e85595b528d") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:45.836010 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:45.835934 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:45.836428 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:45.836049 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ptldl" podUID="1344ba49-27f1-41a6-94d2-2e85595b528d"
Apr 23 13:31:46.455954 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:46.455919 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs\") pod \"network-metrics-daemon-gdstf\" (UID: \"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d\") " pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:31:46.456176 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:46.456072 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:46.456176 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:46.456148 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs podName:6d6b50d4-32de-4031-b4e3-a88d3ce08d4d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:02.456126578 +0000 UTC m=+33.194670526 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs") pod "network-metrics-daemon-gdstf" (UID: "6d6b50d4-32de-4031-b4e3-a88d3ce08d4d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:46.557085 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:46.557030 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqlf\" (UniqueName: \"kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf\") pod \"network-check-target-hclwj\" (UID: \"d6413ec2-e315-417e-9b7d-ce057e4f10a3\") " pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:31:46.557247 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:46.557219 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:46.557247 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:46.557244 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:46.557347 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:46.557257 2565 projected.go:194] Error preparing data for projected volume kube-api-access-hxqlf for pod openshift-network-diagnostics/network-check-target-hclwj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:46.557347 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:46.557318 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf podName:d6413ec2-e315-417e-9b7d-ce057e4f10a3 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:02.557299549 +0000 UTC m=+33.295843497 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hxqlf" (UniqueName: "kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf") pod "network-check-target-hclwj" (UID: "d6413ec2-e315-417e-9b7d-ce057e4f10a3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:46.836183 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:46.836105 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:31:46.836613 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:46.836109 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:31:46.836613 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:46.836219 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3"
Apr 23 13:31:46.836613 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:46.836327 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d"
Apr 23 13:31:47.836094 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:47.836052 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:31:47.836285 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:47.836192 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ptldl" podUID="1344ba49-27f1-41a6-94d2-2e85595b528d"
Apr 23 13:31:48.835803 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:48.835540 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:31:48.835907 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:48.835540 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:31:48.835973 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:48.835903 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d" Apr 23 13:31:48.835973 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:48.835939 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3" Apr 23 13:31:48.922690 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:48.922597 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b789s" event={"ID":"d604becf-afb4-4b3f-aaec-3618178f4dfe","Type":"ContainerStarted","Data":"0c83a15519609f68870461e7738dbf1e5abf3e0ab44e507bf2071b048083e895"} Apr 23 13:31:48.925121 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:48.924790 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b6gjz" event={"ID":"1c553f28-0c89-4983-b30b-c0bdd06b63e6","Type":"ContainerStarted","Data":"fd533f383ec96fc50907a9ff5f2d9e41bb271c8d77f13d0aaaa2bfbca25c844c"} Apr 23 13:31:48.930605 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:48.930531 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-187.ec2.internal" event={"ID":"43d3b8ea7119a14ffb4ca124c24a14eb","Type":"ContainerStarted","Data":"361e04c854049faf705d21224d9e14389cff3b772e3fedfe3da41995ce0f9894"} Apr 23 13:31:48.933973 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:48.933947 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" event={"ID":"a47ff253-1704-447a-b1cd-4a1b12019c92","Type":"ContainerStarted","Data":"ec48a8e58f9719891b09826a06a665b6bbf99a7b722c6e7c9b74bc407ff0823d"} Apr 23 13:31:48.934059 ip-10-0-137-187 
kubenswrapper[2565]: I0423 13:31:48.933982 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" event={"ID":"a47ff253-1704-447a-b1cd-4a1b12019c92","Type":"ContainerStarted","Data":"e9ead8a74c9bec9b74ed600e1bc55cb1a9277df1ca0f1cf4878de97004ca75b4"} Apr 23 13:31:48.934059 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:48.933996 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" event={"ID":"a47ff253-1704-447a-b1cd-4a1b12019c92","Type":"ContainerStarted","Data":"7f106eca3623ab3c84ccb0140e1e22713691255f4fb080da23f8717af6e43fc9"} Apr 23 13:31:48.934059 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:48.934008 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" event={"ID":"a47ff253-1704-447a-b1cd-4a1b12019c92","Type":"ContainerStarted","Data":"436d2f8396ca20b4aadfc39cb8721078cf8a7283a556f1a90568242cc6bf0d20"} Apr 23 13:31:48.941734 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:48.941610 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-b789s" podStartSLOduration=1.695430139 podStartE2EDuration="18.941597862s" podCreationTimestamp="2026-04-23 13:31:30 +0000 UTC" firstStartedPulling="2026-04-23 13:31:31.062145045 +0000 UTC m=+1.800688995" lastFinishedPulling="2026-04-23 13:31:48.30831276 +0000 UTC m=+19.046856718" observedRunningTime="2026-04-23 13:31:48.941393933 +0000 UTC m=+19.679937903" watchObservedRunningTime="2026-04-23 13:31:48.941597862 +0000 UTC m=+19.680141828" Apr 23 13:31:48.961461 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:48.961318 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-b6gjz" podStartSLOduration=1.677565406 podStartE2EDuration="18.961305565s" podCreationTimestamp="2026-04-23 13:31:30 +0000 UTC" firstStartedPulling="2026-04-23 13:31:31.057311671 +0000 UTC 
m=+1.795855615" lastFinishedPulling="2026-04-23 13:31:48.341051817 +0000 UTC m=+19.079595774" observedRunningTime="2026-04-23 13:31:48.961176001 +0000 UTC m=+19.699719968" watchObservedRunningTime="2026-04-23 13:31:48.961305565 +0000 UTC m=+19.699849528" Apr 23 13:31:48.982298 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:48.982179 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-187.ec2.internal" podStartSLOduration=18.982157514 podStartE2EDuration="18.982157514s" podCreationTimestamp="2026-04-23 13:31:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:48.981942935 +0000 UTC m=+19.720486902" watchObservedRunningTime="2026-04-23 13:31:48.982157514 +0000 UTC m=+19.720701560" Apr 23 13:31:49.837038 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:49.837003 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl" Apr 23 13:31:49.837186 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:49.837117 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ptldl" podUID="1344ba49-27f1-41a6-94d2-2e85595b528d" Apr 23 13:31:49.937420 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:49.937379 2565 generic.go:358] "Generic (PLEG): container finished" podID="967ed5b3-0337-40d9-872d-aa7a02b7c552" containerID="41d15312cd1e17f0ec705d104ff632992fba0cd8897ae60273a5a917be3a2fbb" exitCode=0 Apr 23 13:31:49.937897 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:49.937471 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvwgw" event={"ID":"967ed5b3-0337-40d9-872d-aa7a02b7c552","Type":"ContainerDied","Data":"41d15312cd1e17f0ec705d104ff632992fba0cd8897ae60273a5a917be3a2fbb"} Apr 23 13:31:49.939007 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:49.938943 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-97wbq" event={"ID":"676c8632-4468-4e42-b6fb-2a866baddda7","Type":"ContainerStarted","Data":"ba067ddf99955117d4e3d39e34dac8356ae053db7e379d5ccb623d6696fa59c1"} Apr 23 13:31:49.940353 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:49.940265 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" event={"ID":"43f57458-7ecc-4c9f-8890-521f1a9776af","Type":"ContainerStarted","Data":"168cba802d1507dbad7f10fa3ea4c4c62245756f3f0b498b0ed2257a5a13b30f"} Apr 23 13:31:49.941879 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:49.941851 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6fv8j" event={"ID":"bf4eb16e-4919-47aa-9bb2-0f615778f26d","Type":"ContainerStarted","Data":"e39e2871710b9ae1494eeb3f9f45662dbcb89b2482458cda4b9aed57e3f8a917"} Apr 23 13:31:49.943326 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:49.943304 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rjv7k" 
event={"ID":"3a6f5afc-ae97-4be4-ad1c-c3af1a35a586","Type":"ContainerStarted","Data":"2cfd50be06ef125a5d1d29c81c39246c2ed3faf6b8ffbe63c05bb23d29f1dc7c"} Apr 23 13:31:49.944959 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:49.944939 2565 generic.go:358] "Generic (PLEG): container finished" podID="08bb44b3b8944f3166abf4dcee6b9b11" containerID="26c9252cd1016259923cf33d746008f793b1b8588126f23dadd6ca4f3bf46f79" exitCode=0 Apr 23 13:31:49.945033 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:49.944995 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal" event={"ID":"08bb44b3b8944f3166abf4dcee6b9b11","Type":"ContainerDied","Data":"26c9252cd1016259923cf33d746008f793b1b8588126f23dadd6ca4f3bf46f79"} Apr 23 13:31:49.948019 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:49.947996 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" event={"ID":"a47ff253-1704-447a-b1cd-4a1b12019c92","Type":"ContainerStarted","Data":"d3de7d7aea37155f5c35aa2f6a6de08110375f7bdfb59318a85c928b1ff6a2e7"} Apr 23 13:31:49.948091 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:49.948027 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" event={"ID":"a47ff253-1704-447a-b1cd-4a1b12019c92","Type":"ContainerStarted","Data":"bf0de2d54d6586daef74ebafc53d8b83537b2c0664b8dcc41bd607c505d32a98"} Apr 23 13:31:49.985733 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:49.985684 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-97wbq" podStartSLOduration=3.720172019 podStartE2EDuration="20.985668111s" podCreationTimestamp="2026-04-23 13:31:29 +0000 UTC" firstStartedPulling="2026-04-23 13:31:31.040324591 +0000 UTC m=+1.778868538" lastFinishedPulling="2026-04-23 13:31:48.305820679 +0000 UTC m=+19.044364630" observedRunningTime="2026-04-23 
13:31:49.985629836 +0000 UTC m=+20.724173803" watchObservedRunningTime="2026-04-23 13:31:49.985668111 +0000 UTC m=+20.724212078" Apr 23 13:31:49.998040 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:49.997975 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rjv7k" podStartSLOduration=3.736560082 podStartE2EDuration="20.997963613s" podCreationTimestamp="2026-04-23 13:31:29 +0000 UTC" firstStartedPulling="2026-04-23 13:31:31.044634067 +0000 UTC m=+1.783178010" lastFinishedPulling="2026-04-23 13:31:48.306037593 +0000 UTC m=+19.044581541" observedRunningTime="2026-04-23 13:31:49.997414612 +0000 UTC m=+20.735958579" watchObservedRunningTime="2026-04-23 13:31:49.997963613 +0000 UTC m=+20.736507579" Apr 23 13:31:50.013016 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.012974 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6fv8j" podStartSLOduration=2.780711202 podStartE2EDuration="20.012946015s" podCreationTimestamp="2026-04-23 13:31:30 +0000 UTC" firstStartedPulling="2026-04-23 13:31:31.073672877 +0000 UTC m=+1.812216822" lastFinishedPulling="2026-04-23 13:31:48.305907674 +0000 UTC m=+19.044451635" observedRunningTime="2026-04-23 13:31:50.012588572 +0000 UTC m=+20.751132538" watchObservedRunningTime="2026-04-23 13:31:50.012946015 +0000 UTC m=+20.751489982" Apr 23 13:31:50.144969 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.144946 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 13:31:50.780845 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.780723 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T13:31:50.144967434Z","UUID":"c398c5e9-6fd7-43f9-95a8-0068aaae38a8","Handler":null,"Name":"","Endpoint":""} Apr 23 13:31:50.784078 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.784052 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 13:31:50.784078 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.784083 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 13:31:50.825233 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.825202 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-w5s22"] Apr 23 13:31:50.827962 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.827938 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w5s22" Apr 23 13:31:50.831116 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.830959 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-kbm2f\"" Apr 23 13:31:50.831116 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.831059 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 13:31:50.831116 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.831064 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 13:31:50.835222 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.835199 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj" Apr 23 13:31:50.835327 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:50.835303 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3" Apr 23 13:31:50.835490 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.835425 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:31:50.835543 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:50.835524 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d" Apr 23 13:31:50.886266 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.886235 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4976cf12-11ff-427a-a58d-9f126da4f625-tmp-dir\") pod \"node-resolver-w5s22\" (UID: \"4976cf12-11ff-427a-a58d-9f126da4f625\") " pod="openshift-dns/node-resolver-w5s22" Apr 23 13:31:50.886452 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.886288 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzhq5\" (UniqueName: \"kubernetes.io/projected/4976cf12-11ff-427a-a58d-9f126da4f625-kube-api-access-kzhq5\") pod \"node-resolver-w5s22\" (UID: \"4976cf12-11ff-427a-a58d-9f126da4f625\") " pod="openshift-dns/node-resolver-w5s22" Apr 23 13:31:50.886452 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.886324 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4976cf12-11ff-427a-a58d-9f126da4f625-hosts-file\") pod \"node-resolver-w5s22\" (UID: \"4976cf12-11ff-427a-a58d-9f126da4f625\") " pod="openshift-dns/node-resolver-w5s22" Apr 23 13:31:50.952605 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.952367 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal" event={"ID":"08bb44b3b8944f3166abf4dcee6b9b11","Type":"ContainerStarted","Data":"a47c47dad3b8a090fcd18ed55cc15d4a88c0c07e33d6e743694ac5dad3eae691"} Apr 23 13:31:50.955118 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.955089 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" 
event={"ID":"43f57458-7ecc-4c9f-8890-521f1a9776af","Type":"ContainerStarted","Data":"8cc8251385d97c97ba1190166172fa585d5a968591e15d7df2b02447d25b8611"} Apr 23 13:31:50.966058 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.966010 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-187.ec2.internal" podStartSLOduration=20.965993325 podStartE2EDuration="20.965993325s" podCreationTimestamp="2026-04-23 13:31:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:50.965738564 +0000 UTC m=+21.704282531" watchObservedRunningTime="2026-04-23 13:31:50.965993325 +0000 UTC m=+21.704537291" Apr 23 13:31:50.986959 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.986933 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4976cf12-11ff-427a-a58d-9f126da4f625-tmp-dir\") pod \"node-resolver-w5s22\" (UID: \"4976cf12-11ff-427a-a58d-9f126da4f625\") " pod="openshift-dns/node-resolver-w5s22" Apr 23 13:31:50.987178 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.986971 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzhq5\" (UniqueName: \"kubernetes.io/projected/4976cf12-11ff-427a-a58d-9f126da4f625-kube-api-access-kzhq5\") pod \"node-resolver-w5s22\" (UID: \"4976cf12-11ff-427a-a58d-9f126da4f625\") " pod="openshift-dns/node-resolver-w5s22" Apr 23 13:31:50.987178 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.986992 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4976cf12-11ff-427a-a58d-9f126da4f625-hosts-file\") pod \"node-resolver-w5s22\" (UID: \"4976cf12-11ff-427a-a58d-9f126da4f625\") " pod="openshift-dns/node-resolver-w5s22" Apr 23 13:31:50.987178 ip-10-0-137-187 
kubenswrapper[2565]: I0423 13:31:50.987059 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4976cf12-11ff-427a-a58d-9f126da4f625-hosts-file\") pod \"node-resolver-w5s22\" (UID: \"4976cf12-11ff-427a-a58d-9f126da4f625\") " pod="openshift-dns/node-resolver-w5s22" Apr 23 13:31:50.987553 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.987510 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4976cf12-11ff-427a-a58d-9f126da4f625-tmp-dir\") pod \"node-resolver-w5s22\" (UID: \"4976cf12-11ff-427a-a58d-9f126da4f625\") " pod="openshift-dns/node-resolver-w5s22" Apr 23 13:31:50.999059 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:50.999035 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzhq5\" (UniqueName: \"kubernetes.io/projected/4976cf12-11ff-427a-a58d-9f126da4f625-kube-api-access-kzhq5\") pod \"node-resolver-w5s22\" (UID: \"4976cf12-11ff-427a-a58d-9f126da4f625\") " pod="openshift-dns/node-resolver-w5s22" Apr 23 13:31:51.138605 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:51.138282 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w5s22" Apr 23 13:31:51.150378 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:31:51.150299 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4976cf12_11ff_427a_a58d_9f126da4f625.slice/crio-8eb1e61585408b5e3737488208b6ef6416ee2b2f80f7de699c7c5abc7a0a06ad WatchSource:0}: Error finding container 8eb1e61585408b5e3737488208b6ef6416ee2b2f80f7de699c7c5abc7a0a06ad: Status 404 returned error can't find the container with id 8eb1e61585408b5e3737488208b6ef6416ee2b2f80f7de699c7c5abc7a0a06ad Apr 23 13:31:51.835807 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:51.835774 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl" Apr 23 13:31:51.836033 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:51.835869 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ptldl" podUID="1344ba49-27f1-41a6-94d2-2e85595b528d" Apr 23 13:31:51.958845 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:51.958808 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" event={"ID":"43f57458-7ecc-4c9f-8890-521f1a9776af","Type":"ContainerStarted","Data":"1cc6115d6dab4b4bbeeba463bd70b32a8bd8639729aa21bfe75cc18ad83e9b8e"} Apr 23 13:31:51.962367 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:51.962333 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" event={"ID":"a47ff253-1704-447a-b1cd-4a1b12019c92","Type":"ContainerStarted","Data":"8bd9551249e89a3b4774836ada072bf12748b8f5216c25a1e7efc161f810a4aa"} Apr 23 13:31:51.963666 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:51.963639 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w5s22" event={"ID":"4976cf12-11ff-427a-a58d-9f126da4f625","Type":"ContainerStarted","Data":"5617a5bc52f9263330ebcb75c4157664e3f67d49738a9237b7abf4aa8e9f11ed"} Apr 23 13:31:51.963810 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:51.963680 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w5s22" event={"ID":"4976cf12-11ff-427a-a58d-9f126da4f625","Type":"ContainerStarted","Data":"8eb1e61585408b5e3737488208b6ef6416ee2b2f80f7de699c7c5abc7a0a06ad"} Apr 23 13:31:51.988408 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:51.988366 2565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjkrn" podStartSLOduration=2.0255574 podStartE2EDuration="21.988346544s" podCreationTimestamp="2026-04-23 13:31:30 +0000 UTC" firstStartedPulling="2026-04-23 13:31:31.085258113 +0000 UTC m=+1.823802065" lastFinishedPulling="2026-04-23 13:31:51.04804726 +0000 UTC m=+21.786591209" observedRunningTime="2026-04-23 13:31:51.98657521 +0000 UTC m=+22.725119173" watchObservedRunningTime="2026-04-23 13:31:51.988346544 +0000 UTC m=+22.726890561" Apr 23 13:31:52.001107 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:52.001065 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w5s22" podStartSLOduration=2.00105225 podStartE2EDuration="2.00105225s" podCreationTimestamp="2026-04-23 13:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:52.000555006 +0000 UTC m=+22.739098996" watchObservedRunningTime="2026-04-23 13:31:52.00105225 +0000 UTC m=+22.739596215" Apr 23 13:31:52.684961 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:52.684927 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6fv8j" Apr 23 13:31:52.685727 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:52.685698 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6fv8j" Apr 23 13:31:52.835918 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:52.835697 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:31:52.836090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:52.835697 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj" Apr 23 13:31:52.836090 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:52.836043 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d" Apr 23 13:31:52.836209 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:52.836172 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3" Apr 23 13:31:52.966085 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:52.965975 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6fv8j" Apr 23 13:31:52.966541 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:52.966355 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6fv8j" Apr 23 13:31:53.607980 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:53.607944 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl" Apr 23 13:31:53.608168 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:53.608101 2565 secret.go:189] Couldn't get 
secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:53.608168 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:53.608165 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret podName:1344ba49-27f1-41a6-94d2-2e85595b528d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:09.608148492 +0000 UTC m=+40.346692435 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret") pod "global-pull-secret-syncer-ptldl" (UID: "1344ba49-27f1-41a6-94d2-2e85595b528d") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:53.835726 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:53.835693 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl" Apr 23 13:31:53.835903 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:53.835825 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ptldl" podUID="1344ba49-27f1-41a6-94d2-2e85595b528d" Apr 23 13:31:54.836344 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:54.836264 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj" Apr 23 13:31:54.836733 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:54.836272 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:31:54.836733 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:54.836359 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3" Apr 23 13:31:54.836733 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:54.836455 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d" Apr 23 13:31:54.974302 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:54.974268 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" event={"ID":"a47ff253-1704-447a-b1cd-4a1b12019c92","Type":"ContainerStarted","Data":"d6f744beb03c17da3cef9e389a644d1b2cfc4ed5bc17ff0f69ffe9d50458e77c"} Apr 23 13:31:54.974640 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:54.974623 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:54.976018 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:54.975991 2565 generic.go:358] "Generic (PLEG): container finished" podID="967ed5b3-0337-40d9-872d-aa7a02b7c552" containerID="b6faefe42cb6b6345521b3fc9e45ec96cc4a07da8fa017534a951c18ed4c4d2e" exitCode=0 Apr 23 13:31:54.976139 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:54.976039 2565 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-additional-cni-plugins-mvwgw" event={"ID":"967ed5b3-0337-40d9-872d-aa7a02b7c552","Type":"ContainerDied","Data":"b6faefe42cb6b6345521b3fc9e45ec96cc4a07da8fa017534a951c18ed4c4d2e"} Apr 23 13:31:54.989827 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:54.989809 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:55.017027 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:55.016990 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" podStartSLOduration=7.494389352 podStartE2EDuration="25.016978639s" podCreationTimestamp="2026-04-23 13:31:30 +0000 UTC" firstStartedPulling="2026-04-23 13:31:31.068285698 +0000 UTC m=+1.806829643" lastFinishedPulling="2026-04-23 13:31:48.590874983 +0000 UTC m=+19.329418930" observedRunningTime="2026-04-23 13:31:55.016741639 +0000 UTC m=+25.755285605" watchObservedRunningTime="2026-04-23 13:31:55.016978639 +0000 UTC m=+25.755522605" Apr 23 13:31:55.835905 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:55.835737 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl" Apr 23 13:31:55.836048 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:55.835988 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ptldl" podUID="1344ba49-27f1-41a6-94d2-2e85595b528d" Apr 23 13:31:55.979140 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:55.979108 2565 generic.go:358] "Generic (PLEG): container finished" podID="967ed5b3-0337-40d9-872d-aa7a02b7c552" containerID="e4b7d16b381a583a2104ef7c5c623a508a16f420a1190af53304321ccdf38f44" exitCode=0 Apr 23 13:31:55.979603 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:55.979185 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvwgw" event={"ID":"967ed5b3-0337-40d9-872d-aa7a02b7c552","Type":"ContainerDied","Data":"e4b7d16b381a583a2104ef7c5c623a508a16f420a1190af53304321ccdf38f44"} Apr 23 13:31:55.979603 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:55.979595 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:55.979730 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:55.979620 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:55.995788 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:55.995750 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2" Apr 23 13:31:56.182604 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:56.182555 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hclwj"] Apr 23 13:31:56.182751 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:56.182704 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj" Apr 23 13:31:56.182824 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:56.182805 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3" Apr 23 13:31:56.187314 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:56.187290 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ptldl"] Apr 23 13:31:56.187434 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:56.187401 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl" Apr 23 13:31:56.187525 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:56.187499 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ptldl" podUID="1344ba49-27f1-41a6-94d2-2e85595b528d" Apr 23 13:31:56.190937 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:56.190914 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gdstf"] Apr 23 13:31:56.191045 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:56.191003 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:31:56.191098 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:56.191085 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d" Apr 23 13:31:56.983270 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:56.983191 2565 generic.go:358] "Generic (PLEG): container finished" podID="967ed5b3-0337-40d9-872d-aa7a02b7c552" containerID="2870d9c6a5f0e79b8048cbb21bdda89da9cd4ed8d12c8f44d5071c4694bbb087" exitCode=0 Apr 23 13:31:56.983602 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:56.983277 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvwgw" event={"ID":"967ed5b3-0337-40d9-872d-aa7a02b7c552","Type":"ContainerDied","Data":"2870d9c6a5f0e79b8048cbb21bdda89da9cd4ed8d12c8f44d5071c4694bbb087"} Apr 23 13:31:57.835607 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:57.835577 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj" Apr 23 13:31:57.835607 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:57.835602 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:31:57.835847 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:57.835698 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3" Apr 23 13:31:57.835847 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:57.835718 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl" Apr 23 13:31:57.835847 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:57.835819 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d" Apr 23 13:31:57.835995 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:57.835893 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ptldl" podUID="1344ba49-27f1-41a6-94d2-2e85595b528d" Apr 23 13:31:59.837255 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:59.837220 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:31:59.837988 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:59.837341 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj" Apr 23 13:31:59.837988 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:59.837464 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hclwj" podUID="d6413ec2-e315-417e-9b7d-ce057e4f10a3" Apr 23 13:31:59.837988 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:59.837485 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdstf" podUID="6d6b50d4-32de-4031-b4e3-a88d3ce08d4d" Apr 23 13:31:59.837988 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:31:59.837887 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl" Apr 23 13:31:59.837988 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:31:59.837987 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ptldl" podUID="1344ba49-27f1-41a6-94d2-2e85595b528d" Apr 23 13:32:01.576753 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.576569 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-187.ec2.internal" event="NodeReady" Apr 23 13:32:01.577153 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.576882 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 13:32:01.615799 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.615768 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8"] Apr 23 13:32:01.647404 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.646344 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rtfn8"] Apr 23 13:32:01.647404 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.646732 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8" Apr 23 13:32:01.649737 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.649712 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-7pnvt\"" Apr 23 13:32:01.649873 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.649712 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 23 13:32:01.649942 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.649918 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 23 13:32:01.650295 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.650143 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:32:01.666227 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.666203 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj"] Apr 23 13:32:01.666389 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.666371 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8" Apr 23 13:32:01.670750 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.670509 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 23 13:32:01.670750 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.670678 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 23 13:32:01.670945 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.670837 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:32:01.670945 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.670925 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 23 13:32:01.672524 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.672319 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-k2557\"" Apr 23 13:32:01.678051 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.678028 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 23 13:32:01.696772 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.696739 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4sh6s"] Apr 23 13:32:01.696909 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.696880 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj" Apr 23 13:32:01.699774 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.699733 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 23 13:32:01.699899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.699743 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-pxl5r\"" Apr 23 13:32:01.699899 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.699865 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:32:01.700130 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.700116 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 23 13:32:01.700192 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.700159 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 23 13:32:01.717916 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.717892 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697697656d-5zdsf"] Apr 23 13:32:01.718066 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.718049 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4sh6s" Apr 23 13:32:01.721217 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.721184 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-cgw7t\"" Apr 23 13:32:01.721323 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.721275 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:32:01.721477 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.721452 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 23 13:32:01.743306 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.743274 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"] Apr 23 13:32:01.743433 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.743403 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697697656d-5zdsf" Apr 23 13:32:01.746297 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.746163 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 13:32:01.746297 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.746245 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-cq79g\"" Apr 23 13:32:01.746297 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.746266 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 13:32:01.746297 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.746270 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 13:32:01.751602 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.751584 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 13:32:01.762375 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.762355 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g"] Apr 23 13:32:01.762540 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.762524 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls" Apr 23 13:32:01.765267 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.765239 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 23 13:32:01.765368 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.765293 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 13:32:01.765588 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.765566 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-ftzcm\"" Apr 23 13:32:01.765679 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.765614 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 13:32:01.766013 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.765992 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 23 13:32:01.774014 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.773992 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrmk8\" (UID: \"e7d94bc3-8733-4bd5-b1de-635975dfe4bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8" Apr 23 13:32:01.774103 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.774045 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34a5e8b5-8ca7-40e3-978f-439d854e09b0-config\") pod \"console-operator-9d4b6777b-rtfn8\" 
(UID: \"34a5e8b5-8ca7-40e3-978f-439d854e09b0\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8" Apr 23 13:32:01.774103 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.774089 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34a5e8b5-8ca7-40e3-978f-439d854e09b0-serving-cert\") pod \"console-operator-9d4b6777b-rtfn8\" (UID: \"34a5e8b5-8ca7-40e3-978f-439d854e09b0\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8" Apr 23 13:32:01.774219 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.774139 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnkwb\" (UniqueName: \"kubernetes.io/projected/34a5e8b5-8ca7-40e3-978f-439d854e09b0-kube-api-access-mnkwb\") pod \"console-operator-9d4b6777b-rtfn8\" (UID: \"34a5e8b5-8ca7-40e3-978f-439d854e09b0\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8" Apr 23 13:32:01.774219 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.774183 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn7hl\" (UniqueName: \"kubernetes.io/projected/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-kube-api-access-dn7hl\") pod \"cluster-samples-operator-6dc5bdb6b4-wrmk8\" (UID: \"e7d94bc3-8733-4bd5-b1de-635975dfe4bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8" Apr 23 13:32:01.774219 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.774209 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34a5e8b5-8ca7-40e3-978f-439d854e09b0-trusted-ca\") pod \"console-operator-9d4b6777b-rtfn8\" (UID: \"34a5e8b5-8ca7-40e3-978f-439d854e09b0\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8" Apr 23 13:32:01.782751 
ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.782733 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-c6hg5"] Apr 23 13:32:01.782901 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.782883 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g" Apr 23 13:32:01.785560 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.785539 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-qzn8t\"" Apr 23 13:32:01.785560 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.785551 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:32:01.785706 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.785604 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 23 13:32:01.786019 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.785869 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 23 13:32:01.786100 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.786016 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 23 13:32:01.799663 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.799643 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7b994fd948-vpkgz"] Apr 23 13:32:01.799831 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.799808 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-c6hg5" Apr 23 13:32:01.802237 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.802221 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 23 13:32:01.802734 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.802606 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 13:32:01.802734 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.802624 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 13:32:01.802734 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.802686 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 23 13:32:01.802734 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.802686 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-b7j5z\"" Apr 23 13:32:01.807320 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.807146 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 23 13:32:01.814898 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.813990 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8"] Apr 23 13:32:01.814898 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.814017 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4sh6s"] Apr 23 13:32:01.814898 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.814035 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"] Apr 23 13:32:01.814898 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.814049 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-knl89"] Apr 23 13:32:01.814898 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.814156 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7b994fd948-vpkgz" Apr 23 13:32:01.816860 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.816844 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 23 13:32:01.816995 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.816967 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 23 13:32:01.817089 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.816936 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 23 13:32:01.817167 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.817123 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 23 13:32:01.817228 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.817174 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 23 13:32:01.817228 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.817178 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 23 13:32:01.817318 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.817281 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-qt8c2\"" Apr 23 13:32:01.833276 
ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.833212 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-c6hg5"]
Apr 23 13:32:01.833276 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.833242 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj"]
Apr 23 13:32:01.833276 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.833257 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rtfn8"]
Apr 23 13:32:01.833276 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.833273 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vf74v"]
Apr 23 13:32:01.833524 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.833353 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89"
Apr 23 13:32:01.836147 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.836125 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 23 13:32:01.836251 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.836231 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 23 13:32:01.836251 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.836244 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-7dv25\""
Apr 23 13:32:01.849011 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.848984 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-d6r6p"]
Apr 23 13:32:01.849120 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.849104 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl"
Apr 23 13:32:01.849179 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.849123 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:32:01.849179 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.849130 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vf74v"
Apr 23 13:32:01.849473 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.849456 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:32:01.851945 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.851927 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 13:32:01.852449 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.852260 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 13:32:01.852449 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.852262 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 13:32:01.852449 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.852310 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 13:32:01.852449 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.852353 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4t8gv\""
Apr 23 13:32:01.852449 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.852277 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfgql\""
Apr 23 13:32:01.852449 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.852391 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 13:32:01.852840 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.852685 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 13:32:01.852840 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.852751 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c6f6d\""
Apr 23 13:32:01.866888 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.866867 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d6r6p"
Apr 23 13:32:01.869582 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.869564 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-bqt9b\""
Apr 23 13:32:01.872426 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.872407 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697697656d-5zdsf"]
Apr 23 13:32:01.872530 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.872433 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fzqps"]
Apr 23 13:32:01.874746 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.874723 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn7hl\" (UniqueName: \"kubernetes.io/projected/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-kube-api-access-dn7hl\") pod \"cluster-samples-operator-6dc5bdb6b4-wrmk8\" (UID: \"e7d94bc3-8733-4bd5-b1de-635975dfe4bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8"
Apr 23 13:32:01.874845 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.874774 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34a5e8b5-8ca7-40e3-978f-439d854e09b0-trusted-ca\") pod \"console-operator-9d4b6777b-rtfn8\" (UID: \"34a5e8b5-8ca7-40e3-978f-439d854e09b0\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8"
Apr 23 13:32:01.874845 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.874831 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/512b3fcf-e8c1-4eb7-b755-9d8efa3083a5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-7xk4g\" (UID: \"512b3fcf-e8c1-4eb7-b755-9d8efa3083a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g"
Apr 23 13:32:01.874929 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.874881 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.874929 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.874904 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-bound-sa-token\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.874999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.874943 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrmk8\" (UID: \"e7d94bc3-8733-4bd5-b1de-635975dfe4bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8"
Apr 23 13:32:01.874999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.874970 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e2fed6c1-6174-4b2b-884a-12bca4486716-image-registry-private-configuration\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.875072 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.874999 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-certificates\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.875072 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875019 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2fed6c1-6174-4b2b-884a-12bca4486716-ca-trust-extracted\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.875072 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875035 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf6lt\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-kube-api-access-wf6lt\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.875271 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875091 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2fed6c1-6174-4b2b-884a-12bca4486716-installation-pull-secrets\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.875271 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:01.875121 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 13:32:01.875271 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875180 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506e9f6c-b41d-4cad-9333-d952e8630ef9-config\") pod \"service-ca-operator-d6fc45fc5-9flbj\" (UID: \"506e9f6c-b41d-4cad-9333-d952e8630ef9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj"
Apr 23 13:32:01.875271 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:01.875231 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls podName:e7d94bc3-8733-4bd5-b1de-635975dfe4bd nodeName:}" failed. No retries permitted until 2026-04-23 13:32:02.375199872 +0000 UTC m=+33.113743828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wrmk8" (UID: "e7d94bc3-8733-4bd5-b1de-635975dfe4bd") : secret "samples-operator-tls" not found
Apr 23 13:32:01.875271 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875266 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwczn\" (UniqueName: \"kubernetes.io/projected/506e9f6c-b41d-4cad-9333-d952e8630ef9-kube-api-access-dwczn\") pod \"service-ca-operator-d6fc45fc5-9flbj\" (UID: \"506e9f6c-b41d-4cad-9333-d952e8630ef9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj"
Apr 23 13:32:01.875522 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875313 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"
Apr 23 13:32:01.875522 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875337 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"
Apr 23 13:32:01.875522 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875368 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/512b3fcf-e8c1-4eb7-b755-9d8efa3083a5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-7xk4g\" (UID: \"512b3fcf-e8c1-4eb7-b755-9d8efa3083a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g"
Apr 23 13:32:01.875522 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875394 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhh4c\" (UniqueName: \"kubernetes.io/projected/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-kube-api-access-vhh4c\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"
Apr 23 13:32:01.875522 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875421 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2fed6c1-6174-4b2b-884a-12bca4486716-trusted-ca\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.875522 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875454 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506e9f6c-b41d-4cad-9333-d952e8630ef9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9flbj\" (UID: \"506e9f6c-b41d-4cad-9333-d952e8630ef9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj"
Apr 23 13:32:01.875522 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875478 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8m77\" (UniqueName: \"kubernetes.io/projected/512b3fcf-e8c1-4eb7-b755-9d8efa3083a5-kube-api-access-t8m77\") pod \"kube-storage-version-migrator-operator-6769c5d45-7xk4g\" (UID: \"512b3fcf-e8c1-4eb7-b755-9d8efa3083a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g"
Apr 23 13:32:01.875867 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875527 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34a5e8b5-8ca7-40e3-978f-439d854e09b0-config\") pod \"console-operator-9d4b6777b-rtfn8\" (UID: \"34a5e8b5-8ca7-40e3-978f-439d854e09b0\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8"
Apr 23 13:32:01.875867 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875552 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34a5e8b5-8ca7-40e3-978f-439d854e09b0-serving-cert\") pod \"console-operator-9d4b6777b-rtfn8\" (UID: \"34a5e8b5-8ca7-40e3-978f-439d854e09b0\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8"
Apr 23 13:32:01.875867 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875577 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnkwb\" (UniqueName: \"kubernetes.io/projected/34a5e8b5-8ca7-40e3-978f-439d854e09b0-kube-api-access-mnkwb\") pod \"console-operator-9d4b6777b-rtfn8\" (UID: \"34a5e8b5-8ca7-40e3-978f-439d854e09b0\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8"
Apr 23 13:32:01.875867 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.875621 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74vdb\" (UniqueName: \"kubernetes.io/projected/aebae10b-0ca7-46fd-860e-f45c0c031024-kube-api-access-74vdb\") pod \"volume-data-source-validator-7c6cbb6c87-4sh6s\" (UID: \"aebae10b-0ca7-46fd-860e-f45c0c031024\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4sh6s"
Apr 23 13:32:01.876084 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.876065 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34a5e8b5-8ca7-40e3-978f-439d854e09b0-trusted-ca\") pod \"console-operator-9d4b6777b-rtfn8\" (UID: \"34a5e8b5-8ca7-40e3-978f-439d854e09b0\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8"
Apr 23 13:32:01.876151 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.876133 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34a5e8b5-8ca7-40e3-978f-439d854e09b0-config\") pod \"console-operator-9d4b6777b-rtfn8\" (UID: \"34a5e8b5-8ca7-40e3-978f-439d854e09b0\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8"
Apr 23 13:32:01.881512 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.881489 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34a5e8b5-8ca7-40e3-978f-439d854e09b0-serving-cert\") pod \"console-operator-9d4b6777b-rtfn8\" (UID: \"34a5e8b5-8ca7-40e3-978f-439d854e09b0\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8"
Apr 23 13:32:01.885112 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.885088 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn7hl\" (UniqueName: \"kubernetes.io/projected/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-kube-api-access-dn7hl\") pod \"cluster-samples-operator-6dc5bdb6b4-wrmk8\" (UID: \"e7d94bc3-8733-4bd5-b1de-635975dfe4bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8"
Apr 23 13:32:01.885504 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.885489 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnkwb\" (UniqueName: \"kubernetes.io/projected/34a5e8b5-8ca7-40e3-978f-439d854e09b0-kube-api-access-mnkwb\") pod \"console-operator-9d4b6777b-rtfn8\" (UID: \"34a5e8b5-8ca7-40e3-978f-439d854e09b0\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8"
Apr 23 13:32:01.888175 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.888132 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vf74v"]
Apr 23 13:32:01.888175 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.888160 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g"]
Apr 23 13:32:01.888175 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.888170 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-d6r6p"]
Apr 23 13:32:01.888175 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.888180 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-knl89"]
Apr 23 13:32:01.888175 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.888193 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fzqps"]
Apr 23 13:32:01.888175 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.888206 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7b994fd948-vpkgz"]
Apr 23 13:32:01.888496 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.888307 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fzqps"
Apr 23 13:32:01.891531 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.891448 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 13:32:01.891531 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.891458 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 13:32:01.891531 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.891479 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4c4k9\""
Apr 23 13:32:01.891883 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.891813 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 13:32:01.976030 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.975995 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506e9f6c-b41d-4cad-9333-d952e8630ef9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9flbj\" (UID: \"506e9f6c-b41d-4cad-9333-d952e8630ef9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj"
Apr 23 13:32:01.976188 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976043 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hlvn\" (UniqueName: \"kubernetes.io/projected/20030382-369a-4a7a-bdb3-477e6d873b00-kube-api-access-9hlvn\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:01.976188 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976063 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-stats-auth\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:01.976188 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976080 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:01.976188 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976101 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert\") pod \"ingress-canary-fzqps\" (UID: \"9a154f5a-c08f-4f54-b3d7-fea632c012c6\") " pod="openshift-ingress-canary/ingress-canary-fzqps"
Apr 23 13:32:01.976188 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976158 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.976403 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976190 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-bound-sa-token\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.976403 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:01.976258 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:32:01.976403 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:01.976279 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-697697656d-5zdsf: secret "image-registry-tls" not found
Apr 23 13:32:01.976403 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976321 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e2fed6c1-6174-4b2b-884a-12bca4486716-image-registry-private-configuration\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.976403 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:01.976352 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls podName:e2fed6c1-6174-4b2b-884a-12bca4486716 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:02.476328705 +0000 UTC m=+33.214872655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls") pod "image-registry-697697656d-5zdsf" (UID: "e2fed6c1-6174-4b2b-884a-12bca4486716") : secret "image-registry-tls" not found
Apr 23 13:32:01.976403 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976390 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506e9f6c-b41d-4cad-9333-d952e8630ef9-config\") pod \"service-ca-operator-d6fc45fc5-9flbj\" (UID: \"506e9f6c-b41d-4cad-9333-d952e8630ef9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj"
Apr 23 13:32:01.976622 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976419 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwczn\" (UniqueName: \"kubernetes.io/projected/506e9f6c-b41d-4cad-9333-d952e8630ef9-kube-api-access-dwczn\") pod \"service-ca-operator-d6fc45fc5-9flbj\" (UID: \"506e9f6c-b41d-4cad-9333-d952e8630ef9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj"
Apr 23 13:32:01.976622 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976478 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxwrz\" (UniqueName: \"kubernetes.io/projected/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-kube-api-access-jxwrz\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5"
Apr 23 13:32:01.976622 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976504 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5880bb15-7341-40f4-a23b-983d2d71912f-tmp-dir\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v"
Apr 23 13:32:01.976622 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976535 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2fed6c1-6174-4b2b-884a-12bca4486716-ca-trust-extracted\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.976622 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976562 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wf6lt\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-kube-api-access-wf6lt\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.976622 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976590 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-default-certificate\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:01.976622 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976616 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-tmp\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976646 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2fed6c1-6174-4b2b-884a-12bca4486716-installation-pull-secrets\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976850 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5880bb15-7341-40f4-a23b-983d2d71912f-config-volume\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976918 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74vdb\" (UniqueName: \"kubernetes.io/projected/aebae10b-0ca7-46fd-860e-f45c0c031024-kube-api-access-74vdb\") pod \"volume-data-source-validator-7c6cbb6c87-4sh6s\" (UID: \"aebae10b-0ca7-46fd-860e-f45c0c031024\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4sh6s"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976952 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcbtl\" (UniqueName: \"kubernetes.io/projected/5880bb15-7341-40f4-a23b-983d2d71912f-kube-api-access-gcbtl\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976983 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jdfb\" (UniqueName: \"kubernetes.io/projected/1dffdca5-c142-4766-a823-9d817e2c5ef5-kube-api-access-4jdfb\") pod \"network-check-source-8894fc9bd-d6r6p\" (UID: \"1dffdca5-c142-4766-a823-9d817e2c5ef5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d6r6p"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.976991 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2fed6c1-6174-4b2b-884a-12bca4486716-ca-trust-extracted\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977028 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/512b3fcf-e8c1-4eb7-b755-9d8efa3083a5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-7xk4g\" (UID: \"512b3fcf-e8c1-4eb7-b755-9d8efa3083a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977035 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506e9f6c-b41d-4cad-9333-d952e8630ef9-config\") pod \"service-ca-operator-d6fc45fc5-9flbj\" (UID: \"506e9f6c-b41d-4cad-9333-d952e8630ef9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977068 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-knl89\" (UID: \"cd536203-7ab7-44ff-86aa-4b70ff820188\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977099 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977131 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/512b3fcf-e8c1-4eb7-b755-9d8efa3083a5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-7xk4g\" (UID: \"512b3fcf-e8c1-4eb7-b755-9d8efa3083a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977173 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zqkd\" (UniqueName: \"kubernetes.io/projected/9a154f5a-c08f-4f54-b3d7-fea632c012c6-kube-api-access-9zqkd\") pod \"ingress-canary-fzqps\" (UID: \"9a154f5a-c08f-4f54-b3d7-fea632c012c6\") " pod="openshift-ingress-canary/ingress-canary-fzqps"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977200 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-snapshots\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977232 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-certificates\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:01.978632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977267 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8m77\" (UniqueName: \"kubernetes.io/projected/512b3fcf-e8c1-4eb7-b755-9d8efa3083a5-kube-api-access-t8m77\") pod \"kube-storage-version-migrator-operator-6769c5d45-7xk4g\" (UID: \"512b3fcf-e8c1-4eb7-b755-9d8efa3083a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g"
Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977322 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-serving-cert\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5"
Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977365 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-service-ca-bundle\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5"
Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977391 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cd536203-7ab7-44ff-86aa-4b70ff820188-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-knl89\" (UID: \"cd536203-7ab7-44ff-86aa-4b70ff820188\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89"
Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977424 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"
Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977453 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977478 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v"
Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977502 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"
Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977521 2565
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhh4c\" (UniqueName: \"kubernetes.io/projected/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-kube-api-access-vhh4c\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls" Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:01.977679 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977729 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2fed6c1-6174-4b2b-884a-12bca4486716-trusted-ca\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf" Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.977747 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-certificates\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf" Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:01.977837 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls podName:b3e8f6c3-e685-4e07-abe9-e57a6f11b37a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:02.477813866 +0000 UTC m=+33.216357809 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-txxls" (UID: "b3e8f6c3-e685-4e07-abe9-e57a6f11b37a") : secret "cluster-monitoring-operator-tls" not found Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.978243 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/512b3fcf-e8c1-4eb7-b755-9d8efa3083a5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-7xk4g\" (UID: \"512b3fcf-e8c1-4eb7-b755-9d8efa3083a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g" Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.978259 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls" Apr 23 13:32:01.979202 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.978636 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2fed6c1-6174-4b2b-884a-12bca4486716-trusted-ca\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf" Apr 23 13:32:01.979628 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.979138 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506e9f6c-b41d-4cad-9333-d952e8630ef9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9flbj\" (UID: 
\"506e9f6c-b41d-4cad-9333-d952e8630ef9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj" Apr 23 13:32:01.979628 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.979385 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e2fed6c1-6174-4b2b-884a-12bca4486716-image-registry-private-configuration\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf" Apr 23 13:32:01.979628 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.979463 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2fed6c1-6174-4b2b-884a-12bca4486716-installation-pull-secrets\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf" Apr 23 13:32:01.979823 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.979800 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/512b3fcf-e8c1-4eb7-b755-9d8efa3083a5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-7xk4g\" (UID: \"512b3fcf-e8c1-4eb7-b755-9d8efa3083a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g" Apr 23 13:32:01.981561 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.981536 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8" Apr 23 13:32:01.986234 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.986211 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-bound-sa-token\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf" Apr 23 13:32:01.986634 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.986611 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74vdb\" (UniqueName: \"kubernetes.io/projected/aebae10b-0ca7-46fd-860e-f45c0c031024-kube-api-access-74vdb\") pod \"volume-data-source-validator-7c6cbb6c87-4sh6s\" (UID: \"aebae10b-0ca7-46fd-860e-f45c0c031024\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4sh6s" Apr 23 13:32:01.986833 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.986777 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf6lt\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-kube-api-access-wf6lt\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf" Apr 23 13:32:01.987151 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.987128 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhh4c\" (UniqueName: \"kubernetes.io/projected/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-kube-api-access-vhh4c\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls" Apr 23 13:32:01.987803 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.987281 2565 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dwczn\" (UniqueName: \"kubernetes.io/projected/506e9f6c-b41d-4cad-9333-d952e8630ef9-kube-api-access-dwczn\") pod \"service-ca-operator-d6fc45fc5-9flbj\" (UID: \"506e9f6c-b41d-4cad-9333-d952e8630ef9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj" Apr 23 13:32:01.987803 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:01.987740 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8m77\" (UniqueName: \"kubernetes.io/projected/512b3fcf-e8c1-4eb7-b755-9d8efa3083a5-kube-api-access-t8m77\") pod \"kube-storage-version-migrator-operator-6769c5d45-7xk4g\" (UID: \"512b3fcf-e8c1-4eb7-b755-9d8efa3083a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g" Apr 23 13:32:02.007015 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.006989 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj" Apr 23 13:32:02.029770 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.029726 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4sh6s" Apr 23 13:32:02.079071 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079039 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz" Apr 23 13:32:02.079071 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079075 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert\") pod \"ingress-canary-fzqps\" (UID: \"9a154f5a-c08f-4f54-b3d7-fea632c012c6\") " pod="openshift-ingress-canary/ingress-canary-fzqps" Apr 23 13:32:02.079296 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079136 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxwrz\" (UniqueName: \"kubernetes.io/projected/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-kube-api-access-jxwrz\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5" Apr 23 13:32:02.079296 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079161 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5880bb15-7341-40f4-a23b-983d2d71912f-tmp-dir\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v" Apr 23 13:32:02.079296 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.079171 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 13:32:02.079296 ip-10-0-137-187 
kubenswrapper[2565]: I0423 13:32:02.079187 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-default-certificate\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz" Apr 23 13:32:02.079296 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079210 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-tmp\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5" Apr 23 13:32:02.079296 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.079226 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:02.079296 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079238 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5880bb15-7341-40f4-a23b-983d2d71912f-config-volume\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v" Apr 23 13:32:02.079296 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.079283 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs podName:20030382-369a-4a7a-bdb3-477e6d873b00 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:02.579261794 +0000 UTC m=+33.317805757 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs") pod "router-default-7b994fd948-vpkgz" (UID: "20030382-369a-4a7a-bdb3-477e6d873b00") : secret "router-metrics-certs-default" not found Apr 23 13:32:02.079725 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079330 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcbtl\" (UniqueName: \"kubernetes.io/projected/5880bb15-7341-40f4-a23b-983d2d71912f-kube-api-access-gcbtl\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v" Apr 23 13:32:02.079725 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.079354 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert podName:9a154f5a-c08f-4f54-b3d7-fea632c012c6 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:02.579338434 +0000 UTC m=+33.317882383 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert") pod "ingress-canary-fzqps" (UID: "9a154f5a-c08f-4f54-b3d7-fea632c012c6") : secret "canary-serving-cert" not found Apr 23 13:32:02.079725 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079388 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jdfb\" (UniqueName: \"kubernetes.io/projected/1dffdca5-c142-4766-a823-9d817e2c5ef5-kube-api-access-4jdfb\") pod \"network-check-source-8894fc9bd-d6r6p\" (UID: \"1dffdca5-c142-4766-a823-9d817e2c5ef5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d6r6p" Apr 23 13:32:02.079725 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079576 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-knl89\" (UID: \"cd536203-7ab7-44ff-86aa-4b70ff820188\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89" Apr 23 13:32:02.079725 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079619 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5" Apr 23 13:32:02.079725 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079653 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zqkd\" (UniqueName: \"kubernetes.io/projected/9a154f5a-c08f-4f54-b3d7-fea632c012c6-kube-api-access-9zqkd\") pod \"ingress-canary-fzqps\" (UID: \"9a154f5a-c08f-4f54-b3d7-fea632c012c6\") " 
pod="openshift-ingress-canary/ingress-canary-fzqps" Apr 23 13:32:02.079725 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079616 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5880bb15-7341-40f4-a23b-983d2d71912f-tmp-dir\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v" Apr 23 13:32:02.079725 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079680 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-snapshots\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5" Apr 23 13:32:02.079725 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.079688 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 13:32:02.080196 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.079734 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert podName:cd536203-7ab7-44ff-86aa-4b70ff820188 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:02.579720692 +0000 UTC m=+33.318264651 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-knl89" (UID: "cd536203-7ab7-44ff-86aa-4b70ff820188") : secret "networking-console-plugin-cert" not found Apr 23 13:32:02.080196 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079778 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-serving-cert\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5" Apr 23 13:32:02.080196 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079817 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-service-ca-bundle\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5" Apr 23 13:32:02.080196 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079846 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cd536203-7ab7-44ff-86aa-4b70ff820188-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-knl89\" (UID: \"cd536203-7ab7-44ff-86aa-4b70ff820188\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89" Apr 23 13:32:02.080196 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079879 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " 
pod="openshift-ingress/router-default-7b994fd948-vpkgz" Apr 23 13:32:02.080196 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079890 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-tmp\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5" Apr 23 13:32:02.080196 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079903 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v" Apr 23 13:32:02.080196 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.079912 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5880bb15-7341-40f4-a23b-983d2d71912f-config-volume\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v" Apr 23 13:32:02.080196 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.080004 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:02.080196 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.080012 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle podName:20030382-369a-4a7a-bdb3-477e6d873b00 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:02.579998891 +0000 UTC m=+33.318542837 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle") pod "router-default-7b994fd948-vpkgz" (UID: "20030382-369a-4a7a-bdb3-477e6d873b00") : configmap references non-existent config key: service-ca.crt Apr 23 13:32:02.080196 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.080035 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls podName:5880bb15-7341-40f4-a23b-983d2d71912f nodeName:}" failed. No retries permitted until 2026-04-23 13:32:02.580025755 +0000 UTC m=+33.318569699 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls") pod "dns-default-vf74v" (UID: "5880bb15-7341-40f4-a23b-983d2d71912f") : secret "dns-default-metrics-tls" not found Apr 23 13:32:02.080196 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.080085 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hlvn\" (UniqueName: \"kubernetes.io/projected/20030382-369a-4a7a-bdb3-477e6d873b00-kube-api-access-9hlvn\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz" Apr 23 13:32:02.080196 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.080115 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-stats-auth\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz" Apr 23 13:32:02.080779 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.080537 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-snapshots\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5" Apr 23 13:32:02.080779 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.080581 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5" Apr 23 13:32:02.080779 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.080602 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cd536203-7ab7-44ff-86aa-4b70ff820188-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-knl89\" (UID: \"cd536203-7ab7-44ff-86aa-4b70ff820188\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89" Apr 23 13:32:02.080919 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.080784 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-service-ca-bundle\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5" Apr 23 13:32:02.082740 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.082718 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-serving-cert\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5" Apr 23 13:32:02.082868 ip-10-0-137-187 kubenswrapper[2565]: I0423 
13:32:02.082743 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-stats-auth\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:02.082868 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.082815 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-default-certificate\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:02.092151 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.092091 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jdfb\" (UniqueName: \"kubernetes.io/projected/1dffdca5-c142-4766-a823-9d817e2c5ef5-kube-api-access-4jdfb\") pod \"network-check-source-8894fc9bd-d6r6p\" (UID: \"1dffdca5-c142-4766-a823-9d817e2c5ef5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d6r6p"
Apr 23 13:32:02.092151 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.092113 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxwrz\" (UniqueName: \"kubernetes.io/projected/334930fe-79d2-4d7d-9fd2-1c2db1eaf771-kube-api-access-jxwrz\") pod \"insights-operator-585dfdc468-c6hg5\" (UID: \"334930fe-79d2-4d7d-9fd2-1c2db1eaf771\") " pod="openshift-insights/insights-operator-585dfdc468-c6hg5"
Apr 23 13:32:02.092308 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.092169 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zqkd\" (UniqueName: \"kubernetes.io/projected/9a154f5a-c08f-4f54-b3d7-fea632c012c6-kube-api-access-9zqkd\") pod \"ingress-canary-fzqps\" (UID: \"9a154f5a-c08f-4f54-b3d7-fea632c012c6\") " pod="openshift-ingress-canary/ingress-canary-fzqps"
Apr 23 13:32:02.092308 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.092216 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcbtl\" (UniqueName: \"kubernetes.io/projected/5880bb15-7341-40f4-a23b-983d2d71912f-kube-api-access-gcbtl\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v"
Apr 23 13:32:02.092408 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.092312 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g"
Apr 23 13:32:02.092476 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.092452 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hlvn\" (UniqueName: \"kubernetes.io/projected/20030382-369a-4a7a-bdb3-477e6d873b00-kube-api-access-9hlvn\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:02.109601 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.109577 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-c6hg5"
Apr 23 13:32:02.202299 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.202270 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d6r6p"
Apr 23 13:32:02.383311 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.383226 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrmk8\" (UID: \"e7d94bc3-8733-4bd5-b1de-635975dfe4bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8"
Apr 23 13:32:02.383478 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.383388 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 13:32:02.383478 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.383464 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls podName:e7d94bc3-8733-4bd5-b1de-635975dfe4bd nodeName:}" failed. No retries permitted until 2026-04-23 13:32:03.383444057 +0000 UTC m=+34.121988015 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wrmk8" (UID: "e7d94bc3-8733-4bd5-b1de-635975dfe4bd") : secret "samples-operator-tls" not found
Apr 23 13:32:02.484895 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.484624 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:02.484895 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.484733 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs\") pod \"network-metrics-daemon-gdstf\" (UID: \"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d\") " pod="openshift-multus/network-metrics-daemon-gdstf"
Apr 23 13:32:02.485392 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.485356 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"
Apr 23 13:32:02.485985 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.485957 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:32:02.485985 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.485986 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-697697656d-5zdsf: secret "image-registry-tls" not found
Apr 23 13:32:02.486126 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.486064 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls podName:e2fed6c1-6174-4b2b-884a-12bca4486716 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:03.48604263 +0000 UTC m=+34.224586587 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls") pod "image-registry-697697656d-5zdsf" (UID: "e2fed6c1-6174-4b2b-884a-12bca4486716") : secret "image-registry-tls" not found
Apr 23 13:32:02.486820 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.486675 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 13:32:02.486820 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.486748 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs podName:6d6b50d4-32de-4031-b4e3-a88d3ce08d4d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:34.486729527 +0000 UTC m=+65.225273471 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs") pod "network-metrics-daemon-gdstf" (UID: "6d6b50d4-32de-4031-b4e3-a88d3ce08d4d") : secret "metrics-daemon-secret" not found
Apr 23 13:32:02.491775 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.487105 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:32:02.491775 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.487180 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls podName:b3e8f6c3-e685-4e07-abe9-e57a6f11b37a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:03.487162663 +0000 UTC m=+34.225706615 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-txxls" (UID: "b3e8f6c3-e685-4e07-abe9-e57a6f11b37a") : secret "cluster-monitoring-operator-tls" not found
Apr 23 13:32:02.587172 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.587132 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:02.587546 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.587188 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v"
Apr 23 13:32:02.587546 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.587248 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:02.587546 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.587274 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert\") pod \"ingress-canary-fzqps\" (UID: \"9a154f5a-c08f-4f54-b3d7-fea632c012c6\") " pod="openshift-ingress-canary/ingress-canary-fzqps"
Apr 23 13:32:02.587546 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.587317 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle podName:20030382-369a-4a7a-bdb3-477e6d873b00 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:03.587297472 +0000 UTC m=+34.325841440 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle") pod "router-default-7b994fd948-vpkgz" (UID: "20030382-369a-4a7a-bdb3-477e6d873b00") : configmap references non-existent config key: service-ca.crt
Apr 23 13:32:02.587546 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.587364 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:02.587546 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.587397 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert podName:9a154f5a-c08f-4f54-b3d7-fea632c012c6 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:03.587386589 +0000 UTC m=+34.325930547 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert") pod "ingress-canary-fzqps" (UID: "9a154f5a-c08f-4f54-b3d7-fea632c012c6") : secret "canary-serving-cert" not found
Apr 23 13:32:02.587546 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.587441 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqlf\" (UniqueName: \"kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf\") pod \"network-check-target-hclwj\" (UID: \"d6413ec2-e315-417e-9b7d-ce057e4f10a3\") " pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:32:02.587546 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.587469 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-knl89\" (UID: \"cd536203-7ab7-44ff-86aa-4b70ff820188\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89"
Apr 23 13:32:02.587992 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.587567 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 13:32:02.587992 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.587599 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert podName:cd536203-7ab7-44ff-86aa-4b70ff820188 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:03.587588561 +0000 UTC m=+34.326132505 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-knl89" (UID: "cd536203-7ab7-44ff-86aa-4b70ff820188") : secret "networking-console-plugin-cert" not found
Apr 23 13:32:02.587992 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.587645 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:02.587992 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.587672 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls podName:5880bb15-7341-40f4-a23b-983d2d71912f nodeName:}" failed. No retries permitted until 2026-04-23 13:32:03.587663463 +0000 UTC m=+34.326207407 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls") pod "dns-default-vf74v" (UID: "5880bb15-7341-40f4-a23b-983d2d71912f") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:02.587992 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.587714 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 13:32:02.587992 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:02.587741 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs podName:20030382-369a-4a7a-bdb3-477e6d873b00 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:03.587731971 +0000 UTC m=+34.326275916 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs") pod "router-default-7b994fd948-vpkgz" (UID: "20030382-369a-4a7a-bdb3-477e6d873b00") : secret "router-metrics-certs-default" not found
Apr 23 13:32:02.597747 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.597190 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxqlf\" (UniqueName: \"kubernetes.io/projected/d6413ec2-e315-417e-9b7d-ce057e4f10a3-kube-api-access-hxqlf\") pod \"network-check-target-hclwj\" (UID: \"d6413ec2-e315-417e-9b7d-ce057e4f10a3\") " pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:32:02.709831 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.709797 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rtfn8"]
Apr 23 13:32:02.712349 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.712326 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-c6hg5"]
Apr 23 13:32:02.722315 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.722297 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-d6r6p"]
Apr 23 13:32:02.728401 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.728376 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g"]
Apr 23 13:32:02.731434 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:02.731403 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34a5e8b5_8ca7_40e3_978f_439d854e09b0.slice/crio-02fc192ddb67d5f6e02ecb17b6c2f987e6491f7850fdfefa10e3caca28056dd3 WatchSource:0}: Error finding container 02fc192ddb67d5f6e02ecb17b6c2f987e6491f7850fdfefa10e3caca28056dd3: Status 404 returned error can't find the container with id 02fc192ddb67d5f6e02ecb17b6c2f987e6491f7850fdfefa10e3caca28056dd3
Apr 23 13:32:02.731636 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:02.731615 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod334930fe_79d2_4d7d_9fd2_1c2db1eaf771.slice/crio-1c21a9fa285bfa8c61f068fcbd537c0f3d4903a023d7b1d847716cffb9503c93 WatchSource:0}: Error finding container 1c21a9fa285bfa8c61f068fcbd537c0f3d4903a023d7b1d847716cffb9503c93: Status 404 returned error can't find the container with id 1c21a9fa285bfa8c61f068fcbd537c0f3d4903a023d7b1d847716cffb9503c93
Apr 23 13:32:02.732323 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:02.732210 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dffdca5_c142_4766_a823_9d817e2c5ef5.slice/crio-e2dc43bbe1f740533f16b9152baa357e53b8428a94be05f9f88230128feb6aca WatchSource:0}: Error finding container e2dc43bbe1f740533f16b9152baa357e53b8428a94be05f9f88230128feb6aca: Status 404 returned error can't find the container with id e2dc43bbe1f740533f16b9152baa357e53b8428a94be05f9f88230128feb6aca
Apr 23 13:32:02.733438 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:02.733185 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod512b3fcf_e8c1_4eb7_b755_9d8efa3083a5.slice/crio-f261357836dc9ec4df511d1f5494e715e034e6844332e4f258eab06715da5ba3 WatchSource:0}: Error finding container f261357836dc9ec4df511d1f5494e715e034e6844332e4f258eab06715da5ba3: Status 404 returned error can't find the container with id f261357836dc9ec4df511d1f5494e715e034e6844332e4f258eab06715da5ba3
Apr 23 13:32:02.736984 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.736965 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4sh6s"]
Apr 23 13:32:02.738504 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.738483 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj"]
Apr 23 13:32:02.745190 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:02.745169 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaebae10b_0ca7_46fd_860e_f45c0c031024.slice/crio-4a6cc11da269604277b040cb8daf38c2ae3786693944e999a74e2ade41d28c87 WatchSource:0}: Error finding container 4a6cc11da269604277b040cb8daf38c2ae3786693944e999a74e2ade41d28c87: Status 404 returned error can't find the container with id 4a6cc11da269604277b040cb8daf38c2ae3786693944e999a74e2ade41d28c87
Apr 23 13:32:02.745578 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:02.745564 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod506e9f6c_b41d_4cad_9333_d952e8630ef9.slice/crio-ae2503e2248744fb7c50d7a5a0cda3f172c64f0d93845a161944ca7517fdf35e WatchSource:0}: Error finding container ae2503e2248744fb7c50d7a5a0cda3f172c64f0d93845a161944ca7517fdf35e: Status 404 returned error can't find the container with id ae2503e2248744fb7c50d7a5a0cda3f172c64f0d93845a161944ca7517fdf35e
Apr 23 13:32:02.794632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.794598 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hclwj"
Apr 23 13:32:02.918077 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.917833 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hclwj"]
Apr 23 13:32:02.923933 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:02.923907 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6413ec2_e315_417e_9b7d_ce057e4f10a3.slice/crio-31efa1645a2e93f6c48161d954cea9721442de01398f2604ad6ca296a61a185c WatchSource:0}: Error finding container 31efa1645a2e93f6c48161d954cea9721442de01398f2604ad6ca296a61a185c: Status 404 returned error can't find the container with id 31efa1645a2e93f6c48161d954cea9721442de01398f2604ad6ca296a61a185c
Apr 23 13:32:02.994662 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.994632 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g" event={"ID":"512b3fcf-e8c1-4eb7-b755-9d8efa3083a5","Type":"ContainerStarted","Data":"f261357836dc9ec4df511d1f5494e715e034e6844332e4f258eab06715da5ba3"}
Apr 23 13:32:02.999908 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:02.999874 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvwgw" event={"ID":"967ed5b3-0337-40d9-872d-aa7a02b7c552","Type":"ContainerStarted","Data":"3dffe0d39308966e0cd9bcc4de15f714a104d271ae9dcb2420b74fce3147d4bd"}
Apr 23 13:32:03.001162 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:03.001126 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d6r6p" event={"ID":"1dffdca5-c142-4766-a823-9d817e2c5ef5","Type":"ContainerStarted","Data":"e2dc43bbe1f740533f16b9152baa357e53b8428a94be05f9f88230128feb6aca"}
Apr 23 13:32:03.002242 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:03.002219 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4sh6s" event={"ID":"aebae10b-0ca7-46fd-860e-f45c0c031024","Type":"ContainerStarted","Data":"4a6cc11da269604277b040cb8daf38c2ae3786693944e999a74e2ade41d28c87"}
Apr 23 13:32:03.003190 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:03.003168 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj" event={"ID":"506e9f6c-b41d-4cad-9333-d952e8630ef9","Type":"ContainerStarted","Data":"ae2503e2248744fb7c50d7a5a0cda3f172c64f0d93845a161944ca7517fdf35e"}
Apr 23 13:32:03.004567 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:03.004547 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-c6hg5" event={"ID":"334930fe-79d2-4d7d-9fd2-1c2db1eaf771","Type":"ContainerStarted","Data":"1c21a9fa285bfa8c61f068fcbd537c0f3d4903a023d7b1d847716cffb9503c93"}
Apr 23 13:32:03.005618 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:03.005601 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hclwj" event={"ID":"d6413ec2-e315-417e-9b7d-ce057e4f10a3","Type":"ContainerStarted","Data":"31efa1645a2e93f6c48161d954cea9721442de01398f2604ad6ca296a61a185c"}
Apr 23 13:32:03.006512 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:03.006492 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8" event={"ID":"34a5e8b5-8ca7-40e3-978f-439d854e09b0","Type":"ContainerStarted","Data":"02fc192ddb67d5f6e02ecb17b6c2f987e6491f7850fdfefa10e3caca28056dd3"}
Apr 23 13:32:03.396295 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:03.396234 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrmk8\" (UID: \"e7d94bc3-8733-4bd5-b1de-635975dfe4bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8"
Apr 23 13:32:03.396504 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.396411 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 13:32:03.396703 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.396616 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls podName:e7d94bc3-8733-4bd5-b1de-635975dfe4bd nodeName:}" failed. No retries permitted until 2026-04-23 13:32:05.39658972 +0000 UTC m=+36.135133667 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wrmk8" (UID: "e7d94bc3-8733-4bd5-b1de-635975dfe4bd") : secret "samples-operator-tls" not found
Apr 23 13:32:03.498193 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:03.498110 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"
Apr 23 13:32:03.498378 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.498293 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:32:03.498378 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.498374 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls podName:b3e8f6c3-e685-4e07-abe9-e57a6f11b37a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:05.498353299 +0000 UTC m=+36.236897258 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-txxls" (UID: "b3e8f6c3-e685-4e07-abe9-e57a6f11b37a") : secret "cluster-monitoring-operator-tls" not found
Apr 23 13:32:03.498781 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:03.498556 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:03.498980 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.498875 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:32:03.498980 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.498899 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-697697656d-5zdsf: secret "image-registry-tls" not found
Apr 23 13:32:03.498980 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.498958 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls podName:e2fed6c1-6174-4b2b-884a-12bca4486716 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:05.498941648 +0000 UTC m=+36.237485597 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls") pod "image-registry-697697656d-5zdsf" (UID: "e2fed6c1-6174-4b2b-884a-12bca4486716") : secret "image-registry-tls" not found
Apr 23 13:32:03.600472 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:03.599382 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-knl89\" (UID: \"cd536203-7ab7-44ff-86aa-4b70ff820188\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89"
Apr 23 13:32:03.600472 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:03.599459 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:03.600472 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:03.599486 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v"
Apr 23 13:32:03.600472 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:03.599535 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:03.600472 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:03.599566 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert\") pod \"ingress-canary-fzqps\" (UID: \"9a154f5a-c08f-4f54-b3d7-fea632c012c6\") " pod="openshift-ingress-canary/ingress-canary-fzqps"
Apr 23 13:32:03.600472 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.599702 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:03.600472 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.599778 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert podName:9a154f5a-c08f-4f54-b3d7-fea632c012c6 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:05.599740701 +0000 UTC m=+36.338284652 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert") pod "ingress-canary-fzqps" (UID: "9a154f5a-c08f-4f54-b3d7-fea632c012c6") : secret "canary-serving-cert" not found
Apr 23 13:32:03.600472 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.600176 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 13:32:03.600472 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.600220 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert podName:cd536203-7ab7-44ff-86aa-4b70ff820188 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:05.600206069 +0000 UTC m=+36.338750020 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-knl89" (UID: "cd536203-7ab7-44ff-86aa-4b70ff820188") : secret "networking-console-plugin-cert" not found
Apr 23 13:32:03.600472 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.600284 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle podName:20030382-369a-4a7a-bdb3-477e6d873b00 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:05.60027353 +0000 UTC m=+36.338817476 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle") pod "router-default-7b994fd948-vpkgz" (UID: "20030382-369a-4a7a-bdb3-477e6d873b00") : configmap references non-existent config key: service-ca.crt
Apr 23 13:32:03.600472 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.600335 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:03.600472 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.600361 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls podName:5880bb15-7341-40f4-a23b-983d2d71912f nodeName:}" failed. No retries permitted until 2026-04-23 13:32:05.600351477 +0000 UTC m=+36.338895424 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls") pod "dns-default-vf74v" (UID: "5880bb15-7341-40f4-a23b-983d2d71912f") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:03.600472 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.600408 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 13:32:03.600472 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:03.600435 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs podName:20030382-369a-4a7a-bdb3-477e6d873b00 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:05.600425953 +0000 UTC m=+36.338969900 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs") pod "router-default-7b994fd948-vpkgz" (UID: "20030382-369a-4a7a-bdb3-477e6d873b00") : secret "router-metrics-certs-default" not found
Apr 23 13:32:04.027522 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:04.026704 2565 generic.go:358] "Generic (PLEG): container finished" podID="967ed5b3-0337-40d9-872d-aa7a02b7c552" containerID="3dffe0d39308966e0cd9bcc4de15f714a104d271ae9dcb2420b74fce3147d4bd" exitCode=0
Apr 23 13:32:04.027522 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:04.026748 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvwgw" event={"ID":"967ed5b3-0337-40d9-872d-aa7a02b7c552","Type":"ContainerDied","Data":"3dffe0d39308966e0cd9bcc4de15f714a104d271ae9dcb2420b74fce3147d4bd"}
Apr 23 13:32:05.038332 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:05.038292 2565 generic.go:358] "Generic (PLEG): container finished" podID="967ed5b3-0337-40d9-872d-aa7a02b7c552" containerID="d9256d998d1b396b15c494c7f45895424047d2d9c86fb73240b00697feb4eb15" exitCode=0
Apr 23 13:32:05.038992 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:05.038358 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvwgw" event={"ID":"967ed5b3-0337-40d9-872d-aa7a02b7c552","Type":"ContainerDied","Data":"d9256d998d1b396b15c494c7f45895424047d2d9c86fb73240b00697feb4eb15"}
Apr 23 13:32:05.421460 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:05.421380 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrmk8\" (UID: \"e7d94bc3-8733-4bd5-b1de-635975dfe4bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8"
Apr 23 13:32:05.421664 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.421549 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 13:32:05.421664 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.421632 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls podName:e7d94bc3-8733-4bd5-b1de-635975dfe4bd nodeName:}" failed. No retries permitted until 2026-04-23 13:32:09.421609219 +0000 UTC m=+40.160153170 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wrmk8" (UID: "e7d94bc3-8733-4bd5-b1de-635975dfe4bd") : secret "samples-operator-tls" not found
Apr 23 13:32:05.523153 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:05.522416 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:05.523153 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:05.522600 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"
Apr 23 13:32:05.523153 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.522732 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:32:05.523153 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.522811 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls podName:b3e8f6c3-e685-4e07-abe9-e57a6f11b37a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:09.5227927 +0000 UTC m=+40.261336658 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-txxls" (UID: "b3e8f6c3-e685-4e07-abe9-e57a6f11b37a") : secret "cluster-monitoring-operator-tls" not found Apr 23 13:32:05.523153 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.523116 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:32:05.523153 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.523133 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-697697656d-5zdsf: secret "image-registry-tls" not found Apr 23 13:32:05.523586 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.523182 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls podName:e2fed6c1-6174-4b2b-884a-12bca4486716 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:09.523167332 +0000 UTC m=+40.261711286 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls") pod "image-registry-697697656d-5zdsf" (UID: "e2fed6c1-6174-4b2b-884a-12bca4486716") : secret "image-registry-tls" not found Apr 23 13:32:05.623939 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:05.623900 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-knl89\" (UID: \"cd536203-7ab7-44ff-86aa-4b70ff820188\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89" Apr 23 13:32:05.624106 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:05.623980 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz" Apr 23 13:32:05.624106 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:05.624004 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v" Apr 23 13:32:05.624932 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.624428 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 13:32:05.624932 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.624439 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle podName:20030382-369a-4a7a-bdb3-477e6d873b00 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:09.624418018 +0000 UTC m=+40.362961967 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle") pod "router-default-7b994fd948-vpkgz" (UID: "20030382-369a-4a7a-bdb3-477e6d873b00") : configmap references non-existent config key: service-ca.crt Apr 23 13:32:05.624932 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.624537 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert podName:cd536203-7ab7-44ff-86aa-4b70ff820188 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:09.624524783 +0000 UTC m=+40.363068732 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-knl89" (UID: "cd536203-7ab7-44ff-86aa-4b70ff820188") : secret "networking-console-plugin-cert" not found Apr 23 13:32:05.624932 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.624539 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:05.624932 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.624580 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls podName:5880bb15-7341-40f4-a23b-983d2d71912f nodeName:}" failed. No retries permitted until 2026-04-23 13:32:09.62456207 +0000 UTC m=+40.363106021 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls") pod "dns-default-vf74v" (UID: "5880bb15-7341-40f4-a23b-983d2d71912f") : secret "dns-default-metrics-tls" not found Apr 23 13:32:05.624932 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:05.624054 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz" Apr 23 13:32:05.624932 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.624728 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 13:32:05.624932 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:05.624792 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert\") pod \"ingress-canary-fzqps\" (UID: \"9a154f5a-c08f-4f54-b3d7-fea632c012c6\") " pod="openshift-ingress-canary/ingress-canary-fzqps" Apr 23 13:32:05.624932 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.624801 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs podName:20030382-369a-4a7a-bdb3-477e6d873b00 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:09.624787173 +0000 UTC m=+40.363331121 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs") pod "router-default-7b994fd948-vpkgz" (UID: "20030382-369a-4a7a-bdb3-477e6d873b00") : secret "router-metrics-certs-default" not found Apr 23 13:32:05.624932 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.624862 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:05.624932 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:05.624899 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert podName:9a154f5a-c08f-4f54-b3d7-fea632c012c6 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:09.624887316 +0000 UTC m=+40.363431265 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert") pod "ingress-canary-fzqps" (UID: "9a154f5a-c08f-4f54-b3d7-fea632c012c6") : secret "canary-serving-cert" not found Apr 23 13:32:09.461727 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:09.461539 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrmk8\" (UID: \"e7d94bc3-8733-4bd5-b1de-635975dfe4bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8" Apr 23 13:32:09.462187 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.461704 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 13:32:09.462187 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.461848 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls podName:e7d94bc3-8733-4bd5-b1de-635975dfe4bd nodeName:}" failed. No retries permitted until 2026-04-23 13:32:17.461829827 +0000 UTC m=+48.200373775 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wrmk8" (UID: "e7d94bc3-8733-4bd5-b1de-635975dfe4bd") : secret "samples-operator-tls" not found Apr 23 13:32:09.562477 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:09.562444 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf" Apr 23 13:32:09.562713 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.562617 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:32:09.562713 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.562666 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-697697656d-5zdsf: secret "image-registry-tls" not found Apr 23 13:32:09.562713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:09.562666 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls" Apr 23 13:32:09.562902 ip-10-0-137-187 
kubenswrapper[2565]: E0423 13:32:09.562732 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls podName:e2fed6c1-6174-4b2b-884a-12bca4486716 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:17.562708462 +0000 UTC m=+48.301252424 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls") pod "image-registry-697697656d-5zdsf" (UID: "e2fed6c1-6174-4b2b-884a-12bca4486716") : secret "image-registry-tls" not found Apr 23 13:32:09.562902 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.562818 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 13:32:09.562902 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.562868 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls podName:b3e8f6c3-e685-4e07-abe9-e57a6f11b37a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:17.562854853 +0000 UTC m=+48.301398806 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-txxls" (UID: "b3e8f6c3-e685-4e07-abe9-e57a6f11b37a") : secret "cluster-monitoring-operator-tls" not found Apr 23 13:32:09.663661 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:09.663628 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl" Apr 23 13:32:09.663854 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:09.663677 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-knl89\" (UID: \"cd536203-7ab7-44ff-86aa-4b70ff820188\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89" Apr 23 13:32:09.663854 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:09.663839 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz" Apr 23 13:32:09.663934 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:09.663871 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls\") pod \"dns-default-vf74v\" (UID: 
\"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v" Apr 23 13:32:09.663934 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:09.663908 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz" Apr 23 13:32:09.663934 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.663847 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 13:32:09.663934 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:09.663929 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert\") pod \"ingress-canary-fzqps\" (UID: \"9a154f5a-c08f-4f54-b3d7-fea632c012c6\") " pod="openshift-ingress-canary/ingress-canary-fzqps" Apr 23 13:32:09.664115 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.663976 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert podName:cd536203-7ab7-44ff-86aa-4b70ff820188 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:17.663956592 +0000 UTC m=+48.402500536 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-knl89" (UID: "cd536203-7ab7-44ff-86aa-4b70ff820188") : secret "networking-console-plugin-cert" not found Apr 23 13:32:09.664115 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.663995 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle podName:20030382-369a-4a7a-bdb3-477e6d873b00 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:17.663987379 +0000 UTC m=+48.402531322 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle") pod "router-default-7b994fd948-vpkgz" (UID: "20030382-369a-4a7a-bdb3-477e6d873b00") : configmap references non-existent config key: service-ca.crt Apr 23 13:32:09.664115 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.664077 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:09.664115 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.664107 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:09.664290 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.664121 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 13:32:09.664290 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.664127 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert podName:9a154f5a-c08f-4f54-b3d7-fea632c012c6 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:17.664112416 +0000 UTC m=+48.402656379 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert") pod "ingress-canary-fzqps" (UID: "9a154f5a-c08f-4f54-b3d7-fea632c012c6") : secret "canary-serving-cert" not found Apr 23 13:32:09.664290 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.664195 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls podName:5880bb15-7341-40f4-a23b-983d2d71912f nodeName:}" failed. No retries permitted until 2026-04-23 13:32:17.664179035 +0000 UTC m=+48.402722997 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls") pod "dns-default-vf74v" (UID: "5880bb15-7341-40f4-a23b-983d2d71912f") : secret "dns-default-metrics-tls" not found Apr 23 13:32:09.664290 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:09.664212 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs podName:20030382-369a-4a7a-bdb3-477e6d873b00 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:17.664202558 +0000 UTC m=+48.402746509 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs") pod "router-default-7b994fd948-vpkgz" (UID: "20030382-369a-4a7a-bdb3-477e6d873b00") : secret "router-metrics-certs-default" not found Apr 23 13:32:09.667294 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:09.667276 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1344ba49-27f1-41a6-94d2-2e85595b528d-original-pull-secret\") pod \"global-pull-secret-syncer-ptldl\" (UID: \"1344ba49-27f1-41a6-94d2-2e85595b528d\") " pod="kube-system/global-pull-secret-syncer-ptldl" Apr 23 13:32:09.960389 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:09.960370 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ptldl" Apr 23 13:32:10.051522 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:10.051496 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvwgw" event={"ID":"967ed5b3-0337-40d9-872d-aa7a02b7c552","Type":"ContainerStarted","Data":"aaafd95b40e9c4ae95853520b2851832df6f896542b6fa4f1656fc73a0ff02f8"} Apr 23 13:32:10.088009 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:10.087988 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ptldl"] Apr 23 13:32:10.093183 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:10.093143 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mvwgw" podStartSLOduration=8.373680142 podStartE2EDuration="40.093129512s" podCreationTimestamp="2026-04-23 13:31:30 +0000 UTC" firstStartedPulling="2026-04-23 13:31:31.051983942 +0000 UTC m=+1.790527900" lastFinishedPulling="2026-04-23 13:32:02.771433312 +0000 UTC m=+33.509977270" observedRunningTime="2026-04-23 13:32:10.091605589 +0000 UTC 
m=+40.830149555" watchObservedRunningTime="2026-04-23 13:32:10.093129512 +0000 UTC m=+40.831673525" Apr 23 13:32:10.095739 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:10.095713 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1344ba49_27f1_41a6_94d2_2e85595b528d.slice/crio-9c5e7397445a402a4967ac50ad7a86c86575de218b5880f8c1865eb96874feec WatchSource:0}: Error finding container 9c5e7397445a402a4967ac50ad7a86c86575de218b5880f8c1865eb96874feec: Status 404 returned error can't find the container with id 9c5e7397445a402a4967ac50ad7a86c86575de218b5880f8c1865eb96874feec Apr 23 13:32:11.064479 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.064207 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hclwj" event={"ID":"d6413ec2-e315-417e-9b7d-ce057e4f10a3","Type":"ContainerStarted","Data":"569774f5beae01dd69cfec02fd80b872f74abd6b1eef8d42fdc73a04789f50a5"} Apr 23 13:32:11.065024 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.064594 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hclwj" Apr 23 13:32:11.067571 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.066993 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/0.log" Apr 23 13:32:11.067571 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.067028 2565 generic.go:358] "Generic (PLEG): container finished" podID="34a5e8b5-8ca7-40e3-978f-439d854e09b0" containerID="0094236cfc4f32d5204e5cd71a45647792ef1ba9a3a6ea6d092a0c6e6a42cdc8" exitCode=255 Apr 23 13:32:11.067571 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.067151 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8" 
event={"ID":"34a5e8b5-8ca7-40e3-978f-439d854e09b0","Type":"ContainerDied","Data":"0094236cfc4f32d5204e5cd71a45647792ef1ba9a3a6ea6d092a0c6e6a42cdc8"} Apr 23 13:32:11.067571 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.067339 2565 scope.go:117] "RemoveContainer" containerID="0094236cfc4f32d5204e5cd71a45647792ef1ba9a3a6ea6d092a0c6e6a42cdc8" Apr 23 13:32:11.070703 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.070297 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g" event={"ID":"512b3fcf-e8c1-4eb7-b755-9d8efa3083a5","Type":"ContainerStarted","Data":"306d976278ec086844073083ff1522576cca3fde13b4903c6a7ac844b3a13774"} Apr 23 13:32:11.072130 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.072088 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d6r6p" event={"ID":"1dffdca5-c142-4766-a823-9d817e2c5ef5","Type":"ContainerStarted","Data":"ee7c1b4ff5b232a73677bd75558509bb546ebd72a60ef33b3b0a543cfdac314f"} Apr 23 13:32:11.073514 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.073491 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4sh6s" event={"ID":"aebae10b-0ca7-46fd-860e-f45c0c031024","Type":"ContainerStarted","Data":"57a799d9b1281c71c6f705d7cd7aff991599180281cd4494481ae7a231093c74"} Apr 23 13:32:11.074924 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.074877 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ptldl" event={"ID":"1344ba49-27f1-41a6-94d2-2e85595b528d","Type":"ContainerStarted","Data":"9c5e7397445a402a4967ac50ad7a86c86575de218b5880f8c1865eb96874feec"} Apr 23 13:32:11.076456 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.076424 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj" event={"ID":"506e9f6c-b41d-4cad-9333-d952e8630ef9","Type":"ContainerStarted","Data":"581354dd5445c55f510232943ffcac2c3271538a8e0752f728381e8084b85f72"} Apr 23 13:32:11.079094 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.079056 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-c6hg5" event={"ID":"334930fe-79d2-4d7d-9fd2-1c2db1eaf771","Type":"ContainerStarted","Data":"49158717f3939a570bf4a6b848bc2a6174b59676839813904bd2a486a83b4450"} Apr 23 13:32:11.081778 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.081714 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hclwj" podStartSLOduration=35.0311728 podStartE2EDuration="42.081701811s" podCreationTimestamp="2026-04-23 13:31:29 +0000 UTC" firstStartedPulling="2026-04-23 13:32:02.925710821 +0000 UTC m=+33.664254765" lastFinishedPulling="2026-04-23 13:32:09.976239826 +0000 UTC m=+40.714783776" observedRunningTime="2026-04-23 13:32:11.080160937 +0000 UTC m=+41.818704900" watchObservedRunningTime="2026-04-23 13:32:11.081701811 +0000 UTC m=+41.820245779" Apr 23 13:32:11.101911 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.096640 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj" podStartSLOduration=26.879680693 podStartE2EDuration="34.096627912s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:32:02.748445827 +0000 UTC m=+33.486989774" lastFinishedPulling="2026-04-23 13:32:09.965393032 +0000 UTC m=+40.703936993" observedRunningTime="2026-04-23 13:32:11.096113857 +0000 UTC m=+41.834657824" watchObservedRunningTime="2026-04-23 13:32:11.096627912 +0000 UTC m=+41.835171880" Apr 23 13:32:11.145192 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.145033 2565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d6r6p" podStartSLOduration=26.928396314 podStartE2EDuration="34.145015121s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:32:02.748327738 +0000 UTC m=+33.486871696" lastFinishedPulling="2026-04-23 13:32:09.964946544 +0000 UTC m=+40.703490503" observedRunningTime="2026-04-23 13:32:11.128124497 +0000 UTC m=+41.866668464" watchObservedRunningTime="2026-04-23 13:32:11.145015121 +0000 UTC m=+41.883559088"
Apr 23 13:32:11.164910 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.164860 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4sh6s" podStartSLOduration=26.948057191 podStartE2EDuration="34.164846282s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:32:02.748062297 +0000 UTC m=+33.486606241" lastFinishedPulling="2026-04-23 13:32:09.964851381 +0000 UTC m=+40.703395332" observedRunningTime="2026-04-23 13:32:11.16416938 +0000 UTC m=+41.902713346" watchObservedRunningTime="2026-04-23 13:32:11.164846282 +0000 UTC m=+41.903390245"
Apr 23 13:32:11.181481 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.179782 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g" podStartSLOduration=27.013885593 podStartE2EDuration="34.179742306s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:32:02.748425897 +0000 UTC m=+33.486969844" lastFinishedPulling="2026-04-23 13:32:09.914282599 +0000 UTC m=+40.652826557" observedRunningTime="2026-04-23 13:32:11.179115434 +0000 UTC m=+41.917659620" watchObservedRunningTime="2026-04-23 13:32:11.179742306 +0000 UTC m=+41.918286274"
Apr 23 13:32:11.467501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.467445 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-c6hg5" podStartSLOduration=27.250571897 podStartE2EDuration="34.467424606s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:32:02.748098684 +0000 UTC m=+33.486642643" lastFinishedPulling="2026-04-23 13:32:09.964951393 +0000 UTC m=+40.703495352" observedRunningTime="2026-04-23 13:32:11.201516862 +0000 UTC m=+41.940060831" watchObservedRunningTime="2026-04-23 13:32:11.467424606 +0000 UTC m=+42.205968572"
Apr 23 13:32:11.467856 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.467636 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-fmvmb"]
Apr 23 13:32:11.492917 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.492409 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-fmvmb"]
Apr 23 13:32:11.492917 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.492548 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fmvmb"
Apr 23 13:32:11.495749 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.495569 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-nbwnh\""
Apr 23 13:32:11.495749 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.495616 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 23 13:32:11.495941 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.495572 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 23 13:32:11.583820 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.583781 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hkxp\" (UniqueName: \"kubernetes.io/projected/8ca1b051-dd5d-4c97-9809-95d139a9d692-kube-api-access-2hkxp\") pod \"migrator-74bb7799d9-fmvmb\" (UID: \"8ca1b051-dd5d-4c97-9809-95d139a9d692\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fmvmb"
Apr 23 13:32:11.684928 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.684894 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hkxp\" (UniqueName: \"kubernetes.io/projected/8ca1b051-dd5d-4c97-9809-95d139a9d692-kube-api-access-2hkxp\") pod \"migrator-74bb7799d9-fmvmb\" (UID: \"8ca1b051-dd5d-4c97-9809-95d139a9d692\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fmvmb"
Apr 23 13:32:11.694957 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.694904 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hkxp\" (UniqueName: \"kubernetes.io/projected/8ca1b051-dd5d-4c97-9809-95d139a9d692-kube-api-access-2hkxp\") pod \"migrator-74bb7799d9-fmvmb\" (UID: \"8ca1b051-dd5d-4c97-9809-95d139a9d692\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fmvmb"
Apr 23 13:32:11.805869 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.805794 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fmvmb"
Apr 23 13:32:11.981609 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.981482 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-fmvmb"]
Apr 23 13:32:11.981794 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.981661 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8"
Apr 23 13:32:11.981794 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:11.981687 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8"
Apr 23 13:32:11.984996 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:11.984489 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ca1b051_dd5d_4c97_9809_95d139a9d692.slice/crio-484b1c0cf39e2f0e88b89f685951c96179785e0ab9fe4a182983bd4e417741c7 WatchSource:0}: Error finding container 484b1c0cf39e2f0e88b89f685951c96179785e0ab9fe4a182983bd4e417741c7: Status 404 returned error can't find the container with id 484b1c0cf39e2f0e88b89f685951c96179785e0ab9fe4a182983bd4e417741c7
Apr 23 13:32:12.083498 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:12.083403 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fmvmb" event={"ID":"8ca1b051-dd5d-4c97-9809-95d139a9d692","Type":"ContainerStarted","Data":"484b1c0cf39e2f0e88b89f685951c96179785e0ab9fe4a182983bd4e417741c7"}
Apr 23 13:32:12.085176 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:12.085146 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log"
Apr 23 13:32:12.085777 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:12.085736 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/0.log"
Apr 23 13:32:12.085967 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:12.085794 2565 generic.go:358] "Generic (PLEG): container finished" podID="34a5e8b5-8ca7-40e3-978f-439d854e09b0" containerID="9f90c3e6b9979ff2f3127d2d6719f28ba04bb1058e436c0eeb5927aca9dfaf6d" exitCode=255
Apr 23 13:32:12.085967 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:12.085924 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8" event={"ID":"34a5e8b5-8ca7-40e3-978f-439d854e09b0","Type":"ContainerDied","Data":"9f90c3e6b9979ff2f3127d2d6719f28ba04bb1058e436c0eeb5927aca9dfaf6d"}
Apr 23 13:32:12.085967 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:12.085954 2565 scope.go:117] "RemoveContainer" containerID="0094236cfc4f32d5204e5cd71a45647792ef1ba9a3a6ea6d092a0c6e6a42cdc8"
Apr 23 13:32:12.086272 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:12.086164 2565 scope.go:117] "RemoveContainer" containerID="9f90c3e6b9979ff2f3127d2d6719f28ba04bb1058e436c0eeb5927aca9dfaf6d"
Apr 23 13:32:12.086583 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:12.086384 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-rtfn8_openshift-console-operator(34a5e8b5-8ca7-40e3-978f-439d854e09b0)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8" podUID="34a5e8b5-8ca7-40e3-978f-439d854e09b0"
Apr 23 13:32:13.094153 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:13.094122 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log"
Apr 23 13:32:13.094614 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:13.094543 2565 scope.go:117] "RemoveContainer" containerID="9f90c3e6b9979ff2f3127d2d6719f28ba04bb1058e436c0eeb5927aca9dfaf6d"
Apr 23 13:32:13.094796 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:13.094754 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-rtfn8_openshift-console-operator(34a5e8b5-8ca7-40e3-978f-439d854e09b0)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8" podUID="34a5e8b5-8ca7-40e3-978f-439d854e09b0"
Apr 23 13:32:14.533373 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:14.533348 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w5s22_4976cf12-11ff-427a-a58d-9f126da4f625/dns-node-resolver/0.log"
Apr 23 13:32:15.101368 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:15.101325 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fmvmb" event={"ID":"8ca1b051-dd5d-4c97-9809-95d139a9d692","Type":"ContainerStarted","Data":"8c58b5bf5036f7f3dbf5485344d489062eca0801cf3c91ca19193e0256ee286c"}
Apr 23 13:32:15.535133 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:15.535104 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rjv7k_3a6f5afc-ae97-4be4-ad1c-c3af1a35a586/node-ca/0.log"
Apr 23 13:32:16.105962 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:16.105925 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ptldl" event={"ID":"1344ba49-27f1-41a6-94d2-2e85595b528d","Type":"ContainerStarted","Data":"08e011229ddc28916f443094b89cc3b3355ce71634f7e92b7476e3038d8ceccd"}
Apr 23 13:32:16.107533 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:16.107507 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fmvmb" event={"ID":"8ca1b051-dd5d-4c97-9809-95d139a9d692","Type":"ContainerStarted","Data":"ed183658e4f2510c4ebccc929a332b0a4ec56685c4fc27fcef4345d2e4d88242"}
Apr 23 13:32:16.123652 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:16.123604 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ptldl" podStartSLOduration=34.256804728 podStartE2EDuration="39.123591055s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:32:10.097412345 +0000 UTC m=+40.835956290" lastFinishedPulling="2026-04-23 13:32:14.96419866 +0000 UTC m=+45.702742617" observedRunningTime="2026-04-23 13:32:16.123218747 +0000 UTC m=+46.861762713" watchObservedRunningTime="2026-04-23 13:32:16.123591055 +0000 UTC m=+46.862135022"
Apr 23 13:32:16.147047 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:16.147004 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fmvmb" podStartSLOduration=2.183459828 podStartE2EDuration="5.146990555s" podCreationTimestamp="2026-04-23 13:32:11 +0000 UTC" firstStartedPulling="2026-04-23 13:32:11.989046002 +0000 UTC m=+42.727589949" lastFinishedPulling="2026-04-23 13:32:14.952576728 +0000 UTC m=+45.691120676" observedRunningTime="2026-04-23 13:32:16.14674261 +0000 UTC m=+46.885286576" watchObservedRunningTime="2026-04-23 13:32:16.146990555 +0000 UTC m=+46.885534520"
Apr 23 13:32:16.336096 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:16.336042 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-fmvmb_8ca1b051-dd5d-4c97-9809-95d139a9d692/migrator/0.log"
Apr 23 13:32:16.538546 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:16.538517 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-fmvmb_8ca1b051-dd5d-4c97-9809-95d139a9d692/graceful-termination/0.log"
Apr 23 13:32:16.735434 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:16.735401 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-7xk4g_512b3fcf-e8c1-4eb7-b755-9d8efa3083a5/kube-storage-version-migrator-operator/0.log"
Apr 23 13:32:17.537640 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:17.537606 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrmk8\" (UID: \"e7d94bc3-8733-4bd5-b1de-635975dfe4bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8"
Apr 23 13:32:17.537826 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.537745 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 13:32:17.537826 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.537821 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls podName:e7d94bc3-8733-4bd5-b1de-635975dfe4bd nodeName:}" failed. No retries permitted until 2026-04-23 13:32:33.537804992 +0000 UTC m=+64.276348942 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wrmk8" (UID: "e7d94bc3-8733-4bd5-b1de-635975dfe4bd") : secret "samples-operator-tls" not found
Apr 23 13:32:17.638233 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:17.638198 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"
Apr 23 13:32:17.638585 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:17.638261 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:17.638585 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.638348 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:32:17.638585 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.638411 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls podName:b3e8f6c3-e685-4e07-abe9-e57a6f11b37a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:33.638396727 +0000 UTC m=+64.376940671 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-txxls" (UID: "b3e8f6c3-e685-4e07-abe9-e57a6f11b37a") : secret "cluster-monitoring-operator-tls" not found
Apr 23 13:32:17.638585 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.638356 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:32:17.638585 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.638448 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-697697656d-5zdsf: secret "image-registry-tls" not found
Apr 23 13:32:17.638794 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.638536 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls podName:e2fed6c1-6174-4b2b-884a-12bca4486716 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:33.638520125 +0000 UTC m=+64.377064092 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls") pod "image-registry-697697656d-5zdsf" (UID: "e2fed6c1-6174-4b2b-884a-12bca4486716") : secret "image-registry-tls" not found
Apr 23 13:32:17.738959 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:17.738922 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:17.738959 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:17.738962 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert\") pod \"ingress-canary-fzqps\" (UID: \"9a154f5a-c08f-4f54-b3d7-fea632c012c6\") " pod="openshift-ingress-canary/ingress-canary-fzqps"
Apr 23 13:32:17.739145 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:17.739036 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-knl89\" (UID: \"cd536203-7ab7-44ff-86aa-4b70ff820188\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89"
Apr 23 13:32:17.739145 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.739054 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 13:32:17.739145 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:17.739082 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:17.739145 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.739113 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs podName:20030382-369a-4a7a-bdb3-477e6d873b00 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:33.739098329 +0000 UTC m=+64.477642278 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs") pod "router-default-7b994fd948-vpkgz" (UID: "20030382-369a-4a7a-bdb3-477e6d873b00") : secret "router-metrics-certs-default" not found
Apr 23 13:32:17.739145 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:17.739137 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v"
Apr 23 13:32:17.739326 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.739142 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:17.739326 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.739174 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle podName:20030382-369a-4a7a-bdb3-477e6d873b00 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:33.739160285 +0000 UTC m=+64.477704242 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle") pod "router-default-7b994fd948-vpkgz" (UID: "20030382-369a-4a7a-bdb3-477e6d873b00") : configmap references non-existent config key: service-ca.crt
Apr 23 13:32:17.739326 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.739214 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 13:32:17.739326 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.739233 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert podName:9a154f5a-c08f-4f54-b3d7-fea632c012c6 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:33.73922234 +0000 UTC m=+64.477766287 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert") pod "ingress-canary-fzqps" (UID: "9a154f5a-c08f-4f54-b3d7-fea632c012c6") : secret "canary-serving-cert" not found
Apr 23 13:32:17.739326 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.739234 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:17.739326 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.739248 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert podName:cd536203-7ab7-44ff-86aa-4b70ff820188 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:33.739239891 +0000 UTC m=+64.477783836 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-knl89" (UID: "cd536203-7ab7-44ff-86aa-4b70ff820188") : secret "networking-console-plugin-cert" not found
Apr 23 13:32:17.739326 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:17.739282 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls podName:5880bb15-7341-40f4-a23b-983d2d71912f nodeName:}" failed. No retries permitted until 2026-04-23 13:32:33.739271336 +0000 UTC m=+64.477815280 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls") pod "dns-default-vf74v" (UID: "5880bb15-7341-40f4-a23b-983d2d71912f") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:21.981940 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:21.981907 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8"
Apr 23 13:32:21.981940 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:21.981945 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8"
Apr 23 13:32:21.982454 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:21.982269 2565 scope.go:117] "RemoveContainer" containerID="9f90c3e6b9979ff2f3127d2d6719f28ba04bb1058e436c0eeb5927aca9dfaf6d"
Apr 23 13:32:22.126281 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:22.126251 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log"
Apr 23 13:32:22.126453 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:22.126351 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8" event={"ID":"34a5e8b5-8ca7-40e3-978f-439d854e09b0","Type":"ContainerStarted","Data":"fd4a9abadab6832da6a66bd9eec62b24ba38425cad709d54ac4add533c389625"}
Apr 23 13:32:22.126706 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:22.126685 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8"
Apr 23 13:32:22.145720 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:22.145676 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8" podStartSLOduration=37.941395433 podStartE2EDuration="45.145663598s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:32:02.748119466 +0000 UTC m=+33.486663424" lastFinishedPulling="2026-04-23 13:32:09.952387645 +0000 UTC m=+40.690931589" observedRunningTime="2026-04-23 13:32:22.144819856 +0000 UTC m=+52.883363828" watchObservedRunningTime="2026-04-23 13:32:22.145663598 +0000 UTC m=+52.884207592"
Apr 23 13:32:22.300113 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:22.300030 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-rtfn8"
Apr 23 13:32:27.995829 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:27.995796 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vxhp2"
Apr 23 13:32:33.571596 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.571560 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrmk8\" (UID: \"e7d94bc3-8733-4bd5-b1de-635975dfe4bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8"
Apr 23 13:32:33.573908 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.573888 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7d94bc3-8733-4bd5-b1de-635975dfe4bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrmk8\" (UID: \"e7d94bc3-8733-4bd5-b1de-635975dfe4bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8"
Apr 23 13:32:33.672177 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.672135 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"
Apr 23 13:32:33.672344 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.672214 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:33.674526 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.674495 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3e8f6c3-e685-4e07-abe9-e57a6f11b37a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-txxls\" (UID: \"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"
Apr 23 13:32:33.674643 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.674627 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls\") pod \"image-registry-697697656d-5zdsf\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:33.763102 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.763069 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-7pnvt\""
Apr 23 13:32:33.770630 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.770608 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8"
Apr 23 13:32:33.773570 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.773542 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:33.773632 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.773583 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert\") pod \"ingress-canary-fzqps\" (UID: \"9a154f5a-c08f-4f54-b3d7-fea632c012c6\") " pod="openshift-ingress-canary/ingress-canary-fzqps"
Apr 23 13:32:33.773684 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.773668 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-knl89\" (UID: \"cd536203-7ab7-44ff-86aa-4b70ff820188\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89"
Apr 23 13:32:33.773738 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.773724 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:33.773833 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.773804 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v"
Apr 23 13:32:33.774399 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.774372 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20030382-369a-4a7a-bdb3-477e6d873b00-service-ca-bundle\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:33.776207 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.776184 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a154f5a-c08f-4f54-b3d7-fea632c012c6-cert\") pod \"ingress-canary-fzqps\" (UID: \"9a154f5a-c08f-4f54-b3d7-fea632c012c6\") " pod="openshift-ingress-canary/ingress-canary-fzqps"
Apr 23 13:32:33.776289 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.776209 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20030382-369a-4a7a-bdb3-477e6d873b00-metrics-certs\") pod \"router-default-7b994fd948-vpkgz\" (UID: \"20030382-369a-4a7a-bdb3-477e6d873b00\") " pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:33.776386 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.776366 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5880bb15-7341-40f4-a23b-983d2d71912f-metrics-tls\") pod \"dns-default-vf74v\" (UID: \"5880bb15-7341-40f4-a23b-983d2d71912f\") " pod="openshift-dns/dns-default-vf74v"
Apr 23 13:32:33.776511 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.776490 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd536203-7ab7-44ff-86aa-4b70ff820188-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-knl89\" (UID: \"cd536203-7ab7-44ff-86aa-4b70ff820188\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89"
Apr 23 13:32:33.857018 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.856994 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-cq79g\""
Apr 23 13:32:33.864364 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.864342 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697697656d-5zdsf"
Apr 23 13:32:33.874753 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.874726 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-ftzcm\""
Apr 23 13:32:33.882090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.882067 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"
Apr 23 13:32:33.888807 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.888701 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8"]
Apr 23 13:32:33.930543 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.930519 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-qt8c2\""
Apr 23 13:32:33.937664 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.937638 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7b994fd948-vpkgz"
Apr 23 13:32:33.945626 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.945606 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-7dv25\""
Apr 23 13:32:33.953794 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.953743 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89"
Apr 23 13:32:33.983787 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.981927 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4t8gv\""
Apr 23 13:32:33.987376 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:33.987354 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vf74v"
Apr 23 13:32:34.003832 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.003779 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697697656d-5zdsf"]
Apr 23 13:32:34.007455 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:34.007419 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2fed6c1_6174_4b2b_884a_12bca4486716.slice/crio-973bc633e0e6857ca4cfaa8ce7e341ede173eae3288f1a7dfddc08f6c8895bc1 WatchSource:0}: Error finding container 973bc633e0e6857ca4cfaa8ce7e341ede173eae3288f1a7dfddc08f6c8895bc1: Status 404 returned error can't find the container with id 973bc633e0e6857ca4cfaa8ce7e341ede173eae3288f1a7dfddc08f6c8895bc1
Apr 23 13:32:34.012347 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.011690 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4c4k9\""
Apr 23 13:32:34.018850 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.018380 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fzqps" Apr 23 13:32:34.028021 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.027965 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls"] Apr 23 13:32:34.034872 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:34.034839 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3e8f6c3_e685_4e07_abe9_e57a6f11b37a.slice/crio-b6bf344dd4f2aefa2c20b25879fadb442155e60ab90edc828050993020f12566 WatchSource:0}: Error finding container b6bf344dd4f2aefa2c20b25879fadb442155e60ab90edc828050993020f12566: Status 404 returned error can't find the container with id b6bf344dd4f2aefa2c20b25879fadb442155e60ab90edc828050993020f12566 Apr 23 13:32:34.106496 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.105184 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-knl89"] Apr 23 13:32:34.159807 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.159750 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697697656d-5zdsf" event={"ID":"e2fed6c1-6174-4b2b-884a-12bca4486716","Type":"ContainerStarted","Data":"7a3e48812e5215b271bb39481f8f5696cca9a5a7d4be7ecc69a820be6faf9f0d"} Apr 23 13:32:34.159930 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.159813 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697697656d-5zdsf" event={"ID":"e2fed6c1-6174-4b2b-884a-12bca4486716","Type":"ContainerStarted","Data":"973bc633e0e6857ca4cfaa8ce7e341ede173eae3288f1a7dfddc08f6c8895bc1"} Apr 23 13:32:34.159930 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.159870 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-697697656d-5zdsf" Apr 23 
13:32:34.161051 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.161028 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89" event={"ID":"cd536203-7ab7-44ff-86aa-4b70ff820188","Type":"ContainerStarted","Data":"f04e38ecc7cbe5197b8c035356d84f41f36e08eb24017c4191d48340a0dd1f77"} Apr 23 13:32:34.162236 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.162189 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls" event={"ID":"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a","Type":"ContainerStarted","Data":"b6bf344dd4f2aefa2c20b25879fadb442155e60ab90edc828050993020f12566"} Apr 23 13:32:34.163917 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.163891 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8" event={"ID":"e7d94bc3-8733-4bd5-b1de-635975dfe4bd","Type":"ContainerStarted","Data":"a0c3cedd84891315bab61dfa801f887b58b7e9442e3d5562c7d58947a024fbb0"} Apr 23 13:32:34.164613 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.164074 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7b994fd948-vpkgz"] Apr 23 13:32:34.166998 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.166974 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vf74v"] Apr 23 13:32:34.167281 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:34.167241 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20030382_369a_4a7a_bdb3_477e6d873b00.slice/crio-824594d27d105359e8c874622f217cfef037651b7701ccdcc657bc07b6a78890 WatchSource:0}: Error finding container 824594d27d105359e8c874622f217cfef037651b7701ccdcc657bc07b6a78890: Status 404 returned error can't find the container with id 
824594d27d105359e8c874622f217cfef037651b7701ccdcc657bc07b6a78890 Apr 23 13:32:34.169463 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:34.169440 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5880bb15_7341_40f4_a23b_983d2d71912f.slice/crio-1c6c1d84ad8c9cf60c0651b0aac8a7f2fb275627ff25e8f6a78d45a1bab8caae WatchSource:0}: Error finding container 1c6c1d84ad8c9cf60c0651b0aac8a7f2fb275627ff25e8f6a78d45a1bab8caae: Status 404 returned error can't find the container with id 1c6c1d84ad8c9cf60c0651b0aac8a7f2fb275627ff25e8f6a78d45a1bab8caae Apr 23 13:32:34.182045 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.182008 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697697656d-5zdsf" podStartSLOduration=57.181994475 podStartE2EDuration="57.181994475s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:34.181201183 +0000 UTC m=+64.919745148" watchObservedRunningTime="2026-04-23 13:32:34.181994475 +0000 UTC m=+64.920538440" Apr 23 13:32:34.200827 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.200802 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fzqps"] Apr 23 13:32:34.207166 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:34.207140 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a154f5a_c08f_4f54_b3d7_fea632c012c6.slice/crio-b36c68d9ecc33bf7fd430840b360883f169f0c900132762e924c2058fb88238c WatchSource:0}: Error finding container b36c68d9ecc33bf7fd430840b360883f169f0c900132762e924c2058fb88238c: Status 404 returned error can't find the container with id b36c68d9ecc33bf7fd430840b360883f169f0c900132762e924c2058fb88238c Apr 23 13:32:34.581452 
ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.581411 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs\") pod \"network-metrics-daemon-gdstf\" (UID: \"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d\") " pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:32:34.584238 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.584212 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d6b50d4-32de-4031-b4e3-a88d3ce08d4d-metrics-certs\") pod \"network-metrics-daemon-gdstf\" (UID: \"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d\") " pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:32:34.870609 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.870361 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfgql\"" Apr 23 13:32:34.878212 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:34.877810 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdstf" Apr 23 13:32:35.055428 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:35.055169 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gdstf"] Apr 23 13:32:35.174231 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:35.174172 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gdstf" event={"ID":"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d","Type":"ContainerStarted","Data":"f9ddceaf1386d0775030973df2441308956127059e5138b2b7c49a89d2aa2481"} Apr 23 13:32:35.175895 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:35.175863 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fzqps" event={"ID":"9a154f5a-c08f-4f54-b3d7-fea632c012c6","Type":"ContainerStarted","Data":"b36c68d9ecc33bf7fd430840b360883f169f0c900132762e924c2058fb88238c"} Apr 23 13:32:35.177245 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:35.177178 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vf74v" event={"ID":"5880bb15-7341-40f4-a23b-983d2d71912f","Type":"ContainerStarted","Data":"1c6c1d84ad8c9cf60c0651b0aac8a7f2fb275627ff25e8f6a78d45a1bab8caae"} Apr 23 13:32:35.181244 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:35.181172 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b994fd948-vpkgz" event={"ID":"20030382-369a-4a7a-bdb3-477e6d873b00","Type":"ContainerStarted","Data":"a03668359c4ba8934e1609b8312551e9a3d1165272eb6ab184398ac0b12b992e"} Apr 23 13:32:35.181244 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:35.181198 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b994fd948-vpkgz" event={"ID":"20030382-369a-4a7a-bdb3-477e6d873b00","Type":"ContainerStarted","Data":"824594d27d105359e8c874622f217cfef037651b7701ccdcc657bc07b6a78890"} Apr 23 13:32:35.202224 
ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:35.201960 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7b994fd948-vpkgz" podStartSLOduration=58.201945106 podStartE2EDuration="58.201945106s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:35.200229895 +0000 UTC m=+65.938773862" watchObservedRunningTime="2026-04-23 13:32:35.201945106 +0000 UTC m=+65.940489074" Apr 23 13:32:35.938275 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:35.938236 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7b994fd948-vpkgz" Apr 23 13:32:35.941210 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:35.941185 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7b994fd948-vpkgz" Apr 23 13:32:36.182738 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:36.182708 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7b994fd948-vpkgz" Apr 23 13:32:36.184108 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:36.184085 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7b994fd948-vpkgz" Apr 23 13:32:38.703078 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.703016 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697697656d-5zdsf"] Apr 23 13:32:38.766063 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.766028 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9s6t5"] Apr 23 13:32:38.785717 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.785690 2565 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-78d5f7b556-pkxnd"] Apr 23 13:32:38.785925 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.785906 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:38.788871 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.788803 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-785x5\"" Apr 23 13:32:38.788871 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.788816 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 13:32:38.788871 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.788815 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 13:32:38.801131 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.801062 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:38.801713 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.801683 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9s6t5"] Apr 23 13:32:38.801830 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.801729 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-78d5f7b556-pkxnd"] Apr 23 13:32:38.921262 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.921223 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqx8c\" (UniqueName: \"kubernetes.io/projected/3d29bc7d-2252-485d-b149-18b806d21365-kube-api-access-jqx8c\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:38.921439 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.921277 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3223e65f-a60d-40dc-895a-90af469a9129-crio-socket\") pod \"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:38.921439 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.921310 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d29bc7d-2252-485d-b149-18b806d21365-registry-tls\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:38.921439 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.921339 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d29bc7d-2252-485d-b149-18b806d21365-registry-certificates\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:38.921439 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.921375 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d29bc7d-2252-485d-b149-18b806d21365-bound-sa-token\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:38.921439 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.921423 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d29bc7d-2252-485d-b149-18b806d21365-trusted-ca\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:38.921662 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.921455 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z28s\" (UniqueName: \"kubernetes.io/projected/3223e65f-a60d-40dc-895a-90af469a9129-kube-api-access-2z28s\") pod \"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:38.921662 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.921486 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3223e65f-a60d-40dc-895a-90af469a9129-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:38.921662 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.921523 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d29bc7d-2252-485d-b149-18b806d21365-ca-trust-extracted\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:38.921662 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.921551 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d29bc7d-2252-485d-b149-18b806d21365-image-registry-private-configuration\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:38.921662 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.921579 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3223e65f-a60d-40dc-895a-90af469a9129-data-volume\") pod \"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:38.921662 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.921607 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d29bc7d-2252-485d-b149-18b806d21365-installation-pull-secrets\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 
23 13:32:38.921662 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:38.921642 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3223e65f-a60d-40dc-895a-90af469a9129-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:39.022143 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.022109 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d29bc7d-2252-485d-b149-18b806d21365-bound-sa-token\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.022266 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.022169 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d29bc7d-2252-485d-b149-18b806d21365-trusted-ca\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.022266 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.022201 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z28s\" (UniqueName: \"kubernetes.io/projected/3223e65f-a60d-40dc-895a-90af469a9129-kube-api-access-2z28s\") pod \"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:39.022266 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.022231 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/3223e65f-a60d-40dc-895a-90af469a9129-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:39.022266 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.022264 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d29bc7d-2252-485d-b149-18b806d21365-ca-trust-extracted\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.022440 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.022293 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d29bc7d-2252-485d-b149-18b806d21365-image-registry-private-configuration\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.022440 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.022321 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3223e65f-a60d-40dc-895a-90af469a9129-data-volume\") pod \"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:39.022440 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.022343 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d29bc7d-2252-485d-b149-18b806d21365-installation-pull-secrets\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " 
pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.022440 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.022379 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3223e65f-a60d-40dc-895a-90af469a9129-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:39.022440 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.022411 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqx8c\" (UniqueName: \"kubernetes.io/projected/3d29bc7d-2252-485d-b149-18b806d21365-kube-api-access-jqx8c\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.022651 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.022446 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3223e65f-a60d-40dc-895a-90af469a9129-crio-socket\") pod \"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:39.022651 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.022468 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d29bc7d-2252-485d-b149-18b806d21365-registry-tls\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.024354 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.022825 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/3d29bc7d-2252-485d-b149-18b806d21365-registry-certificates\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.024354 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.023147 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3223e65f-a60d-40dc-895a-90af469a9129-crio-socket\") pod \"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:39.024354 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.023198 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3223e65f-a60d-40dc-895a-90af469a9129-data-volume\") pod \"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:39.024354 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.023589 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d29bc7d-2252-485d-b149-18b806d21365-ca-trust-extracted\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.024354 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.023615 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3223e65f-a60d-40dc-895a-90af469a9129-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:39.026537 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.025807 
2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d29bc7d-2252-485d-b149-18b806d21365-trusted-ca\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.028986 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.027741 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3223e65f-a60d-40dc-895a-90af469a9129-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:39.028986 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.027822 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d29bc7d-2252-485d-b149-18b806d21365-registry-tls\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.028986 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.027883 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d29bc7d-2252-485d-b149-18b806d21365-registry-certificates\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.028986 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.028708 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d29bc7d-2252-485d-b149-18b806d21365-installation-pull-secrets\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: 
\"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.029401 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.029378 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d29bc7d-2252-485d-b149-18b806d21365-image-registry-private-configuration\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.032879 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.032854 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d29bc7d-2252-485d-b149-18b806d21365-bound-sa-token\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.033080 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.033060 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z28s\" (UniqueName: \"kubernetes.io/projected/3223e65f-a60d-40dc-895a-90af469a9129-kube-api-access-2z28s\") pod \"insights-runtime-extractor-9s6t5\" (UID: \"3223e65f-a60d-40dc-895a-90af469a9129\") " pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:39.034740 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.034721 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqx8c\" (UniqueName: \"kubernetes.io/projected/3d29bc7d-2252-485d-b149-18b806d21365-kube-api-access-jqx8c\") pod \"image-registry-78d5f7b556-pkxnd\" (UID: \"3d29bc7d-2252-485d-b149-18b806d21365\") " pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.096960 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.096933 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9s6t5" Apr 23 13:32:39.111839 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.111818 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:39.194718 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.194658 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fzqps" event={"ID":"9a154f5a-c08f-4f54-b3d7-fea632c012c6","Type":"ContainerStarted","Data":"13388998077d833c67d439ae451f19937ed4099d7ffe1421c69b53349a3f2ffe"} Apr 23 13:32:39.301940 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.301844 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9s6t5"] Apr 23 13:32:39.305227 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:39.305181 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3223e65f_a60d_40dc_895a_90af469a9129.slice/crio-d73b90083c5df8dda08917d9a19f59801dfee61efdff0551d8be4b1f4dd706d1 WatchSource:0}: Error finding container d73b90083c5df8dda08917d9a19f59801dfee61efdff0551d8be4b1f4dd706d1: Status 404 returned error can't find the container with id d73b90083c5df8dda08917d9a19f59801dfee61efdff0551d8be4b1f4dd706d1 Apr 23 13:32:39.310278 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:39.309119 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-78d5f7b556-pkxnd"] Apr 23 13:32:39.312498 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:39.312409 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d29bc7d_2252_485d_b149_18b806d21365.slice/crio-a4c93a3dc7c1b8cdd998e5fc601bba48799156e260cbd8d6b996d4398911c8a6 WatchSource:0}: Error finding container 
a4c93a3dc7c1b8cdd998e5fc601bba48799156e260cbd8d6b996d4398911c8a6: Status 404 returned error can't find the container with id a4c93a3dc7c1b8cdd998e5fc601bba48799156e260cbd8d6b996d4398911c8a6 Apr 23 13:32:40.199588 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.199550 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9s6t5" event={"ID":"3223e65f-a60d-40dc-895a-90af469a9129","Type":"ContainerStarted","Data":"48bc5a49bd2c9b9f6cf216bc0a8548d605bcae8f90ea9574a651a73dde90d146"} Apr 23 13:32:40.199588 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.199589 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9s6t5" event={"ID":"3223e65f-a60d-40dc-895a-90af469a9129","Type":"ContainerStarted","Data":"d73b90083c5df8dda08917d9a19f59801dfee61efdff0551d8be4b1f4dd706d1"} Apr 23 13:32:40.200922 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.200894 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls" event={"ID":"b3e8f6c3-e685-4e07-abe9-e57a6f11b37a","Type":"ContainerStarted","Data":"6f8818af59d99fdb53e9470573593672f8413ce28972ec0bf089e817dd484e3a"} Apr 23 13:32:40.202460 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.202432 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vf74v" event={"ID":"5880bb15-7341-40f4-a23b-983d2d71912f","Type":"ContainerStarted","Data":"87adae65695575dc4daff3999a7c93a147df9a6d4b210c8a0088659018681d1f"} Apr 23 13:32:40.202529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.202469 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vf74v" event={"ID":"5880bb15-7341-40f4-a23b-983d2d71912f","Type":"ContainerStarted","Data":"a8263fdc97ba58b9b7a0fd54e006cc4bab0348656279632e97197874a3b936f0"} Apr 23 13:32:40.202574 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.202542 2565 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vf74v" Apr 23 13:32:40.203670 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.203650 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89" event={"ID":"cd536203-7ab7-44ff-86aa-4b70ff820188","Type":"ContainerStarted","Data":"b25e5d7ac0cff025ddaa41f666753032d7644865ba22318dbeb8a7d10d32e15e"} Apr 23 13:32:40.205190 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.205169 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8" event={"ID":"e7d94bc3-8733-4bd5-b1de-635975dfe4bd","Type":"ContainerStarted","Data":"4e7de21c278e5f64898c20e64afa08f274850e052425d8fda1ebac0cad8f7e17"} Apr 23 13:32:40.205278 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.205194 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8" event={"ID":"e7d94bc3-8733-4bd5-b1de-635975dfe4bd","Type":"ContainerStarted","Data":"ef1f21418a883a62156c36f086d828bdbe89aeb99f5eb80e5560e667ae8d5cf0"} Apr 23 13:32:40.206673 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.206650 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gdstf" event={"ID":"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d","Type":"ContainerStarted","Data":"1f85201fd8500911082baf1c4294bd5e5e2a6dfc8100856f1061ca65e22474d4"} Apr 23 13:32:40.206783 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.206676 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gdstf" event={"ID":"6d6b50d4-32de-4031-b4e3-a88d3ce08d4d","Type":"ContainerStarted","Data":"bf387ca48ed8b35e0cb7db01ef69486c1378c0ecf82e0fc95f60a46a7a2ddd32"} Apr 23 13:32:40.207877 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.207854 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" event={"ID":"3d29bc7d-2252-485d-b149-18b806d21365","Type":"ContainerStarted","Data":"69b6fc3409a5c7a4963be45045933cfbef83fad7399bb1ed2b8cb4e0caba0176"} Apr 23 13:32:40.207945 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.207886 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" event={"ID":"3d29bc7d-2252-485d-b149-18b806d21365","Type":"ContainerStarted","Data":"a4c93a3dc7c1b8cdd998e5fc601bba48799156e260cbd8d6b996d4398911c8a6"} Apr 23 13:32:40.208000 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.207988 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:32:40.267230 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.267182 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-txxls" podStartSLOduration=58.286052488 podStartE2EDuration="1m3.267169424s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:32:34.039089856 +0000 UTC m=+64.777633822" lastFinishedPulling="2026-04-23 13:32:39.020206814 +0000 UTC m=+69.758750758" observedRunningTime="2026-04-23 13:32:40.264933916 +0000 UTC m=+71.003477895" watchObservedRunningTime="2026-04-23 13:32:40.267169424 +0000 UTC m=+71.005713391" Apr 23 13:32:40.310869 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.310824 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrmk8" podStartSLOduration=58.261795831 podStartE2EDuration="1m3.310810703s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:32:33.97126058 +0000 UTC m=+64.709804539" lastFinishedPulling="2026-04-23 13:32:39.020275457 
+0000 UTC m=+69.758819411" observedRunningTime="2026-04-23 13:32:40.309924747 +0000 UTC m=+71.048468735" watchObservedRunningTime="2026-04-23 13:32:40.310810703 +0000 UTC m=+71.049354669" Apr 23 13:32:40.346752 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.346708 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" podStartSLOduration=2.346693462 podStartE2EDuration="2.346693462s" podCreationTimestamp="2026-04-23 13:32:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:40.345261669 +0000 UTC m=+71.083805636" watchObservedRunningTime="2026-04-23 13:32:40.346693462 +0000 UTC m=+71.085237427" Apr 23 13:32:40.364281 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.364229 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-knl89" podStartSLOduration=44.4832077 podStartE2EDuration="49.364216249s" podCreationTimestamp="2026-04-23 13:31:51 +0000 UTC" firstStartedPulling="2026-04-23 13:32:34.139214578 +0000 UTC m=+64.877758529" lastFinishedPulling="2026-04-23 13:32:39.02022312 +0000 UTC m=+69.758767078" observedRunningTime="2026-04-23 13:32:40.363157851 +0000 UTC m=+71.101701817" watchObservedRunningTime="2026-04-23 13:32:40.364216249 +0000 UTC m=+71.102760268" Apr 23 13:32:40.382921 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.382878 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fzqps" podStartSLOduration=34.572063592 podStartE2EDuration="39.382865575s" podCreationTimestamp="2026-04-23 13:32:01 +0000 UTC" firstStartedPulling="2026-04-23 13:32:34.209422397 +0000 UTC m=+64.947966341" lastFinishedPulling="2026-04-23 13:32:39.020224362 +0000 UTC m=+69.758768324" observedRunningTime="2026-04-23 13:32:40.38243067 +0000 
UTC m=+71.120974635" watchObservedRunningTime="2026-04-23 13:32:40.382865575 +0000 UTC m=+71.121409540" Apr 23 13:32:40.411207 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.411161 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vf74v" podStartSLOduration=34.562855749 podStartE2EDuration="39.411147407s" podCreationTimestamp="2026-04-23 13:32:01 +0000 UTC" firstStartedPulling="2026-04-23 13:32:34.17129791 +0000 UTC m=+64.909841853" lastFinishedPulling="2026-04-23 13:32:39.019589563 +0000 UTC m=+69.758133511" observedRunningTime="2026-04-23 13:32:40.410920815 +0000 UTC m=+71.149464781" watchObservedRunningTime="2026-04-23 13:32:40.411147407 +0000 UTC m=+71.149691374" Apr 23 13:32:40.432031 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:40.431989 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gdstf" podStartSLOduration=67.18086808300001 podStartE2EDuration="1m11.431975984s" podCreationTimestamp="2026-04-23 13:31:29 +0000 UTC" firstStartedPulling="2026-04-23 13:32:35.066099441 +0000 UTC m=+65.804643399" lastFinishedPulling="2026-04-23 13:32:39.317207351 +0000 UTC m=+70.055751300" observedRunningTime="2026-04-23 13:32:40.430186288 +0000 UTC m=+71.168730266" watchObservedRunningTime="2026-04-23 13:32:40.431975984 +0000 UTC m=+71.170519950" Apr 23 13:32:41.213163 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:41.213075 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9s6t5" event={"ID":"3223e65f-a60d-40dc-895a-90af469a9129","Type":"ContainerStarted","Data":"86227891c89ca5587746555a9a651902b450e13c8bbd7238fb5eb00960e40543"} Apr 23 13:32:42.088784 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:42.088732 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hclwj" Apr 23 13:32:43.220747 ip-10-0-137-187 
kubenswrapper[2565]: I0423 13:32:43.220710 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9s6t5" event={"ID":"3223e65f-a60d-40dc-895a-90af469a9129","Type":"ContainerStarted","Data":"313e6600543da59715b7803f0b38da7af8b1b048145143b7112bc47edf06bd28"} Apr 23 13:32:43.240564 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:43.240526 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9s6t5" podStartSLOduration=2.370617621 podStartE2EDuration="5.240513871s" podCreationTimestamp="2026-04-23 13:32:38 +0000 UTC" firstStartedPulling="2026-04-23 13:32:39.444919179 +0000 UTC m=+70.183463124" lastFinishedPulling="2026-04-23 13:32:42.314815429 +0000 UTC m=+73.053359374" observedRunningTime="2026-04-23 13:32:43.238784472 +0000 UTC m=+73.977328441" watchObservedRunningTime="2026-04-23 13:32:43.240513871 +0000 UTC m=+73.979057837" Apr 23 13:32:50.216288 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:50.216177 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vf74v" Apr 23 13:32:52.208063 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.208027 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xvwsk"] Apr 23 13:32:52.213452 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.213430 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.215877 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.215857 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 13:32:52.216127 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.216091 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-75lnf\"" Apr 23 13:32:52.216127 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.216107 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 13:32:52.216290 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.216152 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 13:32:52.217272 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.217255 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 13:32:52.330654 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.330613 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a7e8903-b596-4d37-8bf0-b654a520433b-sys\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.330859 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.330672 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8a7e8903-b596-4d37-8bf0-b654a520433b-metrics-client-ca\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" 
Apr 23 13:32:52.330859 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.330799 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8a7e8903-b596-4d37-8bf0-b654a520433b-root\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.330859 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.330841 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-wtmp\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.330975 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.330904 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phlhs\" (UniqueName: \"kubernetes.io/projected/8a7e8903-b596-4d37-8bf0-b654a520433b-kube-api-access-phlhs\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.330975 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.330922 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-textfile\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.330975 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.330948 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.331069 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.331009 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-tls\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.331069 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.331033 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-accelerators-collector-config\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.432049 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432016 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a7e8903-b596-4d37-8bf0-b654a520433b-sys\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.432235 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432064 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8a7e8903-b596-4d37-8bf0-b654a520433b-metrics-client-ca\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.432235 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432093 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8a7e8903-b596-4d37-8bf0-b654a520433b-root\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.432235 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432122 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-wtmp\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.432235 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432141 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a7e8903-b596-4d37-8bf0-b654a520433b-sys\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.432235 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432171 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phlhs\" (UniqueName: \"kubernetes.io/projected/8a7e8903-b596-4d37-8bf0-b654a520433b-kube-api-access-phlhs\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.432235 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432197 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-textfile\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.432235 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432218 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8a7e8903-b596-4d37-8bf0-b654a520433b-root\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.432235 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432225 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.432619 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432265 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-tls\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.432619 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432282 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-wtmp\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.432619 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432289 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-accelerators-collector-config\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 
13:32:52.432619 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:52.432360 2565 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 13:32:52.432619 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:32:52.432423 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-tls podName:8a7e8903-b596-4d37-8bf0-b654a520433b nodeName:}" failed. No retries permitted until 2026-04-23 13:32:52.932403315 +0000 UTC m=+83.670947272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-tls") pod "node-exporter-xvwsk" (UID: "8a7e8903-b596-4d37-8bf0-b654a520433b") : secret "node-exporter-tls" not found Apr 23 13:32:52.432619 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432544 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-textfile\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.432958 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432699 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8a7e8903-b596-4d37-8bf0-b654a520433b-metrics-client-ca\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.432958 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.432917 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-accelerators-collector-config\") pod 
\"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.435192 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.435172 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.440779 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.440694 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phlhs\" (UniqueName: \"kubernetes.io/projected/8a7e8903-b596-4d37-8bf0-b654a520433b-kube-api-access-phlhs\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.936557 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.936508 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-tls\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:52.938738 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:52.938718 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8a7e8903-b596-4d37-8bf0-b654a520433b-node-exporter-tls\") pod \"node-exporter-xvwsk\" (UID: \"8a7e8903-b596-4d37-8bf0-b654a520433b\") " pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:53.122312 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:53.122277 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xvwsk" Apr 23 13:32:53.131746 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:32:53.131718 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a7e8903_b596_4d37_8bf0_b654a520433b.slice/crio-e98e359e5182740a2346c6447d09b85b42bcdfaeedccc00a034a7b3760ee6af7 WatchSource:0}: Error finding container e98e359e5182740a2346c6447d09b85b42bcdfaeedccc00a034a7b3760ee6af7: Status 404 returned error can't find the container with id e98e359e5182740a2346c6447d09b85b42bcdfaeedccc00a034a7b3760ee6af7 Apr 23 13:32:53.249881 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:53.249804 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xvwsk" event={"ID":"8a7e8903-b596-4d37-8bf0-b654a520433b","Type":"ContainerStarted","Data":"e98e359e5182740a2346c6447d09b85b42bcdfaeedccc00a034a7b3760ee6af7"} Apr 23 13:32:54.253885 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:54.253808 2565 generic.go:358] "Generic (PLEG): container finished" podID="8a7e8903-b596-4d37-8bf0-b654a520433b" containerID="ca4dcc0bd826fc6b4d91947e9977f0037bcdfaad4f66bbe193166c553ac080ef" exitCode=0 Apr 23 13:32:54.254243 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:54.253899 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xvwsk" event={"ID":"8a7e8903-b596-4d37-8bf0-b654a520433b","Type":"ContainerDied","Data":"ca4dcc0bd826fc6b4d91947e9977f0037bcdfaad4f66bbe193166c553ac080ef"} Apr 23 13:32:55.257994 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:55.257955 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xvwsk" event={"ID":"8a7e8903-b596-4d37-8bf0-b654a520433b","Type":"ContainerStarted","Data":"94a49b8f8237a88e25c5e602636b3d33069d31a40dc92cca98e14b826cb5e621"} Apr 23 13:32:55.257994 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:55.257995 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xvwsk" event={"ID":"8a7e8903-b596-4d37-8bf0-b654a520433b","Type":"ContainerStarted","Data":"fef177f9b01afe34e49cc28b0efe0864718ff89415a318c5a4fc314a3d515ff6"} Apr 23 13:32:55.283482 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:55.283436 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xvwsk" podStartSLOduration=2.432979233 podStartE2EDuration="3.283424103s" podCreationTimestamp="2026-04-23 13:32:52 +0000 UTC" firstStartedPulling="2026-04-23 13:32:53.133885669 +0000 UTC m=+83.872429612" lastFinishedPulling="2026-04-23 13:32:53.984330534 +0000 UTC m=+84.722874482" observedRunningTime="2026-04-23 13:32:55.281777184 +0000 UTC m=+86.020321216" watchObservedRunningTime="2026-04-23 13:32:55.283424103 +0000 UTC m=+86.021968102" Apr 23 13:32:58.709381 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:32:58.709353 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697697656d-5zdsf" Apr 23 13:33:01.217496 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:01.217465 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-78d5f7b556-pkxnd" Apr 23 13:33:03.723299 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:03.723253 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697697656d-5zdsf" podUID="e2fed6c1-6174-4b2b-884a-12bca4486716" containerName="registry" containerID="cri-o://7a3e48812e5215b271bb39481f8f5696cca9a5a7d4be7ecc69a820be6faf9f0d" gracePeriod=30 Apr 23 13:33:03.956778 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:03.956738 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697697656d-5zdsf" Apr 23 13:33:04.010880 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.010798 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls\") pod \"e2fed6c1-6174-4b2b-884a-12bca4486716\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " Apr 23 13:33:04.010880 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.010842 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-certificates\") pod \"e2fed6c1-6174-4b2b-884a-12bca4486716\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " Apr 23 13:33:04.010880 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.010863 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2fed6c1-6174-4b2b-884a-12bca4486716-installation-pull-secrets\") pod \"e2fed6c1-6174-4b2b-884a-12bca4486716\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " Apr 23 13:33:04.011106 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.010890 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf6lt\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-kube-api-access-wf6lt\") pod \"e2fed6c1-6174-4b2b-884a-12bca4486716\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " Apr 23 13:33:04.011106 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.010958 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2fed6c1-6174-4b2b-884a-12bca4486716-ca-trust-extracted\") pod \"e2fed6c1-6174-4b2b-884a-12bca4486716\" (UID: 
\"e2fed6c1-6174-4b2b-884a-12bca4486716\") " Apr 23 13:33:04.011106 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.010988 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2fed6c1-6174-4b2b-884a-12bca4486716-trusted-ca\") pod \"e2fed6c1-6174-4b2b-884a-12bca4486716\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " Apr 23 13:33:04.011106 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.011022 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-bound-sa-token\") pod \"e2fed6c1-6174-4b2b-884a-12bca4486716\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " Apr 23 13:33:04.011106 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.011068 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e2fed6c1-6174-4b2b-884a-12bca4486716-image-registry-private-configuration\") pod \"e2fed6c1-6174-4b2b-884a-12bca4486716\" (UID: \"e2fed6c1-6174-4b2b-884a-12bca4486716\") " Apr 23 13:33:04.011388 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.011332 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e2fed6c1-6174-4b2b-884a-12bca4486716" (UID: "e2fed6c1-6174-4b2b-884a-12bca4486716"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:33:04.011526 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.011479 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2fed6c1-6174-4b2b-884a-12bca4486716-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e2fed6c1-6174-4b2b-884a-12bca4486716" (UID: "e2fed6c1-6174-4b2b-884a-12bca4486716"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:33:04.013391 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.013367 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2fed6c1-6174-4b2b-884a-12bca4486716-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e2fed6c1-6174-4b2b-884a-12bca4486716" (UID: "e2fed6c1-6174-4b2b-884a-12bca4486716"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:33:04.013537 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.013452 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2fed6c1-6174-4b2b-884a-12bca4486716-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e2fed6c1-6174-4b2b-884a-12bca4486716" (UID: "e2fed6c1-6174-4b2b-884a-12bca4486716"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:33:04.013661 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.013644 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-kube-api-access-wf6lt" (OuterVolumeSpecName: "kube-api-access-wf6lt") pod "e2fed6c1-6174-4b2b-884a-12bca4486716" (UID: "e2fed6c1-6174-4b2b-884a-12bca4486716"). InnerVolumeSpecName "kube-api-access-wf6lt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:33:04.013878 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.013850 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e2fed6c1-6174-4b2b-884a-12bca4486716" (UID: "e2fed6c1-6174-4b2b-884a-12bca4486716"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:33:04.013878 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.013855 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e2fed6c1-6174-4b2b-884a-12bca4486716" (UID: "e2fed6c1-6174-4b2b-884a-12bca4486716"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:33:04.019558 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.019533 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2fed6c1-6174-4b2b-884a-12bca4486716-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e2fed6c1-6174-4b2b-884a-12bca4486716" (UID: "e2fed6c1-6174-4b2b-884a-12bca4486716"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:33:04.111793 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.111748 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wf6lt\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-kube-api-access-wf6lt\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:33:04.111793 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.111787 2565 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2fed6c1-6174-4b2b-884a-12bca4486716-ca-trust-extracted\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:33:04.111793 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.111798 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2fed6c1-6174-4b2b-884a-12bca4486716-trusted-ca\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:33:04.111998 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.111807 2565 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-bound-sa-token\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:33:04.111998 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.111818 2565 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e2fed6c1-6174-4b2b-884a-12bca4486716-image-registry-private-configuration\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:33:04.111998 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.111827 2565 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 
13:33:04.111998 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.111836 2565 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2fed6c1-6174-4b2b-884a-12bca4486716-registry-certificates\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:33:04.111998 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.111844 2565 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2fed6c1-6174-4b2b-884a-12bca4486716-installation-pull-secrets\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:33:04.284640 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.284559 2565 generic.go:358] "Generic (PLEG): container finished" podID="e2fed6c1-6174-4b2b-884a-12bca4486716" containerID="7a3e48812e5215b271bb39481f8f5696cca9a5a7d4be7ecc69a820be6faf9f0d" exitCode=0 Apr 23 13:33:04.284783 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.284650 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697697656d-5zdsf" Apr 23 13:33:04.284783 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.284646 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697697656d-5zdsf" event={"ID":"e2fed6c1-6174-4b2b-884a-12bca4486716","Type":"ContainerDied","Data":"7a3e48812e5215b271bb39481f8f5696cca9a5a7d4be7ecc69a820be6faf9f0d"} Apr 23 13:33:04.284783 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.284693 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697697656d-5zdsf" event={"ID":"e2fed6c1-6174-4b2b-884a-12bca4486716","Type":"ContainerDied","Data":"973bc633e0e6857ca4cfaa8ce7e341ede173eae3288f1a7dfddc08f6c8895bc1"} Apr 23 13:33:04.284783 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.284709 2565 scope.go:117] "RemoveContainer" containerID="7a3e48812e5215b271bb39481f8f5696cca9a5a7d4be7ecc69a820be6faf9f0d" Apr 23 13:33:04.292485 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.292467 2565 scope.go:117] "RemoveContainer" containerID="7a3e48812e5215b271bb39481f8f5696cca9a5a7d4be7ecc69a820be6faf9f0d" Apr 23 13:33:04.292787 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:33:04.292744 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a3e48812e5215b271bb39481f8f5696cca9a5a7d4be7ecc69a820be6faf9f0d\": container with ID starting with 7a3e48812e5215b271bb39481f8f5696cca9a5a7d4be7ecc69a820be6faf9f0d not found: ID does not exist" containerID="7a3e48812e5215b271bb39481f8f5696cca9a5a7d4be7ecc69a820be6faf9f0d" Apr 23 13:33:04.292894 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.292792 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a3e48812e5215b271bb39481f8f5696cca9a5a7d4be7ecc69a820be6faf9f0d"} err="failed to get container status 
\"7a3e48812e5215b271bb39481f8f5696cca9a5a7d4be7ecc69a820be6faf9f0d\": rpc error: code = NotFound desc = could not find container \"7a3e48812e5215b271bb39481f8f5696cca9a5a7d4be7ecc69a820be6faf9f0d\": container with ID starting with 7a3e48812e5215b271bb39481f8f5696cca9a5a7d4be7ecc69a820be6faf9f0d not found: ID does not exist" Apr 23 13:33:04.306130 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.306108 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697697656d-5zdsf"] Apr 23 13:33:04.309675 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:04.309649 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697697656d-5zdsf"] Apr 23 13:33:05.839780 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:05.839739 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2fed6c1-6174-4b2b-884a-12bca4486716" path="/var/lib/kubelet/pods/e2fed6c1-6174-4b2b-884a-12bca4486716/volumes" Apr 23 13:33:21.334922 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:21.334888 2565 generic.go:358] "Generic (PLEG): container finished" podID="512b3fcf-e8c1-4eb7-b755-9d8efa3083a5" containerID="306d976278ec086844073083ff1522576cca3fde13b4903c6a7ac844b3a13774" exitCode=0 Apr 23 13:33:21.335289 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:21.334965 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g" event={"ID":"512b3fcf-e8c1-4eb7-b755-9d8efa3083a5","Type":"ContainerDied","Data":"306d976278ec086844073083ff1522576cca3fde13b4903c6a7ac844b3a13774"} Apr 23 13:33:21.335331 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:21.335288 2565 scope.go:117] "RemoveContainer" containerID="306d976278ec086844073083ff1522576cca3fde13b4903c6a7ac844b3a13774" Apr 23 13:33:22.339906 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:22.339869 2565 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7xk4g" event={"ID":"512b3fcf-e8c1-4eb7-b755-9d8efa3083a5","Type":"ContainerStarted","Data":"c74270c2becbc1af21d42d32f4ffab85aa89b162e822287f1cd0844b58dbbdde"} Apr 23 13:33:36.380513 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:36.380481 2565 generic.go:358] "Generic (PLEG): container finished" podID="334930fe-79d2-4d7d-9fd2-1c2db1eaf771" containerID="49158717f3939a570bf4a6b848bc2a6174b59676839813904bd2a486a83b4450" exitCode=0 Apr 23 13:33:36.380815 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:36.380549 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-c6hg5" event={"ID":"334930fe-79d2-4d7d-9fd2-1c2db1eaf771","Type":"ContainerDied","Data":"49158717f3939a570bf4a6b848bc2a6174b59676839813904bd2a486a83b4450"} Apr 23 13:33:36.380890 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:36.380876 2565 scope.go:117] "RemoveContainer" containerID="49158717f3939a570bf4a6b848bc2a6174b59676839813904bd2a486a83b4450" Apr 23 13:33:37.384367 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:37.384325 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-c6hg5" event={"ID":"334930fe-79d2-4d7d-9fd2-1c2db1eaf771","Type":"ContainerStarted","Data":"eea54d35c3c46c2896e5dbdf150a9012ea19d8a8d8e79d31805b4f7a188e29d0"} Apr 23 13:33:41.396216 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:41.396183 2565 generic.go:358] "Generic (PLEG): container finished" podID="506e9f6c-b41d-4cad-9333-d952e8630ef9" containerID="581354dd5445c55f510232943ffcac2c3271538a8e0752f728381e8084b85f72" exitCode=0 Apr 23 13:33:41.396567 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:41.396255 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj" 
event={"ID":"506e9f6c-b41d-4cad-9333-d952e8630ef9","Type":"ContainerDied","Data":"581354dd5445c55f510232943ffcac2c3271538a8e0752f728381e8084b85f72"} Apr 23 13:33:41.396567 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:41.396555 2565 scope.go:117] "RemoveContainer" containerID="581354dd5445c55f510232943ffcac2c3271538a8e0752f728381e8084b85f72" Apr 23 13:33:42.400738 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:33:42.400696 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9flbj" event={"ID":"506e9f6c-b41d-4cad-9333-d952e8630ef9","Type":"ContainerStarted","Data":"7057f2e76478c5d78aae8fef9021af22c2198d5dce0cf2eaf9fdd8a0b7e0b321"} Apr 23 13:35:39.499188 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.499147 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx"] Apr 23 13:35:39.499749 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.499433 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2fed6c1-6174-4b2b-884a-12bca4486716" containerName="registry" Apr 23 13:35:39.499749 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.499449 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2fed6c1-6174-4b2b-884a-12bca4486716" containerName="registry" Apr 23 13:35:39.499749 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.499517 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2fed6c1-6174-4b2b-884a-12bca4486716" containerName="registry" Apr 23 13:35:39.502435 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.502413 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx" Apr 23 13:35:39.505021 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.504997 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 23 13:35:39.505152 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.505004 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 23 13:35:39.505152 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.505062 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-2wzh6\"" Apr 23 13:35:39.505305 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.505144 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 23 13:35:39.511322 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.511301 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx"] Apr 23 13:35:39.593072 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.593040 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/2e79f612-3ebf-43ea-b096-12556978b1a3-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-v68fx\" (UID: \"2e79f612-3ebf-43ea-b096-12556978b1a3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx" Apr 23 13:35:39.593214 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.593078 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqcw2\" (UniqueName: \"kubernetes.io/projected/2e79f612-3ebf-43ea-b096-12556978b1a3-kube-api-access-jqcw2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-v68fx\" (UID: 
\"2e79f612-3ebf-43ea-b096-12556978b1a3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx" Apr 23 13:35:39.693731 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.693699 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/2e79f612-3ebf-43ea-b096-12556978b1a3-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-v68fx\" (UID: \"2e79f612-3ebf-43ea-b096-12556978b1a3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx" Apr 23 13:35:39.693859 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.693740 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqcw2\" (UniqueName: \"kubernetes.io/projected/2e79f612-3ebf-43ea-b096-12556978b1a3-kube-api-access-jqcw2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-v68fx\" (UID: \"2e79f612-3ebf-43ea-b096-12556978b1a3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx" Apr 23 13:35:39.695992 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.695966 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/2e79f612-3ebf-43ea-b096-12556978b1a3-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-v68fx\" (UID: \"2e79f612-3ebf-43ea-b096-12556978b1a3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx" Apr 23 13:35:39.705107 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.705086 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqcw2\" (UniqueName: \"kubernetes.io/projected/2e79f612-3ebf-43ea-b096-12556978b1a3-kube-api-access-jqcw2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-v68fx\" (UID: \"2e79f612-3ebf-43ea-b096-12556978b1a3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx" Apr 23 13:35:39.813632 ip-10-0-137-187 
kubenswrapper[2565]: I0423 13:35:39.813557 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx" Apr 23 13:35:39.934774 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:39.934733 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx"] Apr 23 13:35:39.936711 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:35:39.936686 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e79f612_3ebf_43ea_b096_12556978b1a3.slice/crio-bdf3fd80ecc027a8e08e9dfa5a33f23a34faa16b9a6a70b23c2528cde683902c WatchSource:0}: Error finding container bdf3fd80ecc027a8e08e9dfa5a33f23a34faa16b9a6a70b23c2528cde683902c: Status 404 returned error can't find the container with id bdf3fd80ecc027a8e08e9dfa5a33f23a34faa16b9a6a70b23c2528cde683902c Apr 23 13:35:40.731846 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:40.731812 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx" event={"ID":"2e79f612-3ebf-43ea-b096-12556978b1a3","Type":"ContainerStarted","Data":"bdf3fd80ecc027a8e08e9dfa5a33f23a34faa16b9a6a70b23c2528cde683902c"} Apr 23 13:35:46.703177 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.703142 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-g67fp"] Apr 23 13:35:46.706409 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.706392 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-g67fp" Apr 23 13:35:46.709206 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.709185 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 23 13:35:46.709312 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.709187 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 23 13:35:46.709378 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.709304 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-nzzjx\"" Apr 23 13:35:46.714922 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.714899 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-g67fp"] Apr 23 13:35:46.740994 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.740967 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcz4l\" (UniqueName: \"kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-kube-api-access-qcz4l\") pod \"keda-operator-ffbb595cb-g67fp\" (UID: \"c4036a60-192b-4213-83a0-e7ff36e3c8d4\") " pod="openshift-keda/keda-operator-ffbb595cb-g67fp" Apr 23 13:35:46.741106 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.741012 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates\") pod \"keda-operator-ffbb595cb-g67fp\" (UID: \"c4036a60-192b-4213-83a0-e7ff36e3c8d4\") " pod="openshift-keda/keda-operator-ffbb595cb-g67fp" Apr 23 13:35:46.741106 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.741065 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: 
\"kubernetes.io/configmap/c4036a60-192b-4213-83a0-e7ff36e3c8d4-cabundle0\") pod \"keda-operator-ffbb595cb-g67fp\" (UID: \"c4036a60-192b-4213-83a0-e7ff36e3c8d4\") " pod="openshift-keda/keda-operator-ffbb595cb-g67fp" Apr 23 13:35:46.752264 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.752240 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx" event={"ID":"2e79f612-3ebf-43ea-b096-12556978b1a3","Type":"ContainerStarted","Data":"1eb23f0b40d1a894c1d828972d41581fc5fe0c67aff4a6b08a4b4e7d56952ae1"} Apr 23 13:35:46.752427 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.752416 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx" Apr 23 13:35:46.770232 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.770186 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx" podStartSLOduration=1.490500313 podStartE2EDuration="7.77017425s" podCreationTimestamp="2026-04-23 13:35:39 +0000 UTC" firstStartedPulling="2026-04-23 13:35:39.938827947 +0000 UTC m=+250.677371891" lastFinishedPulling="2026-04-23 13:35:46.218501884 +0000 UTC m=+256.957045828" observedRunningTime="2026-04-23 13:35:46.770032828 +0000 UTC m=+257.508576796" watchObservedRunningTime="2026-04-23 13:35:46.77017425 +0000 UTC m=+257.508718216" Apr 23 13:35:46.842126 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.842087 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcz4l\" (UniqueName: \"kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-kube-api-access-qcz4l\") pod \"keda-operator-ffbb595cb-g67fp\" (UID: \"c4036a60-192b-4213-83a0-e7ff36e3c8d4\") " pod="openshift-keda/keda-operator-ffbb595cb-g67fp" Apr 23 13:35:46.842318 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.842137 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates\") pod \"keda-operator-ffbb595cb-g67fp\" (UID: \"c4036a60-192b-4213-83a0-e7ff36e3c8d4\") " pod="openshift-keda/keda-operator-ffbb595cb-g67fp" Apr 23 13:35:46.842318 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.842181 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c4036a60-192b-4213-83a0-e7ff36e3c8d4-cabundle0\") pod \"keda-operator-ffbb595cb-g67fp\" (UID: \"c4036a60-192b-4213-83a0-e7ff36e3c8d4\") " pod="openshift-keda/keda-operator-ffbb595cb-g67fp" Apr 23 13:35:46.842318 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:46.842305 2565 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 23 13:35:46.842478 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:46.842329 2565 secret.go:281] references non-existent secret key: ca.crt Apr 23 13:35:46.842478 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:46.842339 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 23 13:35:46.842478 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:46.842355 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-g67fp: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 23 13:35:46.842478 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:46.842407 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates podName:c4036a60-192b-4213-83a0-e7ff36e3c8d4 nodeName:}" failed. No retries permitted until 2026-04-23 13:35:47.342389486 +0000 UTC m=+258.080933444 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates") pod "keda-operator-ffbb595cb-g67fp" (UID: "c4036a60-192b-4213-83a0-e7ff36e3c8d4") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 23 13:35:46.842963 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.842938 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c4036a60-192b-4213-83a0-e7ff36e3c8d4-cabundle0\") pod \"keda-operator-ffbb595cb-g67fp\" (UID: \"c4036a60-192b-4213-83a0-e7ff36e3c8d4\") " pod="openshift-keda/keda-operator-ffbb595cb-g67fp" Apr 23 13:35:46.851042 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.851018 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcz4l\" (UniqueName: \"kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-kube-api-access-qcz4l\") pod \"keda-operator-ffbb595cb-g67fp\" (UID: \"c4036a60-192b-4213-83a0-e7ff36e3c8d4\") " pod="openshift-keda/keda-operator-ffbb595cb-g67fp" Apr 23 13:35:46.996923 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:46.996837 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"] Apr 23 13:35:47.000287 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.000269 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:47.002854 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.002831 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 23 13:35:47.008189 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.008168 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"]
Apr 23 13:35:47.044382 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.044338 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kjlsp\" (UID: \"c9366e86-bf5c-4b5e-bed8-71cd23a01453\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:47.044515 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.044420 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6vjk\" (UniqueName: \"kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-kube-api-access-p6vjk\") pod \"keda-metrics-apiserver-7c9f485588-kjlsp\" (UID: \"c9366e86-bf5c-4b5e-bed8-71cd23a01453\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:47.044515 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.044448 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c9366e86-bf5c-4b5e-bed8-71cd23a01453-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kjlsp\" (UID: \"c9366e86-bf5c-4b5e-bed8-71cd23a01453\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:47.145494 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.145461 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kjlsp\" (UID: \"c9366e86-bf5c-4b5e-bed8-71cd23a01453\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:47.145665 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.145524 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6vjk\" (UniqueName: \"kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-kube-api-access-p6vjk\") pod \"keda-metrics-apiserver-7c9f485588-kjlsp\" (UID: \"c9366e86-bf5c-4b5e-bed8-71cd23a01453\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:47.145665 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.145561 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c9366e86-bf5c-4b5e-bed8-71cd23a01453-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kjlsp\" (UID: \"c9366e86-bf5c-4b5e-bed8-71cd23a01453\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:47.145665 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:47.145617 2565 secret.go:281] references non-existent secret key: tls.crt
Apr 23 13:35:47.145665 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:47.145635 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 13:35:47.145665 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:47.145655 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp: references non-existent secret key: tls.crt
Apr 23 13:35:47.145937 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:47.145700 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates podName:c9366e86-bf5c-4b5e-bed8-71cd23a01453 nodeName:}" failed. No retries permitted until 2026-04-23 13:35:47.645686653 +0000 UTC m=+258.384230597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates") pod "keda-metrics-apiserver-7c9f485588-kjlsp" (UID: "c9366e86-bf5c-4b5e-bed8-71cd23a01453") : references non-existent secret key: tls.crt
Apr 23 13:35:47.146075 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.146043 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c9366e86-bf5c-4b5e-bed8-71cd23a01453-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kjlsp\" (UID: \"c9366e86-bf5c-4b5e-bed8-71cd23a01453\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:47.155308 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.155272 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6vjk\" (UniqueName: \"kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-kube-api-access-p6vjk\") pod \"keda-metrics-apiserver-7c9f485588-kjlsp\" (UID: \"c9366e86-bf5c-4b5e-bed8-71cd23a01453\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:47.274457 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.274378 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-jmv2v"]
Apr 23 13:35:47.279852 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.278815 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-jmv2v"
Apr 23 13:35:47.282771 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.282728 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 23 13:35:47.287258 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.287233 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-jmv2v"]
Apr 23 13:35:47.347704 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.347671 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7e8ecfca-b4a5-454f-a3dd-9dc1f591cb49-certificates\") pod \"keda-admission-cf49989db-jmv2v\" (UID: \"7e8ecfca-b4a5-454f-a3dd-9dc1f591cb49\") " pod="openshift-keda/keda-admission-cf49989db-jmv2v"
Apr 23 13:35:47.347869 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.347750 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lgkg\" (UniqueName: \"kubernetes.io/projected/7e8ecfca-b4a5-454f-a3dd-9dc1f591cb49-kube-api-access-5lgkg\") pod \"keda-admission-cf49989db-jmv2v\" (UID: \"7e8ecfca-b4a5-454f-a3dd-9dc1f591cb49\") " pod="openshift-keda/keda-admission-cf49989db-jmv2v"
Apr 23 13:35:47.347928 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.347864 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates\") pod \"keda-operator-ffbb595cb-g67fp\" (UID: \"c4036a60-192b-4213-83a0-e7ff36e3c8d4\") " pod="openshift-keda/keda-operator-ffbb595cb-g67fp"
Apr 23 13:35:47.347996 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:47.347983 2565 secret.go:281] references non-existent secret key: ca.crt
Apr 23 13:35:47.348031 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:47.347999 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 13:35:47.348031 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:47.348008 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-g67fp: references non-existent secret key: ca.crt
Apr 23 13:35:47.348094 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:47.348053 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates podName:c4036a60-192b-4213-83a0-e7ff36e3c8d4 nodeName:}" failed. No retries permitted until 2026-04-23 13:35:48.348040249 +0000 UTC m=+259.086584192 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates") pod "keda-operator-ffbb595cb-g67fp" (UID: "c4036a60-192b-4213-83a0-e7ff36e3c8d4") : references non-existent secret key: ca.crt
Apr 23 13:35:47.448961 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.448929 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7e8ecfca-b4a5-454f-a3dd-9dc1f591cb49-certificates\") pod \"keda-admission-cf49989db-jmv2v\" (UID: \"7e8ecfca-b4a5-454f-a3dd-9dc1f591cb49\") " pod="openshift-keda/keda-admission-cf49989db-jmv2v"
Apr 23 13:35:47.449148 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.449066 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lgkg\" (UniqueName: \"kubernetes.io/projected/7e8ecfca-b4a5-454f-a3dd-9dc1f591cb49-kube-api-access-5lgkg\") pod \"keda-admission-cf49989db-jmv2v\" (UID: \"7e8ecfca-b4a5-454f-a3dd-9dc1f591cb49\") " pod="openshift-keda/keda-admission-cf49989db-jmv2v"
Apr 23 13:35:47.451511 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.451488 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7e8ecfca-b4a5-454f-a3dd-9dc1f591cb49-certificates\") pod \"keda-admission-cf49989db-jmv2v\" (UID: \"7e8ecfca-b4a5-454f-a3dd-9dc1f591cb49\") " pod="openshift-keda/keda-admission-cf49989db-jmv2v"
Apr 23 13:35:47.457240 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.457214 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lgkg\" (UniqueName: \"kubernetes.io/projected/7e8ecfca-b4a5-454f-a3dd-9dc1f591cb49-kube-api-access-5lgkg\") pod \"keda-admission-cf49989db-jmv2v\" (UID: \"7e8ecfca-b4a5-454f-a3dd-9dc1f591cb49\") " pod="openshift-keda/keda-admission-cf49989db-jmv2v"
Apr 23 13:35:47.590411 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.590330 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-jmv2v"
Apr 23 13:35:47.651961 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.651785 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kjlsp\" (UID: \"c9366e86-bf5c-4b5e-bed8-71cd23a01453\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:47.651961 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:47.651956 2565 secret.go:281] references non-existent secret key: tls.crt
Apr 23 13:35:47.652182 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:47.651973 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 13:35:47.652182 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:47.651997 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp: references non-existent secret key: tls.crt
Apr 23 13:35:47.652182 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:47.652054 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates podName:c9366e86-bf5c-4b5e-bed8-71cd23a01453 nodeName:}" failed. No retries permitted until 2026-04-23 13:35:48.652036203 +0000 UTC m=+259.390580150 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates") pod "keda-metrics-apiserver-7c9f485588-kjlsp" (UID: "c9366e86-bf5c-4b5e-bed8-71cd23a01453") : references non-existent secret key: tls.crt
Apr 23 13:35:47.711561 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.711540 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-jmv2v"]
Apr 23 13:35:47.713278 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:35:47.713247 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e8ecfca_b4a5_454f_a3dd_9dc1f591cb49.slice/crio-88ec272e3ee37855ec4ff769bf75b842b14bbab19fac9bd0c9e889256fcdeeec WatchSource:0}: Error finding container 88ec272e3ee37855ec4ff769bf75b842b14bbab19fac9bd0c9e889256fcdeeec: Status 404 returned error can't find the container with id 88ec272e3ee37855ec4ff769bf75b842b14bbab19fac9bd0c9e889256fcdeeec
Apr 23 13:35:47.757353 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:47.757324 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-jmv2v" event={"ID":"7e8ecfca-b4a5-454f-a3dd-9dc1f591cb49","Type":"ContainerStarted","Data":"88ec272e3ee37855ec4ff769bf75b842b14bbab19fac9bd0c9e889256fcdeeec"}
Apr 23 13:35:48.356939 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:48.356906 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates\") pod \"keda-operator-ffbb595cb-g67fp\" (UID: \"c4036a60-192b-4213-83a0-e7ff36e3c8d4\") " pod="openshift-keda/keda-operator-ffbb595cb-g67fp"
Apr 23 13:35:48.357118 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:48.357066 2565 secret.go:281] references non-existent secret key: ca.crt
Apr 23 13:35:48.357118 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:48.357084 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 13:35:48.357118 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:48.357092 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-g67fp: references non-existent secret key: ca.crt
Apr 23 13:35:48.357235 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:48.357147 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates podName:c4036a60-192b-4213-83a0-e7ff36e3c8d4 nodeName:}" failed. No retries permitted until 2026-04-23 13:35:50.357131831 +0000 UTC m=+261.095675774 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates") pod "keda-operator-ffbb595cb-g67fp" (UID: "c4036a60-192b-4213-83a0-e7ff36e3c8d4") : references non-existent secret key: ca.crt
Apr 23 13:35:48.659072 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:48.658985 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kjlsp\" (UID: \"c9366e86-bf5c-4b5e-bed8-71cd23a01453\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:48.659236 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:48.659141 2565 secret.go:281] references non-existent secret key: tls.crt
Apr 23 13:35:48.659236 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:48.659163 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 13:35:48.659236 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:48.659184 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp: references non-existent secret key: tls.crt
Apr 23 13:35:48.659392 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:48.659249 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates podName:c9366e86-bf5c-4b5e-bed8-71cd23a01453 nodeName:}" failed. No retries permitted until 2026-04-23 13:35:50.659227067 +0000 UTC m=+261.397771028 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates") pod "keda-metrics-apiserver-7c9f485588-kjlsp" (UID: "c9366e86-bf5c-4b5e-bed8-71cd23a01453") : references non-existent secret key: tls.crt
Apr 23 13:35:49.765365 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:49.765331 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-jmv2v" event={"ID":"7e8ecfca-b4a5-454f-a3dd-9dc1f591cb49","Type":"ContainerStarted","Data":"a96aec993f2f0e508c7cd0e8f9278c1bf9f8ae46e9f4229ffb4d779ade01ed6c"}
Apr 23 13:35:49.765743 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:49.765428 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-jmv2v"
Apr 23 13:35:49.781915 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:49.781872 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-jmv2v" podStartSLOduration=1.117961904 podStartE2EDuration="2.781860122s" podCreationTimestamp="2026-04-23 13:35:47 +0000 UTC" firstStartedPulling="2026-04-23 13:35:47.714543542 +0000 UTC m=+258.453087485" lastFinishedPulling="2026-04-23 13:35:49.378441744 +0000 UTC m=+260.116985703" observedRunningTime="2026-04-23 13:35:49.781285514 +0000 UTC m=+260.519829484" watchObservedRunningTime="2026-04-23 13:35:49.781860122 +0000 UTC m=+260.520404088"
Apr 23 13:35:50.374626 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:50.374589 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates\") pod \"keda-operator-ffbb595cb-g67fp\" (UID: \"c4036a60-192b-4213-83a0-e7ff36e3c8d4\") " pod="openshift-keda/keda-operator-ffbb595cb-g67fp"
Apr 23 13:35:50.374807 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:50.374737 2565 secret.go:281] references non-existent secret key: ca.crt
Apr 23 13:35:50.374807 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:50.374771 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 13:35:50.374807 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:50.374780 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-g67fp: references non-existent secret key: ca.crt
Apr 23 13:35:50.374944 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:50.374848 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates podName:c4036a60-192b-4213-83a0-e7ff36e3c8d4 nodeName:}" failed. No retries permitted until 2026-04-23 13:35:54.374831817 +0000 UTC m=+265.113375761 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates") pod "keda-operator-ffbb595cb-g67fp" (UID: "c4036a60-192b-4213-83a0-e7ff36e3c8d4") : references non-existent secret key: ca.crt
Apr 23 13:35:50.677533 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:50.677496 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kjlsp\" (UID: \"c9366e86-bf5c-4b5e-bed8-71cd23a01453\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:50.677682 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:50.677639 2565 secret.go:281] references non-existent secret key: tls.crt
Apr 23 13:35:50.677682 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:50.677657 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 13:35:50.677682 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:50.677683 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp: references non-existent secret key: tls.crt
Apr 23 13:35:50.677812 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:35:50.677733 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates podName:c9366e86-bf5c-4b5e-bed8-71cd23a01453 nodeName:}" failed. No retries permitted until 2026-04-23 13:35:54.677720156 +0000 UTC m=+265.416264100 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates") pod "keda-metrics-apiserver-7c9f485588-kjlsp" (UID: "c9366e86-bf5c-4b5e-bed8-71cd23a01453") : references non-existent secret key: tls.crt
Apr 23 13:35:54.408950 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:54.408899 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates\") pod \"keda-operator-ffbb595cb-g67fp\" (UID: \"c4036a60-192b-4213-83a0-e7ff36e3c8d4\") " pod="openshift-keda/keda-operator-ffbb595cb-g67fp"
Apr 23 13:35:54.411332 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:54.411305 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c4036a60-192b-4213-83a0-e7ff36e3c8d4-certificates\") pod \"keda-operator-ffbb595cb-g67fp\" (UID: \"c4036a60-192b-4213-83a0-e7ff36e3c8d4\") " pod="openshift-keda/keda-operator-ffbb595cb-g67fp"
Apr 23 13:35:54.517140 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:54.517105 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-g67fp"
Apr 23 13:35:54.636883 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:54.636854 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-g67fp"]
Apr 23 13:35:54.638483 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:35:54.638455 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4036a60_192b_4213_83a0_e7ff36e3c8d4.slice/crio-e2ef75eb1b5b94243dfb361ac3dcaa6c352d84bc2e48ac3ea6090bc92c6496ff WatchSource:0}: Error finding container e2ef75eb1b5b94243dfb361ac3dcaa6c352d84bc2e48ac3ea6090bc92c6496ff: Status 404 returned error can't find the container with id e2ef75eb1b5b94243dfb361ac3dcaa6c352d84bc2e48ac3ea6090bc92c6496ff
Apr 23 13:35:54.711637 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:54.711611 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kjlsp\" (UID: \"c9366e86-bf5c-4b5e-bed8-71cd23a01453\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:54.714027 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:54.714008 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c9366e86-bf5c-4b5e-bed8-71cd23a01453-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kjlsp\" (UID: \"c9366e86-bf5c-4b5e-bed8-71cd23a01453\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:54.782296 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:54.782260 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-g67fp" event={"ID":"c4036a60-192b-4213-83a0-e7ff36e3c8d4","Type":"ContainerStarted","Data":"e2ef75eb1b5b94243dfb361ac3dcaa6c352d84bc2e48ac3ea6090bc92c6496ff"}
Apr 23 13:35:54.811629 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:54.811593 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:54.925392 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:54.925349 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"]
Apr 23 13:35:54.927421 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:35:54.927395 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9366e86_bf5c_4b5e_bed8_71cd23a01453.slice/crio-ebbd12f04aab970d796eb15c226bee20b9012119fafd9d212ee2fa021c62397c WatchSource:0}: Error finding container ebbd12f04aab970d796eb15c226bee20b9012119fafd9d212ee2fa021c62397c: Status 404 returned error can't find the container with id ebbd12f04aab970d796eb15c226bee20b9012119fafd9d212ee2fa021c62397c
Apr 23 13:35:55.787801 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:55.787735 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp" event={"ID":"c9366e86-bf5c-4b5e-bed8-71cd23a01453","Type":"ContainerStarted","Data":"ebbd12f04aab970d796eb15c226bee20b9012119fafd9d212ee2fa021c62397c"}
Apr 23 13:35:58.805825 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:58.805781 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp" event={"ID":"c9366e86-bf5c-4b5e-bed8-71cd23a01453","Type":"ContainerStarted","Data":"8d4cb0f461b45799519aff2c963f38aa501bde62b90ecde8109ca204a1e02ade"}
Apr 23 13:35:58.806253 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:58.806134 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:35:58.807257 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:58.807230 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-g67fp" event={"ID":"c4036a60-192b-4213-83a0-e7ff36e3c8d4","Type":"ContainerStarted","Data":"a28b9f154c7056b5e9837b2705e169f0c8ac1b05a209d5271fb6ff7ae0cfef4b"}
Apr 23 13:35:58.807384 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:58.807342 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-g67fp"
Apr 23 13:35:58.823471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:58.823403 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp" podStartSLOduration=9.138913241 podStartE2EDuration="12.823391386s" podCreationTimestamp="2026-04-23 13:35:46 +0000 UTC" firstStartedPulling="2026-04-23 13:35:54.928842801 +0000 UTC m=+265.667386756" lastFinishedPulling="2026-04-23 13:35:58.613320943 +0000 UTC m=+269.351864901" observedRunningTime="2026-04-23 13:35:58.821802607 +0000 UTC m=+269.560346576" watchObservedRunningTime="2026-04-23 13:35:58.823391386 +0000 UTC m=+269.561935351"
Apr 23 13:35:58.838098 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:35:58.838054 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-g67fp" podStartSLOduration=8.859920902 podStartE2EDuration="12.838042877s" podCreationTimestamp="2026-04-23 13:35:46 +0000 UTC" firstStartedPulling="2026-04-23 13:35:54.639998655 +0000 UTC m=+265.378542609" lastFinishedPulling="2026-04-23 13:35:58.618120628 +0000 UTC m=+269.356664584" observedRunningTime="2026-04-23 13:35:58.83627465 +0000 UTC m=+269.574818617" watchObservedRunningTime="2026-04-23 13:35:58.838042877 +0000 UTC m=+269.576586843"
Apr 23 13:36:07.759629 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:07.759596 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v68fx"
Apr 23 13:36:09.814809 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:09.814779 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kjlsp"
Apr 23 13:36:10.771396 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:10.771358 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-jmv2v"
Apr 23 13:36:19.813082 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:19.813048 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-g67fp"
Apr 23 13:36:29.730458 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:29.730426 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log"
Apr 23 13:36:29.730938 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:29.730555 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log"
Apr 23 13:36:29.740334 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:29.740298 2565 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 13:36:49.941333 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:49.941298 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v"]
Apr 23 13:36:49.944640 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:49.944620 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v"
Apr 23 13:36:49.947599 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:49.947578 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 23 13:36:49.948821 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:49.948801 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 23 13:36:49.948928 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:49.948849 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-jb8z5\""
Apr 23 13:36:49.948928 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:49.948873 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 23 13:36:49.954525 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:49.954505 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v"]
Apr 23 13:36:49.983515 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:49.983487 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59388ce3-3fdf-4929-aa70-04dc029a00e1-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-6ks5v\" (UID: \"59388ce3-3fdf-4929-aa70-04dc029a00e1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v"
Apr 23 13:36:49.983667 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:49.983526 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh998\" (UniqueName: \"kubernetes.io/projected/59388ce3-3fdf-4929-aa70-04dc029a00e1-kube-api-access-nh998\") pod \"llmisvc-controller-manager-68cc5db7c4-6ks5v\" (UID: \"59388ce3-3fdf-4929-aa70-04dc029a00e1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v"
Apr 23 13:36:50.084326 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:50.084295 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59388ce3-3fdf-4929-aa70-04dc029a00e1-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-6ks5v\" (UID: \"59388ce3-3fdf-4929-aa70-04dc029a00e1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v"
Apr 23 13:36:50.084501 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:50.084344 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nh998\" (UniqueName: \"kubernetes.io/projected/59388ce3-3fdf-4929-aa70-04dc029a00e1-kube-api-access-nh998\") pod \"llmisvc-controller-manager-68cc5db7c4-6ks5v\" (UID: \"59388ce3-3fdf-4929-aa70-04dc029a00e1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v"
Apr 23 13:36:50.086808 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:50.086785 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59388ce3-3fdf-4929-aa70-04dc029a00e1-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-6ks5v\" (UID: \"59388ce3-3fdf-4929-aa70-04dc029a00e1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v"
Apr 23 13:36:50.093084 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:50.093053 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh998\" (UniqueName: \"kubernetes.io/projected/59388ce3-3fdf-4929-aa70-04dc029a00e1-kube-api-access-nh998\") pod \"llmisvc-controller-manager-68cc5db7c4-6ks5v\" (UID: \"59388ce3-3fdf-4929-aa70-04dc029a00e1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v"
Apr 23 13:36:50.255588 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:50.255480 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v"
Apr 23 13:36:50.375374 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:50.375353 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v"]
Apr 23 13:36:50.377481 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:36:50.377450 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod59388ce3_3fdf_4929_aa70_04dc029a00e1.slice/crio-2ae4a1225b9a176cbf8f71045771bcb8956e4014a71a4283af0dc615b27b7b8f WatchSource:0}: Error finding container 2ae4a1225b9a176cbf8f71045771bcb8956e4014a71a4283af0dc615b27b7b8f: Status 404 returned error can't find the container with id 2ae4a1225b9a176cbf8f71045771bcb8956e4014a71a4283af0dc615b27b7b8f
Apr 23 13:36:50.378855 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:50.378831 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:36:50.970843 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:50.970811 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v" event={"ID":"59388ce3-3fdf-4929-aa70-04dc029a00e1","Type":"ContainerStarted","Data":"2ae4a1225b9a176cbf8f71045771bcb8956e4014a71a4283af0dc615b27b7b8f"}
Apr 23 13:36:52.983930 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:52.983891 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v" event={"ID":"59388ce3-3fdf-4929-aa70-04dc029a00e1","Type":"ContainerStarted","Data":"f89978a440dd9ff482538a819bc52cbf7ce6e8eb390f188605b092c7322cd3e0"}
Apr 23 13:36:52.984321 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:52.984120 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v"
Apr 23 13:36:53.002610 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:36:53.002561 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v" podStartSLOduration=2.013395237 podStartE2EDuration="4.002547629s" podCreationTimestamp="2026-04-23 13:36:49 +0000 UTC" firstStartedPulling="2026-04-23 13:36:50.379019384 +0000 UTC m=+321.117563330" lastFinishedPulling="2026-04-23 13:36:52.368171765 +0000 UTC m=+323.106715722" observedRunningTime="2026-04-23 13:36:53.000626932 +0000 UTC m=+323.739170934" watchObservedRunningTime="2026-04-23 13:36:53.002547629 +0000 UTC m=+323.741091594"
Apr 23 13:37:23.990224 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:37:23.990141 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6ks5v"
Apr 23 13:38:25.812550 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:25.812514 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q"]
Apr 23 13:38:25.815981 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:25.815961 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q"
Apr 23 13:38:25.819364 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:25.819346 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 23 13:38:25.819584 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:25.819564 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4qx2w\""
Apr 23 13:38:25.834054 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:25.834023 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q"]
Apr 23 13:38:25.874656 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:25.874624 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ed558843-7b26-4337-8bbe-93d6b48fd601-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-tbx7q\" (UID: \"ed558843-7b26-4337-8bbe-93d6b48fd601\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q"
Apr 23 13:38:25.874843 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:25.874683 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klztx\" (UniqueName: \"kubernetes.io/projected/ed558843-7b26-4337-8bbe-93d6b48fd601-kube-api-access-klztx\") pod \"seaweedfs-tls-custom-ddd4dbfd-tbx7q\" (UID: \"ed558843-7b26-4337-8bbe-93d6b48fd601\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q"
Apr 23 13:38:25.975891 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:25.975855 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ed558843-7b26-4337-8bbe-93d6b48fd601-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-tbx7q\" (UID: \"ed558843-7b26-4337-8bbe-93d6b48fd601\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q"
Apr 23 13:38:25.976077 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:25.975922 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klztx\" (UniqueName: \"kubernetes.io/projected/ed558843-7b26-4337-8bbe-93d6b48fd601-kube-api-access-klztx\") pod \"seaweedfs-tls-custom-ddd4dbfd-tbx7q\" (UID: \"ed558843-7b26-4337-8bbe-93d6b48fd601\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q"
Apr 23 13:38:25.976255 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:25.976234 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ed558843-7b26-4337-8bbe-93d6b48fd601-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-tbx7q\" (UID: \"ed558843-7b26-4337-8bbe-93d6b48fd601\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q"
Apr 23 13:38:25.986306 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:25.986282 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klztx\" (UniqueName: \"kubernetes.io/projected/ed558843-7b26-4337-8bbe-93d6b48fd601-kube-api-access-klztx\") pod \"seaweedfs-tls-custom-ddd4dbfd-tbx7q\" (UID: \"ed558843-7b26-4337-8bbe-93d6b48fd601\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q"
Apr 23 13:38:26.124676 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:26.124582 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q" Apr 23 13:38:26.450241 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:26.450215 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q"] Apr 23 13:38:26.452593 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:38:26.452566 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded558843_7b26_4337_8bbe_93d6b48fd601.slice/crio-fe43848aeee9669a8bbe62e3f220e8025ec3eba60a4c238738371aa891193b26 WatchSource:0}: Error finding container fe43848aeee9669a8bbe62e3f220e8025ec3eba60a4c238738371aa891193b26: Status 404 returned error can't find the container with id fe43848aeee9669a8bbe62e3f220e8025ec3eba60a4c238738371aa891193b26 Apr 23 13:38:27.287557 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:27.287519 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q" event={"ID":"ed558843-7b26-4337-8bbe-93d6b48fd601","Type":"ContainerStarted","Data":"fe43848aeee9669a8bbe62e3f220e8025ec3eba60a4c238738371aa891193b26"} Apr 23 13:38:29.299403 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:29.299365 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q" event={"ID":"ed558843-7b26-4337-8bbe-93d6b48fd601","Type":"ContainerStarted","Data":"3c42727515ffe18f87e1e3590745a840ef4d496304c37971d3977062b08408ca"} Apr 23 13:38:29.316814 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:29.316747 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q" podStartSLOduration=1.666794774 podStartE2EDuration="4.316731954s" podCreationTimestamp="2026-04-23 13:38:25 +0000 UTC" firstStartedPulling="2026-04-23 13:38:26.454294936 +0000 UTC m=+417.192838880" lastFinishedPulling="2026-04-23 13:38:29.104232101 +0000 UTC m=+419.842776060" 
observedRunningTime="2026-04-23 13:38:29.315285377 +0000 UTC m=+420.053829344" watchObservedRunningTime="2026-04-23 13:38:29.316731954 +0000 UTC m=+420.055275920" Apr 23 13:38:30.004963 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:30.004920 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q"] Apr 23 13:38:31.305631 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:31.305566 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q" podUID="ed558843-7b26-4337-8bbe-93d6b48fd601" containerName="seaweedfs-tls-custom" containerID="cri-o://3c42727515ffe18f87e1e3590745a840ef4d496304c37971d3977062b08408ca" gracePeriod=30 Apr 23 13:38:59.842620 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:59.842596 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q" Apr 23 13:38:59.963542 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:59.963504 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klztx\" (UniqueName: \"kubernetes.io/projected/ed558843-7b26-4337-8bbe-93d6b48fd601-kube-api-access-klztx\") pod \"ed558843-7b26-4337-8bbe-93d6b48fd601\" (UID: \"ed558843-7b26-4337-8bbe-93d6b48fd601\") " Apr 23 13:38:59.963745 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:59.963580 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ed558843-7b26-4337-8bbe-93d6b48fd601-data\") pod \"ed558843-7b26-4337-8bbe-93d6b48fd601\" (UID: \"ed558843-7b26-4337-8bbe-93d6b48fd601\") " Apr 23 13:38:59.964823 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:59.964796 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed558843-7b26-4337-8bbe-93d6b48fd601-data" (OuterVolumeSpecName: "data") pod "ed558843-7b26-4337-8bbe-93d6b48fd601" (UID: 
"ed558843-7b26-4337-8bbe-93d6b48fd601"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:38:59.965801 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:38:59.965778 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed558843-7b26-4337-8bbe-93d6b48fd601-kube-api-access-klztx" (OuterVolumeSpecName: "kube-api-access-klztx") pod "ed558843-7b26-4337-8bbe-93d6b48fd601" (UID: "ed558843-7b26-4337-8bbe-93d6b48fd601"). InnerVolumeSpecName "kube-api-access-klztx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:39:00.064778 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.064738 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-klztx\" (UniqueName: \"kubernetes.io/projected/ed558843-7b26-4337-8bbe-93d6b48fd601-kube-api-access-klztx\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:39:00.064936 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.064789 2565 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ed558843-7b26-4337-8bbe-93d6b48fd601-data\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:39:00.400623 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.400542 2565 generic.go:358] "Generic (PLEG): container finished" podID="ed558843-7b26-4337-8bbe-93d6b48fd601" containerID="3c42727515ffe18f87e1e3590745a840ef4d496304c37971d3977062b08408ca" exitCode=0 Apr 23 13:39:00.400623 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.400602 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q" Apr 23 13:39:00.400623 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.400612 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q" event={"ID":"ed558843-7b26-4337-8bbe-93d6b48fd601","Type":"ContainerDied","Data":"3c42727515ffe18f87e1e3590745a840ef4d496304c37971d3977062b08408ca"} Apr 23 13:39:00.400876 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.400637 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q" event={"ID":"ed558843-7b26-4337-8bbe-93d6b48fd601","Type":"ContainerDied","Data":"fe43848aeee9669a8bbe62e3f220e8025ec3eba60a4c238738371aa891193b26"} Apr 23 13:39:00.400876 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.400654 2565 scope.go:117] "RemoveContainer" containerID="3c42727515ffe18f87e1e3590745a840ef4d496304c37971d3977062b08408ca" Apr 23 13:39:00.410137 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.410117 2565 scope.go:117] "RemoveContainer" containerID="3c42727515ffe18f87e1e3590745a840ef4d496304c37971d3977062b08408ca" Apr 23 13:39:00.410364 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:39:00.410346 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c42727515ffe18f87e1e3590745a840ef4d496304c37971d3977062b08408ca\": container with ID starting with 3c42727515ffe18f87e1e3590745a840ef4d496304c37971d3977062b08408ca not found: ID does not exist" containerID="3c42727515ffe18f87e1e3590745a840ef4d496304c37971d3977062b08408ca" Apr 23 13:39:00.410410 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.410371 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c42727515ffe18f87e1e3590745a840ef4d496304c37971d3977062b08408ca"} err="failed to get container status \"3c42727515ffe18f87e1e3590745a840ef4d496304c37971d3977062b08408ca\": rpc error: code = 
NotFound desc = could not find container \"3c42727515ffe18f87e1e3590745a840ef4d496304c37971d3977062b08408ca\": container with ID starting with 3c42727515ffe18f87e1e3590745a840ef4d496304c37971d3977062b08408ca not found: ID does not exist" Apr 23 13:39:00.421228 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.421204 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q"] Apr 23 13:39:00.424511 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.424490 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-tbx7q"] Apr 23 13:39:00.450289 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.450261 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj"] Apr 23 13:39:00.450576 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.450564 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed558843-7b26-4337-8bbe-93d6b48fd601" containerName="seaweedfs-tls-custom" Apr 23 13:39:00.450576 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.450577 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed558843-7b26-4337-8bbe-93d6b48fd601" containerName="seaweedfs-tls-custom" Apr 23 13:39:00.450648 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.450636 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed558843-7b26-4337-8bbe-93d6b48fd601" containerName="seaweedfs-tls-custom" Apr 23 13:39:00.455113 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.455095 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj" Apr 23 13:39:00.457960 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.457923 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 23 13:39:00.458203 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.458117 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 23 13:39:00.458203 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.458150 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4qx2w\"" Apr 23 13:39:00.460176 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.460158 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj"] Apr 23 13:39:00.568687 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.568651 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/f49064d1-2187-4ca9-9d04-0004a1321d9a-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-5jtvj\" (UID: \"f49064d1-2187-4ca9-9d04-0004a1321d9a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj" Apr 23 13:39:00.568907 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.568749 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f49064d1-2187-4ca9-9d04-0004a1321d9a-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-5jtvj\" (UID: \"f49064d1-2187-4ca9-9d04-0004a1321d9a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj" Apr 23 13:39:00.568907 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.568800 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgjmn\" (UniqueName: 
\"kubernetes.io/projected/f49064d1-2187-4ca9-9d04-0004a1321d9a-kube-api-access-lgjmn\") pod \"seaweedfs-tls-custom-5c88b85bb7-5jtvj\" (UID: \"f49064d1-2187-4ca9-9d04-0004a1321d9a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj" Apr 23 13:39:00.669696 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.669600 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f49064d1-2187-4ca9-9d04-0004a1321d9a-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-5jtvj\" (UID: \"f49064d1-2187-4ca9-9d04-0004a1321d9a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj" Apr 23 13:39:00.669696 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.669640 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgjmn\" (UniqueName: \"kubernetes.io/projected/f49064d1-2187-4ca9-9d04-0004a1321d9a-kube-api-access-lgjmn\") pod \"seaweedfs-tls-custom-5c88b85bb7-5jtvj\" (UID: \"f49064d1-2187-4ca9-9d04-0004a1321d9a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj" Apr 23 13:39:00.669941 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.669694 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/f49064d1-2187-4ca9-9d04-0004a1321d9a-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-5jtvj\" (UID: \"f49064d1-2187-4ca9-9d04-0004a1321d9a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj" Apr 23 13:39:00.670090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.670064 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f49064d1-2187-4ca9-9d04-0004a1321d9a-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-5jtvj\" (UID: \"f49064d1-2187-4ca9-9d04-0004a1321d9a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj" Apr 23 13:39:00.672184 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.672158 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/f49064d1-2187-4ca9-9d04-0004a1321d9a-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-5jtvj\" (UID: \"f49064d1-2187-4ca9-9d04-0004a1321d9a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj" Apr 23 13:39:00.678647 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.678609 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgjmn\" (UniqueName: \"kubernetes.io/projected/f49064d1-2187-4ca9-9d04-0004a1321d9a-kube-api-access-lgjmn\") pod \"seaweedfs-tls-custom-5c88b85bb7-5jtvj\" (UID: \"f49064d1-2187-4ca9-9d04-0004a1321d9a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj" Apr 23 13:39:00.765001 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.764942 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj" Apr 23 13:39:00.885206 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:00.885180 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj"] Apr 23 13:39:00.887290 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:39:00.887263 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49064d1_2187_4ca9_9d04_0004a1321d9a.slice/crio-8e63a66555619dcb71ea5fdb482b93648ff5686327dbc98e2f5e5e76e8cd3dd8 WatchSource:0}: Error finding container 8e63a66555619dcb71ea5fdb482b93648ff5686327dbc98e2f5e5e76e8cd3dd8: Status 404 returned error can't find the container with id 8e63a66555619dcb71ea5fdb482b93648ff5686327dbc98e2f5e5e76e8cd3dd8 Apr 23 13:39:01.405223 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:01.405134 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj" 
event={"ID":"f49064d1-2187-4ca9-9d04-0004a1321d9a","Type":"ContainerStarted","Data":"2ee7920b73a753100792566870e445b0aea9d4573173ba05d2a38721c09fd3a8"} Apr 23 13:39:01.405223 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:01.405175 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj" event={"ID":"f49064d1-2187-4ca9-9d04-0004a1321d9a","Type":"ContainerStarted","Data":"8e63a66555619dcb71ea5fdb482b93648ff5686327dbc98e2f5e5e76e8cd3dd8"} Apr 23 13:39:01.422633 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:01.422583 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5jtvj" podStartSLOduration=1.160292056 podStartE2EDuration="1.422569145s" podCreationTimestamp="2026-04-23 13:39:00 +0000 UTC" firstStartedPulling="2026-04-23 13:39:00.888536297 +0000 UTC m=+451.627080245" lastFinishedPulling="2026-04-23 13:39:01.150813376 +0000 UTC m=+451.889357334" observedRunningTime="2026-04-23 13:39:01.421509593 +0000 UTC m=+452.160053581" watchObservedRunningTime="2026-04-23 13:39:01.422569145 +0000 UTC m=+452.161113110" Apr 23 13:39:01.840454 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:01.840417 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed558843-7b26-4337-8bbe-93d6b48fd601" path="/var/lib/kubelet/pods/ed558843-7b26-4337-8bbe-93d6b48fd601/volumes" Apr 23 13:39:28.272071 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.272038 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l"] Apr 23 13:39:28.274554 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.274534 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:28.277666 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.277646 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\"" Apr 23 13:39:28.277791 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.277672 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\"" Apr 23 13:39:28.277791 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.277672 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:39:28.277906 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.277795 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 13:39:28.278966 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.278951 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:39:28.285390 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.285367 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l"] Apr 23 13:39:28.297969 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.297946 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c1a47db5-2629-4686-aae3-519eb6d306a6-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6d65749c76-lkj5l\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:28.298068 ip-10-0-137-187 
kubenswrapper[2565]: I0423 13:39:28.297993 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1a47db5-2629-4686-aae3-519eb6d306a6-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6d65749c76-lkj5l\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:28.298068 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.298028 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7955j\" (UniqueName: \"kubernetes.io/projected/c1a47db5-2629-4686-aae3-519eb6d306a6-kube-api-access-7955j\") pod \"isvc-sklearn-batcher-predictor-6d65749c76-lkj5l\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:28.298152 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.298069 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1a47db5-2629-4686-aae3-519eb6d306a6-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6d65749c76-lkj5l\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:28.399207 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.399178 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1a47db5-2629-4686-aae3-519eb6d306a6-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6d65749c76-lkj5l\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:28.399378 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.399227 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c1a47db5-2629-4686-aae3-519eb6d306a6-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6d65749c76-lkj5l\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:28.399378 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.399259 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1a47db5-2629-4686-aae3-519eb6d306a6-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6d65749c76-lkj5l\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:28.399378 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.399363 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7955j\" (UniqueName: \"kubernetes.io/projected/c1a47db5-2629-4686-aae3-519eb6d306a6-kube-api-access-7955j\") pod \"isvc-sklearn-batcher-predictor-6d65749c76-lkj5l\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:28.399653 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.399629 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1a47db5-2629-4686-aae3-519eb6d306a6-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6d65749c76-lkj5l\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:28.400012 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.399990 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/c1a47db5-2629-4686-aae3-519eb6d306a6-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6d65749c76-lkj5l\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:28.401737 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.401719 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1a47db5-2629-4686-aae3-519eb6d306a6-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6d65749c76-lkj5l\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:28.410043 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.410014 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7955j\" (UniqueName: \"kubernetes.io/projected/c1a47db5-2629-4686-aae3-519eb6d306a6-kube-api-access-7955j\") pod \"isvc-sklearn-batcher-predictor-6d65749c76-lkj5l\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:28.585637 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.585556 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:28.708900 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:28.708877 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l"] Apr 23 13:39:28.710893 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:39:28.710867 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1a47db5_2629_4686_aae3_519eb6d306a6.slice/crio-cd75dc55079290236ba2eb8012e2cb3e1a8c69aba304014b4b30565b6b104b2d WatchSource:0}: Error finding container cd75dc55079290236ba2eb8012e2cb3e1a8c69aba304014b4b30565b6b104b2d: Status 404 returned error can't find the container with id cd75dc55079290236ba2eb8012e2cb3e1a8c69aba304014b4b30565b6b104b2d Apr 23 13:39:29.497554 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:29.497511 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" event={"ID":"c1a47db5-2629-4686-aae3-519eb6d306a6","Type":"ContainerStarted","Data":"cd75dc55079290236ba2eb8012e2cb3e1a8c69aba304014b4b30565b6b104b2d"} Apr 23 13:39:33.513722 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:33.513681 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" event={"ID":"c1a47db5-2629-4686-aae3-519eb6d306a6","Type":"ContainerStarted","Data":"508ba92a5025df1d56042a78d8b89d5b6e8de02d78b6aef629808d822712e1ee"} Apr 23 13:39:36.525186 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:36.525154 2565 generic.go:358] "Generic (PLEG): container finished" podID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerID="508ba92a5025df1d56042a78d8b89d5b6e8de02d78b6aef629808d822712e1ee" exitCode=0 Apr 23 13:39:36.525555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:36.525230 2565 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" event={"ID":"c1a47db5-2629-4686-aae3-519eb6d306a6","Type":"ContainerDied","Data":"508ba92a5025df1d56042a78d8b89d5b6e8de02d78b6aef629808d822712e1ee"} Apr 23 13:39:50.583175 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:50.583098 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" event={"ID":"c1a47db5-2629-4686-aae3-519eb6d306a6","Type":"ContainerStarted","Data":"a2927404077abb36548faad500820cc4e2a424ab1c8f3324083a6e00ea45184a"} Apr 23 13:39:53.596075 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:53.596036 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" event={"ID":"c1a47db5-2629-4686-aae3-519eb6d306a6","Type":"ContainerStarted","Data":"237c2d0260d9f20a6c4eed4c44ea5bdfe56076069e1f08cff51e80b9e75c0f40"} Apr 23 13:39:56.609790 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:56.609715 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" event={"ID":"c1a47db5-2629-4686-aae3-519eb6d306a6","Type":"ContainerStarted","Data":"c5e837d049d0b979571afc37f02e5803a79dc0a89d963c9e09fcadab07e9f6d2"} Apr 23 13:39:56.610207 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:56.609984 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:56.610207 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:56.610123 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:56.611347 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:56.611305 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" 
podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 23 13:39:56.631291 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:56.631252 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podStartSLOduration=1.79738997 podStartE2EDuration="28.631240241s" podCreationTimestamp="2026-04-23 13:39:28 +0000 UTC" firstStartedPulling="2026-04-23 13:39:28.71273743 +0000 UTC m=+479.451281374" lastFinishedPulling="2026-04-23 13:39:55.5465877 +0000 UTC m=+506.285131645" observedRunningTime="2026-04-23 13:39:56.628968953 +0000 UTC m=+507.367512941" watchObservedRunningTime="2026-04-23 13:39:56.631240241 +0000 UTC m=+507.369784207" Apr 23 13:39:57.613115 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:57.613077 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:57.613563 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:57.613133 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 23 13:39:57.614150 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:57.614124 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:39:57.616847 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:57.616829 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:39:58.616778 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:58.616735 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 23 13:39:58.617157 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:58.617095 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:39:59.620433 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:59.620396 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 23 13:39:59.620879 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:39:59.620703 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:40:09.621154 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:40:09.621107 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 23 13:40:09.621595 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:40:09.621567 2565 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:40:19.620982 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:40:19.620884 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 23 13:40:19.621427 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:40:19.621265 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:40:29.620395 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:40:29.620345 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 23 13:40:29.620833 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:40:29.620781 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:40:39.621131 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:40:39.621082 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 23 13:40:39.621686 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:40:39.621504 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:40:49.620649 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:40:49.620602 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 23 13:40:49.621119 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:40:49.621090 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:40:59.620976 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:40:59.620944 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:40:59.621465 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:40:59.621385 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:41:13.288789 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.288722 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l"] Apr 23 13:41:13.289357 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.289080 2565 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container" containerID="cri-o://a2927404077abb36548faad500820cc4e2a424ab1c8f3324083a6e00ea45184a" gracePeriod=30 Apr 23 13:41:13.289357 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.289139 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kube-rbac-proxy" containerID="cri-o://237c2d0260d9f20a6c4eed4c44ea5bdfe56076069e1f08cff51e80b9e75c0f40" gracePeriod=30 Apr 23 13:41:13.289357 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.289298 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent" containerID="cri-o://c5e837d049d0b979571afc37f02e5803a79dc0a89d963c9e09fcadab07e9f6d2" gracePeriod=30 Apr 23 13:41:13.383154 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.383121 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr"] Apr 23 13:41:13.386802 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.386784 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:13.389316 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.389294 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\"" Apr 23 13:41:13.389487 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.389473 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\"" Apr 23 13:41:13.394104 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.394063 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36a95235-2116-42a2-ab57-2b8119e445a4-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:13.394224 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.394119 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36a95235-2116-42a2-ab57-2b8119e445a4-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:13.394224 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.394141 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns4bf\" (UniqueName: \"kubernetes.io/projected/36a95235-2116-42a2-ab57-2b8119e445a4-kube-api-access-ns4bf\") pod \"isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:13.394424 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.394314 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/36a95235-2116-42a2-ab57-2b8119e445a4-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:13.396620 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.396596 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr"] Apr 23 13:41:13.495186 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.495147 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36a95235-2116-42a2-ab57-2b8119e445a4-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:13.495332 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.495193 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36a95235-2116-42a2-ab57-2b8119e445a4-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:13.495332 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.495215 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ns4bf\" (UniqueName: \"kubernetes.io/projected/36a95235-2116-42a2-ab57-2b8119e445a4-kube-api-access-ns4bf\") pod \"isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:13.495332 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.495265 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/36a95235-2116-42a2-ab57-2b8119e445a4-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:13.495332 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:41:13.495314 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-serving-cert: secret "isvc-sklearn-batcher-custom-predictor-serving-cert" not found Apr 23 13:41:13.495487 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:41:13.495388 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36a95235-2116-42a2-ab57-2b8119e445a4-proxy-tls podName:36a95235-2116-42a2-ab57-2b8119e445a4 nodeName:}" failed. No retries permitted until 2026-04-23 13:41:13.995371292 +0000 UTC m=+584.733915248 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/36a95235-2116-42a2-ab57-2b8119e445a4-proxy-tls") pod "isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" (UID: "36a95235-2116-42a2-ab57-2b8119e445a4") : secret "isvc-sklearn-batcher-custom-predictor-serving-cert" not found Apr 23 13:41:13.495581 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.495558 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36a95235-2116-42a2-ab57-2b8119e445a4-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:13.495941 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.495923 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/36a95235-2116-42a2-ab57-2b8119e445a4-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:13.504753 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.504733 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns4bf\" (UniqueName: \"kubernetes.io/projected/36a95235-2116-42a2-ab57-2b8119e445a4-kube-api-access-ns4bf\") pod \"isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:13.863048 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.863017 2565 generic.go:358] "Generic (PLEG): container finished" 
podID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerID="237c2d0260d9f20a6c4eed4c44ea5bdfe56076069e1f08cff51e80b9e75c0f40" exitCode=2 Apr 23 13:41:13.863221 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:13.863084 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" event={"ID":"c1a47db5-2629-4686-aae3-519eb6d306a6","Type":"ContainerDied","Data":"237c2d0260d9f20a6c4eed4c44ea5bdfe56076069e1f08cff51e80b9e75c0f40"} Apr 23 13:41:14.000077 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:14.000044 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36a95235-2116-42a2-ab57-2b8119e445a4-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:14.002502 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:14.002479 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36a95235-2116-42a2-ab57-2b8119e445a4-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:14.298090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:14.298057 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:14.421514 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:14.421487 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr"] Apr 23 13:41:14.423487 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:41:14.423460 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36a95235_2116_42a2_ab57_2b8119e445a4.slice/crio-1c345acca45780d907ea96ff02ef363b9699499ae78674273ee90e0284529da7 WatchSource:0}: Error finding container 1c345acca45780d907ea96ff02ef363b9699499ae78674273ee90e0284529da7: Status 404 returned error can't find the container with id 1c345acca45780d907ea96ff02ef363b9699499ae78674273ee90e0284529da7 Apr 23 13:41:14.868114 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:14.868082 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" event={"ID":"36a95235-2116-42a2-ab57-2b8119e445a4","Type":"ContainerStarted","Data":"c27ecc437bde6a1f231be33f3bb64e61eb174451cc4449ca26a4cfe6d070737b"} Apr 23 13:41:14.868114 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:14.868117 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" event={"ID":"36a95235-2116-42a2-ab57-2b8119e445a4","Type":"ContainerStarted","Data":"1c345acca45780d907ea96ff02ef363b9699499ae78674273ee90e0284529da7"} Apr 23 13:41:17.614178 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:17.614070 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.29:8643/healthz\": dial tcp 
10.132.0.29:8643: connect: connection refused" Apr 23 13:41:17.882548 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:17.882521 2565 generic.go:358] "Generic (PLEG): container finished" podID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerID="a2927404077abb36548faad500820cc4e2a424ab1c8f3324083a6e00ea45184a" exitCode=0 Apr 23 13:41:17.882676 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:17.882590 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" event={"ID":"c1a47db5-2629-4686-aae3-519eb6d306a6","Type":"ContainerDied","Data":"a2927404077abb36548faad500820cc4e2a424ab1c8f3324083a6e00ea45184a"} Apr 23 13:41:17.883700 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:17.883681 2565 generic.go:358] "Generic (PLEG): container finished" podID="36a95235-2116-42a2-ab57-2b8119e445a4" containerID="c27ecc437bde6a1f231be33f3bb64e61eb174451cc4449ca26a4cfe6d070737b" exitCode=0 Apr 23 13:41:17.883812 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:17.883730 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" event={"ID":"36a95235-2116-42a2-ab57-2b8119e445a4","Type":"ContainerDied","Data":"c27ecc437bde6a1f231be33f3bb64e61eb174451cc4449ca26a4cfe6d070737b"} Apr 23 13:41:18.888931 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:18.888896 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" event={"ID":"36a95235-2116-42a2-ab57-2b8119e445a4","Type":"ContainerStarted","Data":"c5791553479e87b709b351ae6299f07138c864cbc0673e9172b24e3cf74183ea"} Apr 23 13:41:18.889417 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:18.888938 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" 
event={"ID":"36a95235-2116-42a2-ab57-2b8119e445a4","Type":"ContainerStarted","Data":"14cc8412cccaf6d4a039598c1700f35be6cd943affda0110b09e53a013fd97ed"} Apr 23 13:41:18.889417 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:18.888954 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" event={"ID":"36a95235-2116-42a2-ab57-2b8119e445a4","Type":"ContainerStarted","Data":"df985b45dc7ca27497c6f103d4c31117b3bc6a5c82d9e756aa6e4a547ee6b3d3"} Apr 23 13:41:18.889417 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:18.889220 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:18.889417 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:18.889249 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:18.890821 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:18.890791 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:5000: connect: connection refused" Apr 23 13:41:18.911305 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:18.911268 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podStartSLOduration=5.911255923 podStartE2EDuration="5.911255923s" podCreationTimestamp="2026-04-23 13:41:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:41:18.909202197 +0000 UTC m=+589.647746163" watchObservedRunningTime="2026-04-23 13:41:18.911255923 +0000 UTC 
m=+589.649799889" Apr 23 13:41:19.620629 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:19.620588 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 23 13:41:19.622227 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:19.622202 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:41:19.892338 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:19.892244 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:19.892715 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:19.892434 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:5000: connect: connection refused" Apr 23 13:41:19.893378 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:19.893353 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:41:20.895699 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:20.895654 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:5000: connect: connection refused" Apr 23 13:41:20.896165 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:20.896143 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:41:22.613983 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:22.613939 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.29:8643/healthz\": dial tcp 10.132.0.29:8643: connect: connection refused" Apr 23 13:41:25.899998 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:25.899971 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:41:25.900560 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:25.900534 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:5000: connect: connection refused" Apr 23 13:41:25.900827 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:25.900799 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:41:27.613985 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:27.613947 2565 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.29:8643/healthz\": dial tcp 10.132.0.29:8643: connect: connection refused" Apr 23 13:41:27.614343 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:27.614066 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" Apr 23 13:41:29.620454 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:29.620411 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 23 13:41:29.621998 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:29.621959 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:41:29.754735 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:29.754707 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 13:41:29.755922 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:29.755901 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 13:41:32.613747 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:32.613702 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.29:8643/healthz\": dial tcp 10.132.0.29:8643: connect: connection refused" Apr 23 13:41:35.900495 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:35.900454 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:5000: connect: connection refused" Apr 23 13:41:35.900907 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:35.900795 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:41:37.613724 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:37.613676 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.29:8643/healthz\": dial tcp 10.132.0.29:8643: connect: connection refused" Apr 23 13:41:39.620839 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:39.620795 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 23 13:41:39.621208 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:39.620951 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l"
Apr 23 13:41:39.622324 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:39.622300 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:41:39.622429 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:39.622415 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l"
Apr 23 13:41:42.614082 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:42.614041 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.29:8643/healthz\": dial tcp 10.132.0.29:8643: connect: connection refused"
Apr 23 13:41:43.435529 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.435506 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l"
Apr 23 13:41:43.547930 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.547841 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1a47db5-2629-4686-aae3-519eb6d306a6-kserve-provision-location\") pod \"c1a47db5-2629-4686-aae3-519eb6d306a6\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") "
Apr 23 13:41:43.548093 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.547931 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1a47db5-2629-4686-aae3-519eb6d306a6-proxy-tls\") pod \"c1a47db5-2629-4686-aae3-519eb6d306a6\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") "
Apr 23 13:41:43.548093 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.547970 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7955j\" (UniqueName: \"kubernetes.io/projected/c1a47db5-2629-4686-aae3-519eb6d306a6-kube-api-access-7955j\") pod \"c1a47db5-2629-4686-aae3-519eb6d306a6\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") "
Apr 23 13:41:43.548093 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.547999 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c1a47db5-2629-4686-aae3-519eb6d306a6-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"c1a47db5-2629-4686-aae3-519eb6d306a6\" (UID: \"c1a47db5-2629-4686-aae3-519eb6d306a6\") "
Apr 23 13:41:43.548243 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.548160 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1a47db5-2629-4686-aae3-519eb6d306a6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c1a47db5-2629-4686-aae3-519eb6d306a6" (UID: "c1a47db5-2629-4686-aae3-519eb6d306a6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:41:43.548297 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.548271 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1a47db5-2629-4686-aae3-519eb6d306a6-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 13:41:43.548369 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.548350 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1a47db5-2629-4686-aae3-519eb6d306a6-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "c1a47db5-2629-4686-aae3-519eb6d306a6" (UID: "c1a47db5-2629-4686-aae3-519eb6d306a6"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:41:43.550088 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.550068 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a47db5-2629-4686-aae3-519eb6d306a6-kube-api-access-7955j" (OuterVolumeSpecName: "kube-api-access-7955j") pod "c1a47db5-2629-4686-aae3-519eb6d306a6" (UID: "c1a47db5-2629-4686-aae3-519eb6d306a6"). InnerVolumeSpecName "kube-api-access-7955j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:41:43.550088 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.550074 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1a47db5-2629-4686-aae3-519eb6d306a6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c1a47db5-2629-4686-aae3-519eb6d306a6" (UID: "c1a47db5-2629-4686-aae3-519eb6d306a6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:41:43.649421 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.649393 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1a47db5-2629-4686-aae3-519eb6d306a6-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 13:41:43.649421 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.649416 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7955j\" (UniqueName: \"kubernetes.io/projected/c1a47db5-2629-4686-aae3-519eb6d306a6-kube-api-access-7955j\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 13:41:43.649792 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.649427 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c1a47db5-2629-4686-aae3-519eb6d306a6-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 13:41:43.970887 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.970849 2565 generic.go:358] "Generic (PLEG): container finished" podID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerID="c5e837d049d0b979571afc37f02e5803a79dc0a89d963c9e09fcadab07e9f6d2" exitCode=0
Apr 23 13:41:43.971026 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.970892 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" event={"ID":"c1a47db5-2629-4686-aae3-519eb6d306a6","Type":"ContainerDied","Data":"c5e837d049d0b979571afc37f02e5803a79dc0a89d963c9e09fcadab07e9f6d2"}
Apr 23 13:41:43.971026 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.970932 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l" event={"ID":"c1a47db5-2629-4686-aae3-519eb6d306a6","Type":"ContainerDied","Data":"cd75dc55079290236ba2eb8012e2cb3e1a8c69aba304014b4b30565b6b104b2d"}
Apr 23 13:41:43.971026 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.970933 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l"
Apr 23 13:41:43.971026 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.970948 2565 scope.go:117] "RemoveContainer" containerID="c5e837d049d0b979571afc37f02e5803a79dc0a89d963c9e09fcadab07e9f6d2"
Apr 23 13:41:43.978921 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.978907 2565 scope.go:117] "RemoveContainer" containerID="237c2d0260d9f20a6c4eed4c44ea5bdfe56076069e1f08cff51e80b9e75c0f40"
Apr 23 13:41:43.985656 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.985639 2565 scope.go:117] "RemoveContainer" containerID="a2927404077abb36548faad500820cc4e2a424ab1c8f3324083a6e00ea45184a"
Apr 23 13:41:43.989928 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.989908 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l"]
Apr 23 13:41:43.992610 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.992596 2565 scope.go:117] "RemoveContainer" containerID="508ba92a5025df1d56042a78d8b89d5b6e8de02d78b6aef629808d822712e1ee"
Apr 23 13:41:43.995932 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.995912 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6d65749c76-lkj5l"]
Apr 23 13:41:43.999526 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.999497 2565 scope.go:117] "RemoveContainer" containerID="c5e837d049d0b979571afc37f02e5803a79dc0a89d963c9e09fcadab07e9f6d2"
Apr 23 13:41:43.999752 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:41:43.999734 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5e837d049d0b979571afc37f02e5803a79dc0a89d963c9e09fcadab07e9f6d2\": container with ID starting with c5e837d049d0b979571afc37f02e5803a79dc0a89d963c9e09fcadab07e9f6d2 not found: ID does not exist" containerID="c5e837d049d0b979571afc37f02e5803a79dc0a89d963c9e09fcadab07e9f6d2"
Apr 23 13:41:43.999833 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.999778 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e837d049d0b979571afc37f02e5803a79dc0a89d963c9e09fcadab07e9f6d2"} err="failed to get container status \"c5e837d049d0b979571afc37f02e5803a79dc0a89d963c9e09fcadab07e9f6d2\": rpc error: code = NotFound desc = could not find container \"c5e837d049d0b979571afc37f02e5803a79dc0a89d963c9e09fcadab07e9f6d2\": container with ID starting with c5e837d049d0b979571afc37f02e5803a79dc0a89d963c9e09fcadab07e9f6d2 not found: ID does not exist"
Apr 23 13:41:43.999833 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:43.999803 2565 scope.go:117] "RemoveContainer" containerID="237c2d0260d9f20a6c4eed4c44ea5bdfe56076069e1f08cff51e80b9e75c0f40"
Apr 23 13:41:44.000047 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:41:44.000027 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"237c2d0260d9f20a6c4eed4c44ea5bdfe56076069e1f08cff51e80b9e75c0f40\": container with ID starting with 237c2d0260d9f20a6c4eed4c44ea5bdfe56076069e1f08cff51e80b9e75c0f40 not found: ID does not exist" containerID="237c2d0260d9f20a6c4eed4c44ea5bdfe56076069e1f08cff51e80b9e75c0f40"
Apr 23 13:41:44.000090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:44.000052 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237c2d0260d9f20a6c4eed4c44ea5bdfe56076069e1f08cff51e80b9e75c0f40"} err="failed to get container status \"237c2d0260d9f20a6c4eed4c44ea5bdfe56076069e1f08cff51e80b9e75c0f40\": rpc error: code = NotFound desc = could not find container \"237c2d0260d9f20a6c4eed4c44ea5bdfe56076069e1f08cff51e80b9e75c0f40\": container with ID starting with 237c2d0260d9f20a6c4eed4c44ea5bdfe56076069e1f08cff51e80b9e75c0f40 not found: ID does not exist"
Apr 23 13:41:44.000090 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:44.000066 2565 scope.go:117] "RemoveContainer" containerID="a2927404077abb36548faad500820cc4e2a424ab1c8f3324083a6e00ea45184a"
Apr 23 13:41:44.000283 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:41:44.000265 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2927404077abb36548faad500820cc4e2a424ab1c8f3324083a6e00ea45184a\": container with ID starting with a2927404077abb36548faad500820cc4e2a424ab1c8f3324083a6e00ea45184a not found: ID does not exist" containerID="a2927404077abb36548faad500820cc4e2a424ab1c8f3324083a6e00ea45184a"
Apr 23 13:41:44.000348 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:44.000291 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2927404077abb36548faad500820cc4e2a424ab1c8f3324083a6e00ea45184a"} err="failed to get container status \"a2927404077abb36548faad500820cc4e2a424ab1c8f3324083a6e00ea45184a\": rpc error: code = NotFound desc = could not find container \"a2927404077abb36548faad500820cc4e2a424ab1c8f3324083a6e00ea45184a\": container with ID starting with a2927404077abb36548faad500820cc4e2a424ab1c8f3324083a6e00ea45184a not found: ID does not exist"
Apr 23 13:41:44.000348 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:44.000313 2565 scope.go:117] "RemoveContainer" containerID="508ba92a5025df1d56042a78d8b89d5b6e8de02d78b6aef629808d822712e1ee"
Apr 23 13:41:44.000519 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:41:44.000504 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"508ba92a5025df1d56042a78d8b89d5b6e8de02d78b6aef629808d822712e1ee\": container with ID starting with 508ba92a5025df1d56042a78d8b89d5b6e8de02d78b6aef629808d822712e1ee not found: ID does not exist" containerID="508ba92a5025df1d56042a78d8b89d5b6e8de02d78b6aef629808d822712e1ee"
Apr 23 13:41:44.000553 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:44.000522 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"508ba92a5025df1d56042a78d8b89d5b6e8de02d78b6aef629808d822712e1ee"} err="failed to get container status \"508ba92a5025df1d56042a78d8b89d5b6e8de02d78b6aef629808d822712e1ee\": rpc error: code = NotFound desc = could not find container \"508ba92a5025df1d56042a78d8b89d5b6e8de02d78b6aef629808d822712e1ee\": container with ID starting with 508ba92a5025df1d56042a78d8b89d5b6e8de02d78b6aef629808d822712e1ee not found: ID does not exist"
Apr 23 13:41:45.840921 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:45.840879 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" path="/var/lib/kubelet/pods/c1a47db5-2629-4686-aae3-519eb6d306a6/volumes"
Apr 23 13:41:45.900722 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:45.900689 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:5000: connect: connection refused"
Apr 23 13:41:45.901229 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:45.901204 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:41:55.901249 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:55.901210 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:5000: connect: connection refused"
Apr 23 13:41:55.903659 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:41:55.901579 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:42:05.900909 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:05.900868 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:5000: connect: connection refused"
Apr 23 13:42:05.901319 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:05.901297 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:42:15.900774 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:15.900716 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:5000: connect: connection refused"
Apr 23 13:42:15.901179 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:15.901086 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:42:25.900962 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:25.900929 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr"
Apr 23 13:42:25.901456 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:25.901349 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr"
Apr 23 13:42:38.587439 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:38.587407 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr"]
Apr 23 13:42:38.588000 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:38.587910 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" containerID="cri-o://df985b45dc7ca27497c6f103d4c31117b3bc6a5c82d9e756aa6e4a547ee6b3d3" gracePeriod=30
Apr 23 13:42:38.588000 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:38.587933 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kube-rbac-proxy" containerID="cri-o://14cc8412cccaf6d4a039598c1700f35be6cd943affda0110b09e53a013fd97ed" gracePeriod=30
Apr 23 13:42:38.588125 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:38.587933 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" containerID="cri-o://c5791553479e87b709b351ae6299f07138c864cbc0673e9172b24e3cf74183ea" gracePeriod=30
Apr 23 13:42:39.148978 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:39.148943 2565 generic.go:358] "Generic (PLEG): container finished" podID="36a95235-2116-42a2-ab57-2b8119e445a4" containerID="14cc8412cccaf6d4a039598c1700f35be6cd943affda0110b09e53a013fd97ed" exitCode=2
Apr 23 13:42:39.149144 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:39.149019 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" event={"ID":"36a95235-2116-42a2-ab57-2b8119e445a4","Type":"ContainerDied","Data":"14cc8412cccaf6d4a039598c1700f35be6cd943affda0110b09e53a013fd97ed"}
Apr 23 13:42:40.896799 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:40.896741 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused"
Apr 23 13:42:43.163656 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:43.163620 2565 generic.go:358] "Generic (PLEG): container finished" podID="36a95235-2116-42a2-ab57-2b8119e445a4" containerID="df985b45dc7ca27497c6f103d4c31117b3bc6a5c82d9e756aa6e4a547ee6b3d3" exitCode=0
Apr 23 13:42:43.164060 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:43.163693 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" event={"ID":"36a95235-2116-42a2-ab57-2b8119e445a4","Type":"ContainerDied","Data":"df985b45dc7ca27497c6f103d4c31117b3bc6a5c82d9e756aa6e4a547ee6b3d3"}
Apr 23 13:42:45.896038 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:45.896001 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused"
Apr 23 13:42:45.901413 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:45.901374 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:5000: connect: connection refused"
Apr 23 13:42:45.901706 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:45.901681 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:42:50.896961 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:50.896915 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused"
Apr 23 13:42:50.897332 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:50.897082 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr"
Apr 23 13:42:55.896379 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:55.896333 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused"
Apr 23 13:42:55.900654 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:55.900626 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:5000: connect: connection refused"
Apr 23 13:42:55.902276 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:55.902256 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:42:58.596106 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.596070 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"]
Apr 23 13:42:58.596576 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.596556 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="storage-initializer"
Apr 23 13:42:58.596654 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.596579 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="storage-initializer"
Apr 23 13:42:58.596654 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.596596 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container"
Apr 23 13:42:58.596654 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.596605 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container"
Apr 23 13:42:58.596654 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.596625 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent"
Apr 23 13:42:58.596654 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.596634 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent"
Apr 23 13:42:58.596939 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.596662 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kube-rbac-proxy"
Apr 23 13:42:58.596939 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.596672 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kube-rbac-proxy"
Apr 23 13:42:58.596939 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.596794 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="agent"
Apr 23 13:42:58.596939 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.596813 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kube-rbac-proxy"
Apr 23 13:42:58.596939 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.596823 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1a47db5-2629-4686-aae3-519eb6d306a6" containerName="kserve-container"
Apr 23 13:42:58.600116 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.600091 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:58.602673 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.602644 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\""
Apr 23 13:42:58.602806 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.602681 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\""
Apr 23 13:42:58.614126 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.614104 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"]
Apr 23 13:42:58.763438 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.763393 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59ab4d00-1c97-4c61-abab-39e184c7be82-kserve-provision-location\") pod \"isvc-logger-predictor-7ffcf8d567-9khc6\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:58.763438 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.763444 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59ab4d00-1c97-4c61-abab-39e184c7be82-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-7ffcf8d567-9khc6\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:58.763656 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.763528 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59ab4d00-1c97-4c61-abab-39e184c7be82-proxy-tls\") pod \"isvc-logger-predictor-7ffcf8d567-9khc6\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:58.763656 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.763571 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwkdw\" (UniqueName: \"kubernetes.io/projected/59ab4d00-1c97-4c61-abab-39e184c7be82-kube-api-access-wwkdw\") pod \"isvc-logger-predictor-7ffcf8d567-9khc6\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:58.864364 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.864286 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59ab4d00-1c97-4c61-abab-39e184c7be82-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-7ffcf8d567-9khc6\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:58.864364 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.864353 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59ab4d00-1c97-4c61-abab-39e184c7be82-proxy-tls\") pod \"isvc-logger-predictor-7ffcf8d567-9khc6\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:58.864539 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.864387 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwkdw\" (UniqueName: \"kubernetes.io/projected/59ab4d00-1c97-4c61-abab-39e184c7be82-kube-api-access-wwkdw\") pod \"isvc-logger-predictor-7ffcf8d567-9khc6\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:58.864539 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.864420 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59ab4d00-1c97-4c61-abab-39e184c7be82-kserve-provision-location\") pod \"isvc-logger-predictor-7ffcf8d567-9khc6\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:58.864539 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:42:58.864523 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-logger-predictor-serving-cert: secret "isvc-logger-predictor-serving-cert" not found
Apr 23 13:42:58.864654 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:42:58.864598 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59ab4d00-1c97-4c61-abab-39e184c7be82-proxy-tls podName:59ab4d00-1c97-4c61-abab-39e184c7be82 nodeName:}" failed. No retries permitted until 2026-04-23 13:42:59.364575511 +0000 UTC m=+690.103119468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/59ab4d00-1c97-4c61-abab-39e184c7be82-proxy-tls") pod "isvc-logger-predictor-7ffcf8d567-9khc6" (UID: "59ab4d00-1c97-4c61-abab-39e184c7be82") : secret "isvc-logger-predictor-serving-cert" not found
Apr 23 13:42:58.864824 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.864805 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59ab4d00-1c97-4c61-abab-39e184c7be82-kserve-provision-location\") pod \"isvc-logger-predictor-7ffcf8d567-9khc6\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:58.864970 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.864953 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59ab4d00-1c97-4c61-abab-39e184c7be82-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-7ffcf8d567-9khc6\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:58.873250 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:58.873231 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwkdw\" (UniqueName: \"kubernetes.io/projected/59ab4d00-1c97-4c61-abab-39e184c7be82-kube-api-access-wwkdw\") pod \"isvc-logger-predictor-7ffcf8d567-9khc6\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:59.366998 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:59.366967 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59ab4d00-1c97-4c61-abab-39e184c7be82-proxy-tls\") pod \"isvc-logger-predictor-7ffcf8d567-9khc6\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:59.369341 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:59.369318 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59ab4d00-1c97-4c61-abab-39e184c7be82-proxy-tls\") pod \"isvc-logger-predictor-7ffcf8d567-9khc6\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:59.510825 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:59.510793 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:42:59.632937 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:59.632914 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"]
Apr 23 13:42:59.635348 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:42:59.635320 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59ab4d00_1c97_4c61_abab_39e184c7be82.slice/crio-11e18bdd84b24dfb8ee3bd3976baf7439fe523c002087aad221d0a9b4e7e62a5 WatchSource:0}: Error finding container 11e18bdd84b24dfb8ee3bd3976baf7439fe523c002087aad221d0a9b4e7e62a5: Status 404 returned error can't find the container with id 11e18bdd84b24dfb8ee3bd3976baf7439fe523c002087aad221d0a9b4e7e62a5
Apr 23 13:42:59.637685 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:42:59.637663 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:43:00.219969 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:00.219934 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" event={"ID":"59ab4d00-1c97-4c61-abab-39e184c7be82","Type":"ContainerStarted","Data":"90eac43d171b42de9c771f6f51eda87d1beb194dd1f42a42f3951ba9d29c9c06"}
Apr 23 13:43:00.219969 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:00.219970 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" event={"ID":"59ab4d00-1c97-4c61-abab-39e184c7be82","Type":"ContainerStarted","Data":"11e18bdd84b24dfb8ee3bd3976baf7439fe523c002087aad221d0a9b4e7e62a5"}
Apr 23 13:43:00.896077 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:00.896038 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused"
Apr 23 13:43:03.230902 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:03.230869 2565 generic.go:358] "Generic (PLEG): container finished" podID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerID="90eac43d171b42de9c771f6f51eda87d1beb194dd1f42a42f3951ba9d29c9c06" exitCode=0
Apr 23 13:43:03.231264 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:03.230944 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" event={"ID":"59ab4d00-1c97-4c61-abab-39e184c7be82","Type":"ContainerDied","Data":"90eac43d171b42de9c771f6f51eda87d1beb194dd1f42a42f3951ba9d29c9c06"}
Apr 23 13:43:04.237159 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:04.237123 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" event={"ID":"59ab4d00-1c97-4c61-abab-39e184c7be82","Type":"ContainerStarted","Data":"61d93edaaf4bc19d0e0a59f1c412066cd90a06f607c19fc94915039911dea4c3"}
Apr 23 13:43:04.237652 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:04.237168 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" event={"ID":"59ab4d00-1c97-4c61-abab-39e184c7be82","Type":"ContainerStarted","Data":"64c458b2999824e2c40964726f455c214a5af1725161e41b8fbcc0d1cf67cc83"}
Apr 23 13:43:04.237652 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:04.237184 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" event={"ID":"59ab4d00-1c97-4c61-abab-39e184c7be82","Type":"ContainerStarted","Data":"3dc1b2a6068c203ef3635cbe08c96f1617463cb543af81c0d1a694f6dd25f3a7"}
Apr 23 13:43:04.237652 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:04.237488 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:43:04.237652 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:04.237513 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:43:04.237652 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:04.237522 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"
Apr 23 13:43:04.239213 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:04.239179 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 23 13:43:04.240024 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:04.240002 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:43:04.259499 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:04.259461 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podStartSLOduration=6.259449177 podStartE2EDuration="6.259449177s" podCreationTimestamp="2026-04-23 13:42:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:43:04.25826786 +0000 UTC m=+694.996811828" watchObservedRunningTime="2026-04-23 13:43:04.259449177 +0000 UTC m=+694.997993142" Apr 23 13:43:05.240837 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:05.240785 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 23 13:43:05.241279 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:05.241146 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:43:05.895870 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:05.895834 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 23 13:43:05.901203 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:05.901176 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" 
podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:5000: connect: connection refused" Apr 23 13:43:05.901331 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:05.901316 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:43:05.901701 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:05.901676 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:43:05.901806 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:05.901783 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:43:08.736781 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:08.736739 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:43:08.850870 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:08.850790 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/36a95235-2116-42a2-ab57-2b8119e445a4-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"36a95235-2116-42a2-ab57-2b8119e445a4\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " Apr 23 13:43:08.850870 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:08.850851 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns4bf\" (UniqueName: \"kubernetes.io/projected/36a95235-2116-42a2-ab57-2b8119e445a4-kube-api-access-ns4bf\") pod \"36a95235-2116-42a2-ab57-2b8119e445a4\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " Apr 23 13:43:08.851051 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:08.850927 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36a95235-2116-42a2-ab57-2b8119e445a4-proxy-tls\") pod \"36a95235-2116-42a2-ab57-2b8119e445a4\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " Apr 23 13:43:08.851051 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:08.850983 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36a95235-2116-42a2-ab57-2b8119e445a4-kserve-provision-location\") pod \"36a95235-2116-42a2-ab57-2b8119e445a4\" (UID: \"36a95235-2116-42a2-ab57-2b8119e445a4\") " Apr 23 13:43:08.851250 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:08.851193 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36a95235-2116-42a2-ab57-2b8119e445a4-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "36a95235-2116-42a2-ab57-2b8119e445a4" (UID: "36a95235-2116-42a2-ab57-2b8119e445a4"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:43:08.851367 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:08.851332 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36a95235-2116-42a2-ab57-2b8119e445a4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "36a95235-2116-42a2-ab57-2b8119e445a4" (UID: "36a95235-2116-42a2-ab57-2b8119e445a4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:43:08.852926 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:08.852897 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36a95235-2116-42a2-ab57-2b8119e445a4-kube-api-access-ns4bf" (OuterVolumeSpecName: "kube-api-access-ns4bf") pod "36a95235-2116-42a2-ab57-2b8119e445a4" (UID: "36a95235-2116-42a2-ab57-2b8119e445a4"). InnerVolumeSpecName "kube-api-access-ns4bf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:43:08.852926 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:08.852910 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36a95235-2116-42a2-ab57-2b8119e445a4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "36a95235-2116-42a2-ab57-2b8119e445a4" (UID: "36a95235-2116-42a2-ab57-2b8119e445a4"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:43:08.951821 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:08.951790 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36a95235-2116-42a2-ab57-2b8119e445a4-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:43:08.951821 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:08.951816 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/36a95235-2116-42a2-ab57-2b8119e445a4-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:43:08.951821 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:08.951828 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ns4bf\" (UniqueName: \"kubernetes.io/projected/36a95235-2116-42a2-ab57-2b8119e445a4-kube-api-access-ns4bf\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:43:08.952069 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:08.951837 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36a95235-2116-42a2-ab57-2b8119e445a4-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:43:09.254825 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.254793 2565 generic.go:358] "Generic (PLEG): container finished" podID="36a95235-2116-42a2-ab57-2b8119e445a4" containerID="c5791553479e87b709b351ae6299f07138c864cbc0673e9172b24e3cf74183ea" exitCode=0 Apr 23 13:43:09.255022 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.254893 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" 
event={"ID":"36a95235-2116-42a2-ab57-2b8119e445a4","Type":"ContainerDied","Data":"c5791553479e87b709b351ae6299f07138c864cbc0673e9172b24e3cf74183ea"} Apr 23 13:43:09.255022 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.254928 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" event={"ID":"36a95235-2116-42a2-ab57-2b8119e445a4","Type":"ContainerDied","Data":"1c345acca45780d907ea96ff02ef363b9699499ae78674273ee90e0284529da7"} Apr 23 13:43:09.255022 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.254949 2565 scope.go:117] "RemoveContainer" containerID="c5791553479e87b709b351ae6299f07138c864cbc0673e9172b24e3cf74183ea" Apr 23 13:43:09.255022 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.254970 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr" Apr 23 13:43:09.263229 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.263211 2565 scope.go:117] "RemoveContainer" containerID="14cc8412cccaf6d4a039598c1700f35be6cd943affda0110b09e53a013fd97ed" Apr 23 13:43:09.270132 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.270116 2565 scope.go:117] "RemoveContainer" containerID="df985b45dc7ca27497c6f103d4c31117b3bc6a5c82d9e756aa6e4a547ee6b3d3" Apr 23 13:43:09.276877 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.276837 2565 scope.go:117] "RemoveContainer" containerID="c27ecc437bde6a1f231be33f3bb64e61eb174451cc4449ca26a4cfe6d070737b" Apr 23 13:43:09.278412 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.278392 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr"] Apr 23 13:43:09.282155 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.282134 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-667c84d549-5nhcr"] Apr 23 
13:43:09.284171 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.284155 2565 scope.go:117] "RemoveContainer" containerID="c5791553479e87b709b351ae6299f07138c864cbc0673e9172b24e3cf74183ea" Apr 23 13:43:09.284393 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:43:09.284375 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5791553479e87b709b351ae6299f07138c864cbc0673e9172b24e3cf74183ea\": container with ID starting with c5791553479e87b709b351ae6299f07138c864cbc0673e9172b24e3cf74183ea not found: ID does not exist" containerID="c5791553479e87b709b351ae6299f07138c864cbc0673e9172b24e3cf74183ea" Apr 23 13:43:09.284436 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.284401 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5791553479e87b709b351ae6299f07138c864cbc0673e9172b24e3cf74183ea"} err="failed to get container status \"c5791553479e87b709b351ae6299f07138c864cbc0673e9172b24e3cf74183ea\": rpc error: code = NotFound desc = could not find container \"c5791553479e87b709b351ae6299f07138c864cbc0673e9172b24e3cf74183ea\": container with ID starting with c5791553479e87b709b351ae6299f07138c864cbc0673e9172b24e3cf74183ea not found: ID does not exist" Apr 23 13:43:09.284436 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.284420 2565 scope.go:117] "RemoveContainer" containerID="14cc8412cccaf6d4a039598c1700f35be6cd943affda0110b09e53a013fd97ed" Apr 23 13:43:09.284655 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:43:09.284637 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14cc8412cccaf6d4a039598c1700f35be6cd943affda0110b09e53a013fd97ed\": container with ID starting with 14cc8412cccaf6d4a039598c1700f35be6cd943affda0110b09e53a013fd97ed not found: ID does not exist" containerID="14cc8412cccaf6d4a039598c1700f35be6cd943affda0110b09e53a013fd97ed" Apr 23 13:43:09.284696 
ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.284662 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14cc8412cccaf6d4a039598c1700f35be6cd943affda0110b09e53a013fd97ed"} err="failed to get container status \"14cc8412cccaf6d4a039598c1700f35be6cd943affda0110b09e53a013fd97ed\": rpc error: code = NotFound desc = could not find container \"14cc8412cccaf6d4a039598c1700f35be6cd943affda0110b09e53a013fd97ed\": container with ID starting with 14cc8412cccaf6d4a039598c1700f35be6cd943affda0110b09e53a013fd97ed not found: ID does not exist" Apr 23 13:43:09.284696 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.284677 2565 scope.go:117] "RemoveContainer" containerID="df985b45dc7ca27497c6f103d4c31117b3bc6a5c82d9e756aa6e4a547ee6b3d3" Apr 23 13:43:09.284917 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:43:09.284902 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df985b45dc7ca27497c6f103d4c31117b3bc6a5c82d9e756aa6e4a547ee6b3d3\": container with ID starting with df985b45dc7ca27497c6f103d4c31117b3bc6a5c82d9e756aa6e4a547ee6b3d3 not found: ID does not exist" containerID="df985b45dc7ca27497c6f103d4c31117b3bc6a5c82d9e756aa6e4a547ee6b3d3" Apr 23 13:43:09.284967 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.284919 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df985b45dc7ca27497c6f103d4c31117b3bc6a5c82d9e756aa6e4a547ee6b3d3"} err="failed to get container status \"df985b45dc7ca27497c6f103d4c31117b3bc6a5c82d9e756aa6e4a547ee6b3d3\": rpc error: code = NotFound desc = could not find container \"df985b45dc7ca27497c6f103d4c31117b3bc6a5c82d9e756aa6e4a547ee6b3d3\": container with ID starting with df985b45dc7ca27497c6f103d4c31117b3bc6a5c82d9e756aa6e4a547ee6b3d3 not found: ID does not exist" Apr 23 13:43:09.284967 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.284932 2565 scope.go:117] 
"RemoveContainer" containerID="c27ecc437bde6a1f231be33f3bb64e61eb174451cc4449ca26a4cfe6d070737b" Apr 23 13:43:09.285147 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:43:09.285132 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c27ecc437bde6a1f231be33f3bb64e61eb174451cc4449ca26a4cfe6d070737b\": container with ID starting with c27ecc437bde6a1f231be33f3bb64e61eb174451cc4449ca26a4cfe6d070737b not found: ID does not exist" containerID="c27ecc437bde6a1f231be33f3bb64e61eb174451cc4449ca26a4cfe6d070737b" Apr 23 13:43:09.285191 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.285152 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c27ecc437bde6a1f231be33f3bb64e61eb174451cc4449ca26a4cfe6d070737b"} err="failed to get container status \"c27ecc437bde6a1f231be33f3bb64e61eb174451cc4449ca26a4cfe6d070737b\": rpc error: code = NotFound desc = could not find container \"c27ecc437bde6a1f231be33f3bb64e61eb174451cc4449ca26a4cfe6d070737b\": container with ID starting with c27ecc437bde6a1f231be33f3bb64e61eb174451cc4449ca26a4cfe6d070737b not found: ID does not exist" Apr 23 13:43:09.840740 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:09.840703 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" path="/var/lib/kubelet/pods/36a95235-2116-42a2-ab57-2b8119e445a4/volumes" Apr 23 13:43:10.245282 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:10.245254 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" Apr 23 13:43:10.245869 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:10.245836 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.132.0.31:8080: connect: connection refused" Apr 23 13:43:10.246167 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:10.246146 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:43:20.246819 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:20.246710 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 23 13:43:20.247273 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:20.247164 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:43:30.246557 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:30.246518 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 23 13:43:30.247098 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:30.247073 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:43:40.246207 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:40.246163 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 23 13:43:40.246695 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:40.246633 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:43:50.246817 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:50.246768 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 23 13:43:50.247277 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:43:50.247253 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:00.246098 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:00.246056 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 23 13:44:00.246561 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:00.246544 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 
13:44:10.246539 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:10.246504 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" Apr 23 13:44:10.246937 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:10.246717 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" Apr 23 13:44:23.775230 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:23.775188 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"] Apr 23 13:44:23.775780 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:23.775696 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" containerID="cri-o://61d93edaaf4bc19d0e0a59f1c412066cd90a06f607c19fc94915039911dea4c3" gracePeriod=30 Apr 23 13:44:23.775913 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:23.775705 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" containerID="cri-o://3dc1b2a6068c203ef3635cbe08c96f1617463cb543af81c0d1a694f6dd25f3a7" gracePeriod=30 Apr 23 13:44:23.775913 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:23.775881 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kube-rbac-proxy" containerID="cri-o://64c458b2999824e2c40964726f455c214a5af1725161e41b8fbcc0d1cf67cc83" gracePeriod=30 Apr 23 13:44:24.520314 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:24.520282 2565 generic.go:358] "Generic (PLEG): container finished" 
podID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerID="64c458b2999824e2c40964726f455c214a5af1725161e41b8fbcc0d1cf67cc83" exitCode=2 Apr 23 13:44:24.520480 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:24.520318 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" event={"ID":"59ab4d00-1c97-4c61-abab-39e184c7be82","Type":"ContainerDied","Data":"64c458b2999824e2c40964726f455c214a5af1725161e41b8fbcc0d1cf67cc83"} Apr 23 13:44:25.241915 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:25.241868 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.31:8643/healthz\": dial tcp 10.132.0.31:8643: connect: connection refused" Apr 23 13:44:28.535052 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:28.535021 2565 generic.go:358] "Generic (PLEG): container finished" podID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerID="3dc1b2a6068c203ef3635cbe08c96f1617463cb543af81c0d1a694f6dd25f3a7" exitCode=0 Apr 23 13:44:28.535410 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:28.535097 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" event={"ID":"59ab4d00-1c97-4c61-abab-39e184c7be82","Type":"ContainerDied","Data":"3dc1b2a6068c203ef3635cbe08c96f1617463cb543af81c0d1a694f6dd25f3a7"} Apr 23 13:44:30.241867 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:30.241826 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.31:8643/healthz\": dial tcp 10.132.0.31:8643: connect: connection refused" Apr 23 13:44:30.246102 ip-10-0-137-187 kubenswrapper[2565]: I0423 
13:44:30.246069 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 23 13:44:30.246390 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:30.246367 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:35.241425 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:35.241385 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.31:8643/healthz\": dial tcp 10.132.0.31:8643: connect: connection refused" Apr 23 13:44:35.241827 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:35.241521 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" Apr 23 13:44:40.241066 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:40.241024 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.31:8643/healthz\": dial tcp 10.132.0.31:8643: connect: connection refused" Apr 23 13:44:40.246411 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:40.246378 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 23 13:44:40.246725 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:40.246700 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:45.241592 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:45.241553 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.31:8643/healthz\": dial tcp 10.132.0.31:8643: connect: connection refused" Apr 23 13:44:50.241387 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:50.241297 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.31:8643/healthz\": dial tcp 10.132.0.31:8643: connect: connection refused" Apr 23 13:44:50.246665 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:50.246639 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 23 13:44:50.246810 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:50.246797 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" Apr 23 13:44:50.246999 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:50.246983 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:50.247071 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:50.247060 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" Apr 23 13:44:53.966248 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:53.966226 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" Apr 23 13:44:54.111825 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.111716 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59ab4d00-1c97-4c61-abab-39e184c7be82-proxy-tls\") pod \"59ab4d00-1c97-4c61-abab-39e184c7be82\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " Apr 23 13:44:54.111976 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.111840 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59ab4d00-1c97-4c61-abab-39e184c7be82-isvc-logger-kube-rbac-proxy-sar-config\") pod \"59ab4d00-1c97-4c61-abab-39e184c7be82\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " Apr 23 13:44:54.111976 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.111870 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59ab4d00-1c97-4c61-abab-39e184c7be82-kserve-provision-location\") pod \"59ab4d00-1c97-4c61-abab-39e184c7be82\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " Apr 23 13:44:54.111976 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.111894 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wwkdw\" (UniqueName: \"kubernetes.io/projected/59ab4d00-1c97-4c61-abab-39e184c7be82-kube-api-access-wwkdw\") pod \"59ab4d00-1c97-4c61-abab-39e184c7be82\" (UID: \"59ab4d00-1c97-4c61-abab-39e184c7be82\") " Apr 23 13:44:54.112243 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.112209 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59ab4d00-1c97-4c61-abab-39e184c7be82-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "59ab4d00-1c97-4c61-abab-39e184c7be82" (UID: "59ab4d00-1c97-4c61-abab-39e184c7be82"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:44:54.112243 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.112228 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59ab4d00-1c97-4c61-abab-39e184c7be82-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod "59ab4d00-1c97-4c61-abab-39e184c7be82" (UID: "59ab4d00-1c97-4c61-abab-39e184c7be82"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:44:54.113799 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.113754 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ab4d00-1c97-4c61-abab-39e184c7be82-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "59ab4d00-1c97-4c61-abab-39e184c7be82" (UID: "59ab4d00-1c97-4c61-abab-39e184c7be82"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:44:54.113876 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.113812 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ab4d00-1c97-4c61-abab-39e184c7be82-kube-api-access-wwkdw" (OuterVolumeSpecName: "kube-api-access-wwkdw") pod "59ab4d00-1c97-4c61-abab-39e184c7be82" (UID: "59ab4d00-1c97-4c61-abab-39e184c7be82"). InnerVolumeSpecName "kube-api-access-wwkdw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:44:54.212630 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.212606 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59ab4d00-1c97-4c61-abab-39e184c7be82-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:44:54.212630 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.212625 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59ab4d00-1c97-4c61-abab-39e184c7be82-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:44:54.212806 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.212635 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59ab4d00-1c97-4c61-abab-39e184c7be82-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:44:54.212806 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.212646 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wwkdw\" (UniqueName: \"kubernetes.io/projected/59ab4d00-1c97-4c61-abab-39e184c7be82-kube-api-access-wwkdw\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:44:54.620204 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.620166 2565 generic.go:358] "Generic (PLEG): container finished" 
podID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerID="61d93edaaf4bc19d0e0a59f1c412066cd90a06f607c19fc94915039911dea4c3" exitCode=137 Apr 23 13:44:54.620375 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.620252 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" Apr 23 13:44:54.620375 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.620256 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" event={"ID":"59ab4d00-1c97-4c61-abab-39e184c7be82","Type":"ContainerDied","Data":"61d93edaaf4bc19d0e0a59f1c412066cd90a06f607c19fc94915039911dea4c3"} Apr 23 13:44:54.620375 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.620298 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6" event={"ID":"59ab4d00-1c97-4c61-abab-39e184c7be82","Type":"ContainerDied","Data":"11e18bdd84b24dfb8ee3bd3976baf7439fe523c002087aad221d0a9b4e7e62a5"} Apr 23 13:44:54.620375 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.620316 2565 scope.go:117] "RemoveContainer" containerID="61d93edaaf4bc19d0e0a59f1c412066cd90a06f607c19fc94915039911dea4c3" Apr 23 13:44:54.628585 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.628569 2565 scope.go:117] "RemoveContainer" containerID="64c458b2999824e2c40964726f455c214a5af1725161e41b8fbcc0d1cf67cc83" Apr 23 13:44:54.635518 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.635499 2565 scope.go:117] "RemoveContainer" containerID="3dc1b2a6068c203ef3635cbe08c96f1617463cb543af81c0d1a694f6dd25f3a7" Apr 23 13:44:54.642970 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.642949 2565 scope.go:117] "RemoveContainer" containerID="90eac43d171b42de9c771f6f51eda87d1beb194dd1f42a42f3951ba9d29c9c06" Apr 23 13:44:54.643240 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.643223 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"] Apr 23 13:44:54.648051 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.648028 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7ffcf8d567-9khc6"] Apr 23 13:44:54.652253 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.652234 2565 scope.go:117] "RemoveContainer" containerID="61d93edaaf4bc19d0e0a59f1c412066cd90a06f607c19fc94915039911dea4c3" Apr 23 13:44:54.652510 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:44:54.652491 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61d93edaaf4bc19d0e0a59f1c412066cd90a06f607c19fc94915039911dea4c3\": container with ID starting with 61d93edaaf4bc19d0e0a59f1c412066cd90a06f607c19fc94915039911dea4c3 not found: ID does not exist" containerID="61d93edaaf4bc19d0e0a59f1c412066cd90a06f607c19fc94915039911dea4c3" Apr 23 13:44:54.652564 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.652517 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d93edaaf4bc19d0e0a59f1c412066cd90a06f607c19fc94915039911dea4c3"} err="failed to get container status \"61d93edaaf4bc19d0e0a59f1c412066cd90a06f607c19fc94915039911dea4c3\": rpc error: code = NotFound desc = could not find container \"61d93edaaf4bc19d0e0a59f1c412066cd90a06f607c19fc94915039911dea4c3\": container with ID starting with 61d93edaaf4bc19d0e0a59f1c412066cd90a06f607c19fc94915039911dea4c3 not found: ID does not exist" Apr 23 13:44:54.652564 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.652532 2565 scope.go:117] "RemoveContainer" containerID="64c458b2999824e2c40964726f455c214a5af1725161e41b8fbcc0d1cf67cc83" Apr 23 13:44:54.652805 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:44:54.652778 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"64c458b2999824e2c40964726f455c214a5af1725161e41b8fbcc0d1cf67cc83\": container with ID starting with 64c458b2999824e2c40964726f455c214a5af1725161e41b8fbcc0d1cf67cc83 not found: ID does not exist" containerID="64c458b2999824e2c40964726f455c214a5af1725161e41b8fbcc0d1cf67cc83" Apr 23 13:44:54.652868 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.652813 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c458b2999824e2c40964726f455c214a5af1725161e41b8fbcc0d1cf67cc83"} err="failed to get container status \"64c458b2999824e2c40964726f455c214a5af1725161e41b8fbcc0d1cf67cc83\": rpc error: code = NotFound desc = could not find container \"64c458b2999824e2c40964726f455c214a5af1725161e41b8fbcc0d1cf67cc83\": container with ID starting with 64c458b2999824e2c40964726f455c214a5af1725161e41b8fbcc0d1cf67cc83 not found: ID does not exist" Apr 23 13:44:54.652868 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.652830 2565 scope.go:117] "RemoveContainer" containerID="3dc1b2a6068c203ef3635cbe08c96f1617463cb543af81c0d1a694f6dd25f3a7" Apr 23 13:44:54.653086 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:44:54.653067 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dc1b2a6068c203ef3635cbe08c96f1617463cb543af81c0d1a694f6dd25f3a7\": container with ID starting with 3dc1b2a6068c203ef3635cbe08c96f1617463cb543af81c0d1a694f6dd25f3a7 not found: ID does not exist" containerID="3dc1b2a6068c203ef3635cbe08c96f1617463cb543af81c0d1a694f6dd25f3a7" Apr 23 13:44:54.653146 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.653092 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dc1b2a6068c203ef3635cbe08c96f1617463cb543af81c0d1a694f6dd25f3a7"} err="failed to get container status \"3dc1b2a6068c203ef3635cbe08c96f1617463cb543af81c0d1a694f6dd25f3a7\": rpc error: code = NotFound desc = could not find container 
\"3dc1b2a6068c203ef3635cbe08c96f1617463cb543af81c0d1a694f6dd25f3a7\": container with ID starting with 3dc1b2a6068c203ef3635cbe08c96f1617463cb543af81c0d1a694f6dd25f3a7 not found: ID does not exist" Apr 23 13:44:54.653146 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.653112 2565 scope.go:117] "RemoveContainer" containerID="90eac43d171b42de9c771f6f51eda87d1beb194dd1f42a42f3951ba9d29c9c06" Apr 23 13:44:54.653397 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:44:54.653377 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90eac43d171b42de9c771f6f51eda87d1beb194dd1f42a42f3951ba9d29c9c06\": container with ID starting with 90eac43d171b42de9c771f6f51eda87d1beb194dd1f42a42f3951ba9d29c9c06 not found: ID does not exist" containerID="90eac43d171b42de9c771f6f51eda87d1beb194dd1f42a42f3951ba9d29c9c06" Apr 23 13:44:54.653453 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:54.653405 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90eac43d171b42de9c771f6f51eda87d1beb194dd1f42a42f3951ba9d29c9c06"} err="failed to get container status \"90eac43d171b42de9c771f6f51eda87d1beb194dd1f42a42f3951ba9d29c9c06\": rpc error: code = NotFound desc = could not find container \"90eac43d171b42de9c771f6f51eda87d1beb194dd1f42a42f3951ba9d29c9c06\": container with ID starting with 90eac43d171b42de9c771f6f51eda87d1beb194dd1f42a42f3951ba9d29c9c06 not found: ID does not exist" Apr 23 13:44:55.840630 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:44:55.840586 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" path="/var/lib/kubelet/pods/59ab4d00-1c97-4c61-abab-39e184c7be82/volumes" Apr 23 13:46:29.775301 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:46:29.775224 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 13:46:29.777436 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:46:29.777408 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 13:51:29.796640 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:29.796610 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 13:51:29.798812 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:29.798789 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 13:51:55.281471 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281439 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds"] Apr 23 13:51:55.281951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281797 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kube-rbac-proxy" Apr 23 13:51:55.281951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281812 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kube-rbac-proxy" Apr 23 13:51:55.281951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281824 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" Apr 23 13:51:55.281951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281830 2565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" Apr 23 13:51:55.281951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281839 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="storage-initializer" Apr 23 13:51:55.281951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281847 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="storage-initializer" Apr 23 13:51:55.281951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281885 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" Apr 23 13:51:55.281951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281894 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" Apr 23 13:51:55.281951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281910 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" Apr 23 13:51:55.281951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281918 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" Apr 23 13:51:55.281951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281929 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="storage-initializer" Apr 23 13:51:55.281951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281937 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="storage-initializer" Apr 23 13:51:55.281951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281944 2565 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kube-rbac-proxy" Apr 23 13:51:55.281951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281953 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kube-rbac-proxy" Apr 23 13:51:55.282479 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281961 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" Apr 23 13:51:55.282479 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.281969 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" Apr 23 13:51:55.282479 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.282089 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="agent" Apr 23 13:51:55.282479 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.282106 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kserve-container" Apr 23 13:51:55.282479 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.282120 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kube-rbac-proxy" Apr 23 13:51:55.282479 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.282129 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="kserve-container" Apr 23 13:51:55.282479 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.282138 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="59ab4d00-1c97-4c61-abab-39e184c7be82" containerName="kube-rbac-proxy" Apr 23 13:51:55.282479 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.282149 2565 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="36a95235-2116-42a2-ab57-2b8119e445a4" containerName="agent" Apr 23 13:51:55.285380 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.285362 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:51:55.288067 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.288047 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\"" Apr 23 13:51:55.288176 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.288088 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:51:55.288176 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.288168 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\"" Apr 23 13:51:55.288290 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.288181 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 13:51:55.288412 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.288396 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:51:55.294940 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.294919 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds"] Apr 23 13:51:55.335863 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.335835 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a857907e-cba2-4ba8-8d3b-385106b8392b-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-mcp-predictor-b76fc9db7-66dds\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:51:55.335990 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.335872 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a857907e-cba2-4ba8-8d3b-385106b8392b-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-b76fc9db7-66dds\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:51:55.336035 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.335980 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmc7w\" (UniqueName: \"kubernetes.io/projected/a857907e-cba2-4ba8-8d3b-385106b8392b-kube-api-access-fmc7w\") pod \"isvc-sklearn-mcp-predictor-b76fc9db7-66dds\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:51:55.336035 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.336012 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a857907e-cba2-4ba8-8d3b-385106b8392b-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-b76fc9db7-66dds\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:51:55.436823 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.436793 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a857907e-cba2-4ba8-8d3b-385106b8392b-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-b76fc9db7-66dds\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:51:55.436990 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.436896 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmc7w\" (UniqueName: \"kubernetes.io/projected/a857907e-cba2-4ba8-8d3b-385106b8392b-kube-api-access-fmc7w\") pod \"isvc-sklearn-mcp-predictor-b76fc9db7-66dds\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:51:55.436990 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.436927 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a857907e-cba2-4ba8-8d3b-385106b8392b-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-b76fc9db7-66dds\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:51:55.436990 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.436963 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a857907e-cba2-4ba8-8d3b-385106b8392b-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-b76fc9db7-66dds\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:51:55.437243 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.437217 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a857907e-cba2-4ba8-8d3b-385106b8392b-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-b76fc9db7-66dds\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:51:55.437597 ip-10-0-137-187 kubenswrapper[2565]: I0423 
13:51:55.437575 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a857907e-cba2-4ba8-8d3b-385106b8392b-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-b76fc9db7-66dds\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:51:55.439477 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.439453 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a857907e-cba2-4ba8-8d3b-385106b8392b-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-b76fc9db7-66dds\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:51:55.444664 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.444634 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmc7w\" (UniqueName: \"kubernetes.io/projected/a857907e-cba2-4ba8-8d3b-385106b8392b-kube-api-access-fmc7w\") pod \"isvc-sklearn-mcp-predictor-b76fc9db7-66dds\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:51:55.595467 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.595379 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:51:55.718213 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.718189 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds"] Apr 23 13:51:55.720258 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:51:55.720230 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda857907e_cba2_4ba8_8d3b_385106b8392b.slice/crio-ae510c1c59969f9683f709a6ff63fc2d9492ba9c05f09628167eaaf6444f98fa WatchSource:0}: Error finding container ae510c1c59969f9683f709a6ff63fc2d9492ba9c05f09628167eaaf6444f98fa: Status 404 returned error can't find the container with id ae510c1c59969f9683f709a6ff63fc2d9492ba9c05f09628167eaaf6444f98fa Apr 23 13:51:55.722269 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.722250 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:51:55.961123 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.961084 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" event={"ID":"a857907e-cba2-4ba8-8d3b-385106b8392b","Type":"ContainerStarted","Data":"76ed91be483a2295cabda6175f75ed6511b0a6f4b4f2a32fe90a2c623087fd01"} Apr 23 13:51:55.961123 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:55.961125 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" event={"ID":"a857907e-cba2-4ba8-8d3b-385106b8392b","Type":"ContainerStarted","Data":"ae510c1c59969f9683f709a6ff63fc2d9492ba9c05f09628167eaaf6444f98fa"} Apr 23 13:51:59.975080 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:59.975048 2565 generic.go:358] "Generic (PLEG): container finished" podID="a857907e-cba2-4ba8-8d3b-385106b8392b" 
containerID="76ed91be483a2295cabda6175f75ed6511b0a6f4b4f2a32fe90a2c623087fd01" exitCode=0 Apr 23 13:51:59.975483 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:51:59.975105 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" event={"ID":"a857907e-cba2-4ba8-8d3b-385106b8392b","Type":"ContainerDied","Data":"76ed91be483a2295cabda6175f75ed6511b0a6f4b4f2a32fe90a2c623087fd01"} Apr 23 13:52:00.981673 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:52:00.981636 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" event={"ID":"a857907e-cba2-4ba8-8d3b-385106b8392b","Type":"ContainerStarted","Data":"08abdde7e643821b3b57e11c50831e71546ef597fe60ba27134ac368b580bcf6"} Apr 23 13:52:01.986946 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:52:01.986863 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" event={"ID":"a857907e-cba2-4ba8-8d3b-385106b8392b","Type":"ContainerStarted","Data":"cc0531f6340e14913f020bbdf0f92c8216c313e655d719b2458244e7cf5f1463"} Apr 23 13:52:01.986946 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:52:01.986896 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" event={"ID":"a857907e-cba2-4ba8-8d3b-385106b8392b","Type":"ContainerStarted","Data":"335dffe8df36d83c1b045f559da3b92c354346c972604fcbf919569f02e904f3"} Apr 23 13:52:01.987341 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:52:01.986995 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:52:01.987341 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:52:01.987121 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:52:01.987341 
ip-10-0-137-187 kubenswrapper[2565]: I0423 13:52:01.987142 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:52:02.010009 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:52:02.009965 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" podStartSLOduration=5.346658822 podStartE2EDuration="7.009953042s" podCreationTimestamp="2026-04-23 13:51:55 +0000 UTC" firstStartedPulling="2026-04-23 13:52:00.041599835 +0000 UTC m=+1230.780143779" lastFinishedPulling="2026-04-23 13:52:01.704894052 +0000 UTC m=+1232.443437999" observedRunningTime="2026-04-23 13:52:02.007481487 +0000 UTC m=+1232.746025476" watchObservedRunningTime="2026-04-23 13:52:02.009953042 +0000 UTC m=+1232.748497007" Apr 23 13:52:07.995685 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:52:07.995651 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:52:37.997732 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:52:37.997697 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:53:07.999196 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:07.999165 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:53:15.352065 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:15.352017 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds"] Apr 23 13:53:15.352668 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:15.352622 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" 
podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kserve-container" containerID="cri-o://08abdde7e643821b3b57e11c50831e71546ef597fe60ba27134ac368b580bcf6" gracePeriod=30 Apr 23 13:53:15.352795 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:15.352707 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kserve-agent" containerID="cri-o://335dffe8df36d83c1b045f559da3b92c354346c972604fcbf919569f02e904f3" gracePeriod=30 Apr 23 13:53:15.352951 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:15.352906 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kube-rbac-proxy" containerID="cri-o://cc0531f6340e14913f020bbdf0f92c8216c313e655d719b2458244e7cf5f1463" gracePeriod=30 Apr 23 13:53:16.229169 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:16.229132 2565 generic.go:358] "Generic (PLEG): container finished" podID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerID="cc0531f6340e14913f020bbdf0f92c8216c313e655d719b2458244e7cf5f1463" exitCode=2 Apr 23 13:53:16.229363 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:16.229189 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" event={"ID":"a857907e-cba2-4ba8-8d3b-385106b8392b","Type":"ContainerDied","Data":"cc0531f6340e14913f020bbdf0f92c8216c313e655d719b2458244e7cf5f1463"} Apr 23 13:53:17.991049 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:17.991009 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.32:8643/healthz\": dial tcp 
10.132.0.32:8643: connect: connection refused" Apr 23 13:53:17.996433 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:17.996410 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.32:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.32:8080: connect: connection refused" Apr 23 13:53:18.238454 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:18.238418 2565 generic.go:358] "Generic (PLEG): container finished" podID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerID="08abdde7e643821b3b57e11c50831e71546ef597fe60ba27134ac368b580bcf6" exitCode=0 Apr 23 13:53:18.238624 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:18.238487 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" event={"ID":"a857907e-cba2-4ba8-8d3b-385106b8392b","Type":"ContainerDied","Data":"08abdde7e643821b3b57e11c50831e71546ef597fe60ba27134ac368b580bcf6"} Apr 23 13:53:22.991333 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:22.991289 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.32:8643/healthz\": dial tcp 10.132.0.32:8643: connect: connection refused" Apr 23 13:53:27.991584 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:27.991544 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.32:8643/healthz\": dial tcp 10.132.0.32:8643: connect: connection refused" Apr 23 13:53:27.992027 ip-10-0-137-187 kubenswrapper[2565]: I0423 
13:53:27.991662 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:53:27.996250 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:27.996219 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.32:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.32:8080: connect: connection refused" Apr 23 13:53:32.991536 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:32.991494 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.32:8643/healthz\": dial tcp 10.132.0.32:8643: connect: connection refused" Apr 23 13:53:37.991825 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:37.991776 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.32:8643/healthz\": dial tcp 10.132.0.32:8643: connect: connection refused" Apr 23 13:53:37.996165 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:37.996138 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.32:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.32:8080: connect: connection refused" Apr 23 13:53:37.996259 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:37.996248 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:53:42.991721 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:42.991685 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.32:8643/healthz\": dial tcp 10.132.0.32:8643: connect: connection refused" Apr 23 13:53:45.549718 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:45.549694 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:53:45.711565 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:45.711519 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a857907e-cba2-4ba8-8d3b-385106b8392b-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"a857907e-cba2-4ba8-8d3b-385106b8392b\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " Apr 23 13:53:45.711781 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:45.711586 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a857907e-cba2-4ba8-8d3b-385106b8392b-proxy-tls\") pod \"a857907e-cba2-4ba8-8d3b-385106b8392b\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " Apr 23 13:53:45.711781 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:45.711607 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmc7w\" (UniqueName: \"kubernetes.io/projected/a857907e-cba2-4ba8-8d3b-385106b8392b-kube-api-access-fmc7w\") pod \"a857907e-cba2-4ba8-8d3b-385106b8392b\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " Apr 23 13:53:45.711781 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:45.711668 2565 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a857907e-cba2-4ba8-8d3b-385106b8392b-kserve-provision-location\") pod \"a857907e-cba2-4ba8-8d3b-385106b8392b\" (UID: \"a857907e-cba2-4ba8-8d3b-385106b8392b\") " Apr 23 13:53:45.712025 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:45.711923 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a857907e-cba2-4ba8-8d3b-385106b8392b-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "a857907e-cba2-4ba8-8d3b-385106b8392b" (UID: "a857907e-cba2-4ba8-8d3b-385106b8392b"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:53:45.712129 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:45.712056 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a857907e-cba2-4ba8-8d3b-385106b8392b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a857907e-cba2-4ba8-8d3b-385106b8392b" (UID: "a857907e-cba2-4ba8-8d3b-385106b8392b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:53:45.713814 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:45.713787 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a857907e-cba2-4ba8-8d3b-385106b8392b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a857907e-cba2-4ba8-8d3b-385106b8392b" (UID: "a857907e-cba2-4ba8-8d3b-385106b8392b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:53:45.713902 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:45.713819 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a857907e-cba2-4ba8-8d3b-385106b8392b-kube-api-access-fmc7w" (OuterVolumeSpecName: "kube-api-access-fmc7w") pod "a857907e-cba2-4ba8-8d3b-385106b8392b" (UID: "a857907e-cba2-4ba8-8d3b-385106b8392b"). InnerVolumeSpecName "kube-api-access-fmc7w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:53:45.812962 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:45.812929 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a857907e-cba2-4ba8-8d3b-385106b8392b-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:53:45.812962 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:45.812958 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a857907e-cba2-4ba8-8d3b-385106b8392b-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:53:45.812962 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:45.812969 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a857907e-cba2-4ba8-8d3b-385106b8392b-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:53:45.813177 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:45.812978 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmc7w\" (UniqueName: \"kubernetes.io/projected/a857907e-cba2-4ba8-8d3b-385106b8392b-kube-api-access-fmc7w\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:53:46.329646 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.329608 2565 generic.go:358] "Generic (PLEG): container 
finished" podID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerID="335dffe8df36d83c1b045f559da3b92c354346c972604fcbf919569f02e904f3" exitCode=137 Apr 23 13:53:46.329833 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.329676 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" event={"ID":"a857907e-cba2-4ba8-8d3b-385106b8392b","Type":"ContainerDied","Data":"335dffe8df36d83c1b045f559da3b92c354346c972604fcbf919569f02e904f3"} Apr 23 13:53:46.329833 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.329697 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" Apr 23 13:53:46.329833 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.329719 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds" event={"ID":"a857907e-cba2-4ba8-8d3b-385106b8392b","Type":"ContainerDied","Data":"ae510c1c59969f9683f709a6ff63fc2d9492ba9c05f09628167eaaf6444f98fa"} Apr 23 13:53:46.329833 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.329735 2565 scope.go:117] "RemoveContainer" containerID="cc0531f6340e14913f020bbdf0f92c8216c313e655d719b2458244e7cf5f1463" Apr 23 13:53:46.338373 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.338355 2565 scope.go:117] "RemoveContainer" containerID="335dffe8df36d83c1b045f559da3b92c354346c972604fcbf919569f02e904f3" Apr 23 13:53:46.345140 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.345124 2565 scope.go:117] "RemoveContainer" containerID="08abdde7e643821b3b57e11c50831e71546ef597fe60ba27134ac368b580bcf6" Apr 23 13:53:46.351695 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.351680 2565 scope.go:117] "RemoveContainer" containerID="76ed91be483a2295cabda6175f75ed6511b0a6f4b4f2a32fe90a2c623087fd01" Apr 23 13:53:46.361040 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.360754 2565 scope.go:117] "RemoveContainer" 
containerID="cc0531f6340e14913f020bbdf0f92c8216c313e655d719b2458244e7cf5f1463" Apr 23 13:53:46.361410 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:53:46.361384 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc0531f6340e14913f020bbdf0f92c8216c313e655d719b2458244e7cf5f1463\": container with ID starting with cc0531f6340e14913f020bbdf0f92c8216c313e655d719b2458244e7cf5f1463 not found: ID does not exist" containerID="cc0531f6340e14913f020bbdf0f92c8216c313e655d719b2458244e7cf5f1463" Apr 23 13:53:46.361496 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.361420 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc0531f6340e14913f020bbdf0f92c8216c313e655d719b2458244e7cf5f1463"} err="failed to get container status \"cc0531f6340e14913f020bbdf0f92c8216c313e655d719b2458244e7cf5f1463\": rpc error: code = NotFound desc = could not find container \"cc0531f6340e14913f020bbdf0f92c8216c313e655d719b2458244e7cf5f1463\": container with ID starting with cc0531f6340e14913f020bbdf0f92c8216c313e655d719b2458244e7cf5f1463 not found: ID does not exist" Apr 23 13:53:46.361496 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.361439 2565 scope.go:117] "RemoveContainer" containerID="335dffe8df36d83c1b045f559da3b92c354346c972604fcbf919569f02e904f3" Apr 23 13:53:46.361711 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:53:46.361687 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"335dffe8df36d83c1b045f559da3b92c354346c972604fcbf919569f02e904f3\": container with ID starting with 335dffe8df36d83c1b045f559da3b92c354346c972604fcbf919569f02e904f3 not found: ID does not exist" containerID="335dffe8df36d83c1b045f559da3b92c354346c972604fcbf919569f02e904f3" Apr 23 13:53:46.361785 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.361709 2565 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"335dffe8df36d83c1b045f559da3b92c354346c972604fcbf919569f02e904f3"} err="failed to get container status \"335dffe8df36d83c1b045f559da3b92c354346c972604fcbf919569f02e904f3\": rpc error: code = NotFound desc = could not find container \"335dffe8df36d83c1b045f559da3b92c354346c972604fcbf919569f02e904f3\": container with ID starting with 335dffe8df36d83c1b045f559da3b92c354346c972604fcbf919569f02e904f3 not found: ID does not exist" Apr 23 13:53:46.361785 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.361723 2565 scope.go:117] "RemoveContainer" containerID="08abdde7e643821b3b57e11c50831e71546ef597fe60ba27134ac368b580bcf6" Apr 23 13:53:46.361997 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:53:46.361979 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08abdde7e643821b3b57e11c50831e71546ef597fe60ba27134ac368b580bcf6\": container with ID starting with 08abdde7e643821b3b57e11c50831e71546ef597fe60ba27134ac368b580bcf6 not found: ID does not exist" containerID="08abdde7e643821b3b57e11c50831e71546ef597fe60ba27134ac368b580bcf6" Apr 23 13:53:46.362049 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.362002 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08abdde7e643821b3b57e11c50831e71546ef597fe60ba27134ac368b580bcf6"} err="failed to get container status \"08abdde7e643821b3b57e11c50831e71546ef597fe60ba27134ac368b580bcf6\": rpc error: code = NotFound desc = could not find container \"08abdde7e643821b3b57e11c50831e71546ef597fe60ba27134ac368b580bcf6\": container with ID starting with 08abdde7e643821b3b57e11c50831e71546ef597fe60ba27134ac368b580bcf6 not found: ID does not exist" Apr 23 13:53:46.362049 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.362016 2565 scope.go:117] "RemoveContainer" containerID="76ed91be483a2295cabda6175f75ed6511b0a6f4b4f2a32fe90a2c623087fd01" Apr 23 13:53:46.362281 
ip-10-0-137-187 kubenswrapper[2565]: E0423 13:53:46.362258 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76ed91be483a2295cabda6175f75ed6511b0a6f4b4f2a32fe90a2c623087fd01\": container with ID starting with 76ed91be483a2295cabda6175f75ed6511b0a6f4b4f2a32fe90a2c623087fd01 not found: ID does not exist" containerID="76ed91be483a2295cabda6175f75ed6511b0a6f4b4f2a32fe90a2c623087fd01" Apr 23 13:53:46.362355 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.362286 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76ed91be483a2295cabda6175f75ed6511b0a6f4b4f2a32fe90a2c623087fd01"} err="failed to get container status \"76ed91be483a2295cabda6175f75ed6511b0a6f4b4f2a32fe90a2c623087fd01\": rpc error: code = NotFound desc = could not find container \"76ed91be483a2295cabda6175f75ed6511b0a6f4b4f2a32fe90a2c623087fd01\": container with ID starting with 76ed91be483a2295cabda6175f75ed6511b0a6f4b4f2a32fe90a2c623087fd01 not found: ID does not exist" Apr 23 13:53:46.363986 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.363963 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds"] Apr 23 13:53:46.365462 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:46.365443 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b76fc9db7-66dds"] Apr 23 13:53:47.840662 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:53:47.840627 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" path="/var/lib/kubelet/pods/a857907e-cba2-4ba8-8d3b-385106b8392b/volumes" Apr 23 13:56:29.818252 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:29.818211 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 13:56:29.820447 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:29.820427 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 13:56:30.184152 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.184117 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq"] Apr 23 13:56:30.184461 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.184447 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kube-rbac-proxy" Apr 23 13:56:30.184523 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.184463 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kube-rbac-proxy" Apr 23 13:56:30.184523 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.184480 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="storage-initializer" Apr 23 13:56:30.184523 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.184486 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="storage-initializer" Apr 23 13:56:30.184523 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.184498 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kserve-container" Apr 23 13:56:30.184523 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.184504 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kserve-container" Apr 23 13:56:30.184523 ip-10-0-137-187 
kubenswrapper[2565]: I0423 13:56:30.184513 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kserve-agent" Apr 23 13:56:30.184523 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.184518 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kserve-agent" Apr 23 13:56:30.184799 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.184565 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kube-rbac-proxy" Apr 23 13:56:30.184799 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.184576 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kserve-agent" Apr 23 13:56:30.184799 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.184583 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="a857907e-cba2-4ba8-8d3b-385106b8392b" containerName="kserve-container" Apr 23 13:56:30.187630 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.187612 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:30.190532 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.190511 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\"" Apr 23 13:56:30.190642 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.190515 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:56:30.191874 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.191851 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\"" Apr 23 13:56:30.192003 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.191974 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:56:30.192003 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.191990 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 13:56:30.198053 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.198030 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq"] Apr 23 13:56:30.198439 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.198418 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdd1332b-ce38-4b58-a86a-7bff464f3d53-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-ksmbq\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:30.198528 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.198467 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bdd1332b-ce38-4b58-a86a-7bff464f3d53-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-ksmbq\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:30.198528 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.198515 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdd1332b-ce38-4b58-a86a-7bff464f3d53-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-ksmbq\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:30.198650 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.198539 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-464z5\" (UniqueName: \"kubernetes.io/projected/bdd1332b-ce38-4b58-a86a-7bff464f3d53-kube-api-access-464z5\") pod \"isvc-pmml-predictor-8bb578669-ksmbq\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:30.299214 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.299177 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bdd1332b-ce38-4b58-a86a-7bff464f3d53-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-ksmbq\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:30.299405 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.299222 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/bdd1332b-ce38-4b58-a86a-7bff464f3d53-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-ksmbq\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:30.299405 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.299250 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-464z5\" (UniqueName: \"kubernetes.io/projected/bdd1332b-ce38-4b58-a86a-7bff464f3d53-kube-api-access-464z5\") pod \"isvc-pmml-predictor-8bb578669-ksmbq\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:30.299405 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.299298 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdd1332b-ce38-4b58-a86a-7bff464f3d53-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-ksmbq\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:30.299714 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.299689 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdd1332b-ce38-4b58-a86a-7bff464f3d53-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-ksmbq\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:30.299914 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.299894 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bdd1332b-ce38-4b58-a86a-7bff464f3d53-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-ksmbq\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:30.301864 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.301844 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdd1332b-ce38-4b58-a86a-7bff464f3d53-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-ksmbq\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:30.308132 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.308108 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-464z5\" (UniqueName: \"kubernetes.io/projected/bdd1332b-ce38-4b58-a86a-7bff464f3d53-kube-api-access-464z5\") pod \"isvc-pmml-predictor-8bb578669-ksmbq\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:30.499164 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.499076 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:30.642334 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.642304 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq"] Apr 23 13:56:30.643792 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:56:30.643743 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdd1332b_ce38_4b58_a86a_7bff464f3d53.slice/crio-0e50cf44abfd0e2861aba4aa9ceb3f94a00decaaf33c150c9ea0b1507f9b5d37 WatchSource:0}: Error finding container 0e50cf44abfd0e2861aba4aa9ceb3f94a00decaaf33c150c9ea0b1507f9b5d37: Status 404 returned error can't find the container with id 0e50cf44abfd0e2861aba4aa9ceb3f94a00decaaf33c150c9ea0b1507f9b5d37 Apr 23 13:56:30.860904 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.860813 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" event={"ID":"bdd1332b-ce38-4b58-a86a-7bff464f3d53","Type":"ContainerStarted","Data":"24dbf7da106a646fde6b5fbaaa69dfff689ebdd1d650734549986aa035210a57"} Apr 23 13:56:30.860904 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:30.860857 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" event={"ID":"bdd1332b-ce38-4b58-a86a-7bff464f3d53","Type":"ContainerStarted","Data":"0e50cf44abfd0e2861aba4aa9ceb3f94a00decaaf33c150c9ea0b1507f9b5d37"} Apr 23 13:56:34.874171 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:34.874081 2565 generic.go:358] "Generic (PLEG): container finished" podID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerID="24dbf7da106a646fde6b5fbaaa69dfff689ebdd1d650734549986aa035210a57" exitCode=0 Apr 23 13:56:34.874171 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:34.874156 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" event={"ID":"bdd1332b-ce38-4b58-a86a-7bff464f3d53","Type":"ContainerDied","Data":"24dbf7da106a646fde6b5fbaaa69dfff689ebdd1d650734549986aa035210a57"} Apr 23 13:56:41.909653 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:41.909576 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" event={"ID":"bdd1332b-ce38-4b58-a86a-7bff464f3d53","Type":"ContainerStarted","Data":"66f690c5fe9766891b94443af6a6db3502b255a4866ce52acd0643afb22a9802"} Apr 23 13:56:41.909653 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:41.909614 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" event={"ID":"bdd1332b-ce38-4b58-a86a-7bff464f3d53","Type":"ContainerStarted","Data":"029aed93cd05dcc6a24bc55cc63873811f8cc73bb7b571ba13acf82d6352f9a2"} Apr 23 13:56:41.910251 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:41.909878 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:41.929717 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:41.929677 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" podStartSLOduration=5.209623989 podStartE2EDuration="11.929665591s" podCreationTimestamp="2026-04-23 13:56:30 +0000 UTC" firstStartedPulling="2026-04-23 13:56:34.875440454 +0000 UTC m=+1505.613984399" lastFinishedPulling="2026-04-23 13:56:41.595482042 +0000 UTC m=+1512.334026001" observedRunningTime="2026-04-23 13:56:41.927771851 +0000 UTC m=+1512.666315810" watchObservedRunningTime="2026-04-23 13:56:41.929665591 +0000 UTC m=+1512.668209554" Apr 23 13:56:42.913488 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:42.913457 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:42.914947 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:42.914918 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 13:56:43.916555 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:43.916517 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 13:56:48.922615 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:48.922579 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:56:48.923155 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:48.923129 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 13:56:58.923249 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:56:58.923205 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 13:57:08.923548 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:57:08.923511 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 13:57:18.923980 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:57:18.923939 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 13:57:28.923600 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:57:28.923564 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 13:57:38.923623 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:57:38.923583 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 13:57:48.924053 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:57:48.924010 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 13:57:58.923996 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:57:58.923958 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 13:58:08.923744 ip-10-0-137-187 
kubenswrapper[2565]: I0423 13:58:08.923699 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:58:11.331176 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:11.331144 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq"] Apr 23 13:58:11.331586 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:11.331447 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kserve-container" containerID="cri-o://029aed93cd05dcc6a24bc55cc63873811f8cc73bb7b571ba13acf82d6352f9a2" gracePeriod=30 Apr 23 13:58:11.331586 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:11.331512 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kube-rbac-proxy" containerID="cri-o://66f690c5fe9766891b94443af6a6db3502b255a4866ce52acd0643afb22a9802" gracePeriod=30 Apr 23 13:58:12.202489 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:12.202453 2565 generic.go:358] "Generic (PLEG): container finished" podID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerID="66f690c5fe9766891b94443af6a6db3502b255a4866ce52acd0643afb22a9802" exitCode=2 Apr 23 13:58:12.202675 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:12.202530 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" event={"ID":"bdd1332b-ce38-4b58-a86a-7bff464f3d53","Type":"ContainerDied","Data":"66f690c5fe9766891b94443af6a6db3502b255a4866ce52acd0643afb22a9802"} Apr 23 13:58:13.917099 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:13.917056 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" 
podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused" Apr 23 13:58:14.779787 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:14.779754 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:58:14.847313 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:14.847283 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdd1332b-ce38-4b58-a86a-7bff464f3d53-kserve-provision-location\") pod \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " Apr 23 13:58:14.847313 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:14.847318 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-464z5\" (UniqueName: \"kubernetes.io/projected/bdd1332b-ce38-4b58-a86a-7bff464f3d53-kube-api-access-464z5\") pod \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " Apr 23 13:58:14.847536 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:14.847344 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bdd1332b-ce38-4b58-a86a-7bff464f3d53-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " Apr 23 13:58:14.847536 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:14.847388 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdd1332b-ce38-4b58-a86a-7bff464f3d53-proxy-tls\") pod \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\" (UID: \"bdd1332b-ce38-4b58-a86a-7bff464f3d53\") " Apr 23 
13:58:14.847644 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:14.847605 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdd1332b-ce38-4b58-a86a-7bff464f3d53-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bdd1332b-ce38-4b58-a86a-7bff464f3d53" (UID: "bdd1332b-ce38-4b58-a86a-7bff464f3d53"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:58:14.847742 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:14.847718 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdd1332b-ce38-4b58-a86a-7bff464f3d53-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "bdd1332b-ce38-4b58-a86a-7bff464f3d53" (UID: "bdd1332b-ce38-4b58-a86a-7bff464f3d53"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:58:14.849450 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:14.849425 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdd1332b-ce38-4b58-a86a-7bff464f3d53-kube-api-access-464z5" (OuterVolumeSpecName: "kube-api-access-464z5") pod "bdd1332b-ce38-4b58-a86a-7bff464f3d53" (UID: "bdd1332b-ce38-4b58-a86a-7bff464f3d53"). InnerVolumeSpecName "kube-api-access-464z5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:58:14.849527 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:14.849483 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdd1332b-ce38-4b58-a86a-7bff464f3d53-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bdd1332b-ce38-4b58-a86a-7bff464f3d53" (UID: "bdd1332b-ce38-4b58-a86a-7bff464f3d53"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:58:14.948823 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:14.948750 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdd1332b-ce38-4b58-a86a-7bff464f3d53-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:58:14.949293 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:14.948881 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-464z5\" (UniqueName: \"kubernetes.io/projected/bdd1332b-ce38-4b58-a86a-7bff464f3d53-kube-api-access-464z5\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:58:14.949293 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:14.948958 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bdd1332b-ce38-4b58-a86a-7bff464f3d53-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:58:14.949293 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:14.948970 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdd1332b-ce38-4b58-a86a-7bff464f3d53-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 13:58:15.215542 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.215460 2565 generic.go:358] "Generic (PLEG): container finished" podID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerID="029aed93cd05dcc6a24bc55cc63873811f8cc73bb7b571ba13acf82d6352f9a2" exitCode=0 Apr 23 13:58:15.215542 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.215532 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" event={"ID":"bdd1332b-ce38-4b58-a86a-7bff464f3d53","Type":"ContainerDied","Data":"029aed93cd05dcc6a24bc55cc63873811f8cc73bb7b571ba13acf82d6352f9a2"} Apr 23 
13:58:15.215542 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.215538 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" Apr 23 13:58:15.215843 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.215562 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq" event={"ID":"bdd1332b-ce38-4b58-a86a-7bff464f3d53","Type":"ContainerDied","Data":"0e50cf44abfd0e2861aba4aa9ceb3f94a00decaaf33c150c9ea0b1507f9b5d37"} Apr 23 13:58:15.215843 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.215576 2565 scope.go:117] "RemoveContainer" containerID="66f690c5fe9766891b94443af6a6db3502b255a4866ce52acd0643afb22a9802" Apr 23 13:58:15.223642 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.223620 2565 scope.go:117] "RemoveContainer" containerID="029aed93cd05dcc6a24bc55cc63873811f8cc73bb7b571ba13acf82d6352f9a2" Apr 23 13:58:15.235562 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.235538 2565 scope.go:117] "RemoveContainer" containerID="24dbf7da106a646fde6b5fbaaa69dfff689ebdd1d650734549986aa035210a57" Apr 23 13:58:15.239401 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.239379 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq"] Apr 23 13:58:15.243329 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.243307 2565 scope.go:117] "RemoveContainer" containerID="66f690c5fe9766891b94443af6a6db3502b255a4866ce52acd0643afb22a9802" Apr 23 13:58:15.243582 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:58:15.243565 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66f690c5fe9766891b94443af6a6db3502b255a4866ce52acd0643afb22a9802\": container with ID starting with 66f690c5fe9766891b94443af6a6db3502b255a4866ce52acd0643afb22a9802 not found: ID does not exist" 
containerID="66f690c5fe9766891b94443af6a6db3502b255a4866ce52acd0643afb22a9802" Apr 23 13:58:15.243663 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.243594 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66f690c5fe9766891b94443af6a6db3502b255a4866ce52acd0643afb22a9802"} err="failed to get container status \"66f690c5fe9766891b94443af6a6db3502b255a4866ce52acd0643afb22a9802\": rpc error: code = NotFound desc = could not find container \"66f690c5fe9766891b94443af6a6db3502b255a4866ce52acd0643afb22a9802\": container with ID starting with 66f690c5fe9766891b94443af6a6db3502b255a4866ce52acd0643afb22a9802 not found: ID does not exist" Apr 23 13:58:15.243663 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.243618 2565 scope.go:117] "RemoveContainer" containerID="029aed93cd05dcc6a24bc55cc63873811f8cc73bb7b571ba13acf82d6352f9a2" Apr 23 13:58:15.243927 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:58:15.243905 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"029aed93cd05dcc6a24bc55cc63873811f8cc73bb7b571ba13acf82d6352f9a2\": container with ID starting with 029aed93cd05dcc6a24bc55cc63873811f8cc73bb7b571ba13acf82d6352f9a2 not found: ID does not exist" containerID="029aed93cd05dcc6a24bc55cc63873811f8cc73bb7b571ba13acf82d6352f9a2" Apr 23 13:58:15.243993 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.243932 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"029aed93cd05dcc6a24bc55cc63873811f8cc73bb7b571ba13acf82d6352f9a2"} err="failed to get container status \"029aed93cd05dcc6a24bc55cc63873811f8cc73bb7b571ba13acf82d6352f9a2\": rpc error: code = NotFound desc = could not find container \"029aed93cd05dcc6a24bc55cc63873811f8cc73bb7b571ba13acf82d6352f9a2\": container with ID starting with 029aed93cd05dcc6a24bc55cc63873811f8cc73bb7b571ba13acf82d6352f9a2 not found: ID does not exist" Apr 23 
13:58:15.243993 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.243947 2565 scope.go:117] "RemoveContainer" containerID="24dbf7da106a646fde6b5fbaaa69dfff689ebdd1d650734549986aa035210a57" Apr 23 13:58:15.244208 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:58:15.244191 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24dbf7da106a646fde6b5fbaaa69dfff689ebdd1d650734549986aa035210a57\": container with ID starting with 24dbf7da106a646fde6b5fbaaa69dfff689ebdd1d650734549986aa035210a57 not found: ID does not exist" containerID="24dbf7da106a646fde6b5fbaaa69dfff689ebdd1d650734549986aa035210a57" Apr 23 13:58:15.244256 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.244212 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24dbf7da106a646fde6b5fbaaa69dfff689ebdd1d650734549986aa035210a57"} err="failed to get container status \"24dbf7da106a646fde6b5fbaaa69dfff689ebdd1d650734549986aa035210a57\": rpc error: code = NotFound desc = could not find container \"24dbf7da106a646fde6b5fbaaa69dfff689ebdd1d650734549986aa035210a57\": container with ID starting with 24dbf7da106a646fde6b5fbaaa69dfff689ebdd1d650734549986aa035210a57 not found: ID does not exist" Apr 23 13:58:15.244425 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.244407 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-ksmbq"] Apr 23 13:58:15.840475 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:58:15.840440 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" path="/var/lib/kubelet/pods/bdd1332b-ce38-4b58-a86a-7bff464f3d53/volumes" Apr 23 13:59:52.820267 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.820234 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72"] Apr 23 13:59:52.820670 
ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.820592 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kserve-container" Apr 23 13:59:52.820670 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.820605 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kserve-container" Apr 23 13:59:52.820670 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.820618 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kube-rbac-proxy" Apr 23 13:59:52.820670 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.820624 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kube-rbac-proxy" Apr 23 13:59:52.820670 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.820635 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="storage-initializer" Apr 23 13:59:52.820670 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.820642 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="storage-initializer" Apr 23 13:59:52.820874 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.820694 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kserve-container" Apr 23 13:59:52.820874 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.820707 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdd1332b-ce38-4b58-a86a-7bff464f3d53" containerName="kube-rbac-proxy" Apr 23 13:59:52.823855 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.823838 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:52.826578 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.826553 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\"" Apr 23 13:59:52.826917 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.826898 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:59:52.827965 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.827945 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:59:52.827965 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.827959 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 13:59:52.828131 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.827952 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 23 13:59:52.831424 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.831404 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72"] Apr 23 13:59:52.938403 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.938370 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ecce91f-1b1d-4017-b4e9-b60876459339-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:52.938572 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.938409 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ecce91f-1b1d-4017-b4e9-b60876459339-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:52.938572 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.938452 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9q2m\" (UniqueName: \"kubernetes.io/projected/9ecce91f-1b1d-4017-b4e9-b60876459339-kube-api-access-g9q2m\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:52.938572 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:52.938523 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ecce91f-1b1d-4017-b4e9-b60876459339-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:53.039673 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:53.039639 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ecce91f-1b1d-4017-b4e9-b60876459339-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:53.039875 ip-10-0-137-187 kubenswrapper[2565]: I0423 
13:59:53.039711 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ecce91f-1b1d-4017-b4e9-b60876459339-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:53.039875 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:53.039745 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ecce91f-1b1d-4017-b4e9-b60876459339-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:53.039875 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:53.039803 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9q2m\" (UniqueName: \"kubernetes.io/projected/9ecce91f-1b1d-4017-b4e9-b60876459339-kube-api-access-g9q2m\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:53.040006 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:59:53.039877 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-serving-cert: secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found Apr 23 13:59:53.040006 ip-10-0-137-187 kubenswrapper[2565]: E0423 13:59:53.039967 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ecce91f-1b1d-4017-b4e9-b60876459339-proxy-tls podName:9ecce91f-1b1d-4017-b4e9-b60876459339 nodeName:}" failed. No retries permitted until 2026-04-23 13:59:53.53995162 +0000 UTC m=+1704.278495568 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9ecce91f-1b1d-4017-b4e9-b60876459339-proxy-tls") pod "isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" (UID: "9ecce91f-1b1d-4017-b4e9-b60876459339") : secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found Apr 23 13:59:53.040158 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:53.040138 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ecce91f-1b1d-4017-b4e9-b60876459339-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:53.040375 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:53.040356 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ecce91f-1b1d-4017-b4e9-b60876459339-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:53.049196 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:53.049178 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9q2m\" (UniqueName: \"kubernetes.io/projected/9ecce91f-1b1d-4017-b4e9-b60876459339-kube-api-access-g9q2m\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:53.543858 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:53.543817 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ecce91f-1b1d-4017-b4e9-b60876459339-proxy-tls\") 
pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:53.546243 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:53.546214 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ecce91f-1b1d-4017-b4e9-b60876459339-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:53.735209 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:53.735179 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:53.854102 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:53.854058 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72"] Apr 23 13:59:53.855265 ip-10-0-137-187 kubenswrapper[2565]: W0423 13:59:53.855236 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ecce91f_1b1d_4017_b4e9_b60876459339.slice/crio-a58c91e2e721b8a3b5f72ab7ecf22fdac84c86baa0804d37eb6920eae96f36cd WatchSource:0}: Error finding container a58c91e2e721b8a3b5f72ab7ecf22fdac84c86baa0804d37eb6920eae96f36cd: Status 404 returned error can't find the container with id a58c91e2e721b8a3b5f72ab7ecf22fdac84c86baa0804d37eb6920eae96f36cd Apr 23 13:59:53.857092 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:53.857076 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:59:54.536792 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:54.536743 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" 
event={"ID":"9ecce91f-1b1d-4017-b4e9-b60876459339","Type":"ContainerStarted","Data":"79f62f497e7c71b820c3a56a163495958132e25499f9964ce6a26b45ab3c0ba8"} Apr 23 13:59:54.536792 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:54.536796 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" event={"ID":"9ecce91f-1b1d-4017-b4e9-b60876459339","Type":"ContainerStarted","Data":"a58c91e2e721b8a3b5f72ab7ecf22fdac84c86baa0804d37eb6920eae96f36cd"} Apr 23 13:59:57.548901 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:57.548864 2565 generic.go:358] "Generic (PLEG): container finished" podID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerID="79f62f497e7c71b820c3a56a163495958132e25499f9964ce6a26b45ab3c0ba8" exitCode=0 Apr 23 13:59:57.549277 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:57.548941 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" event={"ID":"9ecce91f-1b1d-4017-b4e9-b60876459339","Type":"ContainerDied","Data":"79f62f497e7c71b820c3a56a163495958132e25499f9964ce6a26b45ab3c0ba8"} Apr 23 13:59:58.553770 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:58.553735 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" event={"ID":"9ecce91f-1b1d-4017-b4e9-b60876459339","Type":"ContainerStarted","Data":"c6e99f4c748f496764c7f756404b82ab69b6ea56d86d3c4c1f5cb18286364f3c"} Apr 23 13:59:58.554246 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:58.553788 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" event={"ID":"9ecce91f-1b1d-4017-b4e9-b60876459339","Type":"ContainerStarted","Data":"546bdb4e5dfa789d846e5a14032ed3710f5517adba65eeb1ca8206c80a292204"} Apr 23 13:59:58.554246 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:58.554139 2565 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:58.554351 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:58.554253 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 13:59:58.555589 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:58.555564 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 13:59:58.572807 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:58.572752 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podStartSLOduration=6.5727414060000005 podStartE2EDuration="6.572741406s" podCreationTimestamp="2026-04-23 13:59:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:59:58.571253423 +0000 UTC m=+1709.309797412" watchObservedRunningTime="2026-04-23 13:59:58.572741406 +0000 UTC m=+1709.311285371" Apr 23 13:59:59.556845 ip-10-0-137-187 kubenswrapper[2565]: I0423 13:59:59.556807 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:00:00.559734 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:00:00.559694 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:00:05.564682 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:00:05.564652 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 14:00:05.565174 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:00:05.565149 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:00:15.565887 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:00:15.565850 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:00:25.565512 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:00:25.565474 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:00:35.566025 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:00:35.565987 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:00:45.565546 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:00:45.565504 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:00:55.566194 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:00:55.566158 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:01:05.566126 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:05.566083 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:01:15.566255 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:15.566222 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 14:01:24.069201 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.069121 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72"] Apr 23 14:01:24.069682 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.069454 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kserve-container" containerID="cri-o://546bdb4e5dfa789d846e5a14032ed3710f5517adba65eeb1ca8206c80a292204" gracePeriod=30 Apr 23 14:01:24.069682 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.069491 2565 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kube-rbac-proxy" containerID="cri-o://c6e99f4c748f496764c7f756404b82ab69b6ea56d86d3c4c1f5cb18286364f3c" gracePeriod=30 Apr 23 14:01:24.186122 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.186093 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp"] Apr 23 14:01:24.189461 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.189442 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:24.192700 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.192678 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-60b405-predictor-serving-cert\"" Apr 23 14:01:24.193359 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.193340 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-60b405-kube-rbac-proxy-sar-config\"" Apr 23 14:01:24.204359 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.204340 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp"] Apr 23 14:01:24.317119 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.317084 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1d8cf45-3a36-432f-a475-ea780ee1551b-kserve-provision-location\") pod \"isvc-primary-60b405-predictor-b697f8fbf-pgxzp\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:24.317299 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.317133 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-60b405-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a1d8cf45-3a36-432f-a475-ea780ee1551b-isvc-primary-60b405-kube-rbac-proxy-sar-config\") pod \"isvc-primary-60b405-predictor-b697f8fbf-pgxzp\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:24.317299 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.317210 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1d8cf45-3a36-432f-a475-ea780ee1551b-proxy-tls\") pod \"isvc-primary-60b405-predictor-b697f8fbf-pgxzp\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:24.317299 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.317227 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbkzx\" (UniqueName: \"kubernetes.io/projected/a1d8cf45-3a36-432f-a475-ea780ee1551b-kube-api-access-wbkzx\") pod \"isvc-primary-60b405-predictor-b697f8fbf-pgxzp\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:24.418556 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.418468 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1d8cf45-3a36-432f-a475-ea780ee1551b-proxy-tls\") pod \"isvc-primary-60b405-predictor-b697f8fbf-pgxzp\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:24.418556 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.418509 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbkzx\" 
(UniqueName: \"kubernetes.io/projected/a1d8cf45-3a36-432f-a475-ea780ee1551b-kube-api-access-wbkzx\") pod \"isvc-primary-60b405-predictor-b697f8fbf-pgxzp\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:24.418847 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.418559 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1d8cf45-3a36-432f-a475-ea780ee1551b-kserve-provision-location\") pod \"isvc-primary-60b405-predictor-b697f8fbf-pgxzp\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:24.418847 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.418609 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-60b405-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a1d8cf45-3a36-432f-a475-ea780ee1551b-isvc-primary-60b405-kube-rbac-proxy-sar-config\") pod \"isvc-primary-60b405-predictor-b697f8fbf-pgxzp\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:24.418847 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:01:24.418638 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-primary-60b405-predictor-serving-cert: secret "isvc-primary-60b405-predictor-serving-cert" not found Apr 23 14:01:24.418847 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:01:24.418716 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1d8cf45-3a36-432f-a475-ea780ee1551b-proxy-tls podName:a1d8cf45-3a36-432f-a475-ea780ee1551b nodeName:}" failed. No retries permitted until 2026-04-23 14:01:24.918693792 +0000 UTC m=+1795.657237741 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a1d8cf45-3a36-432f-a475-ea780ee1551b-proxy-tls") pod "isvc-primary-60b405-predictor-b697f8fbf-pgxzp" (UID: "a1d8cf45-3a36-432f-a475-ea780ee1551b") : secret "isvc-primary-60b405-predictor-serving-cert" not found Apr 23 14:01:24.419068 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.419047 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1d8cf45-3a36-432f-a475-ea780ee1551b-kserve-provision-location\") pod \"isvc-primary-60b405-predictor-b697f8fbf-pgxzp\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:24.419298 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.419279 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-60b405-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a1d8cf45-3a36-432f-a475-ea780ee1551b-isvc-primary-60b405-kube-rbac-proxy-sar-config\") pod \"isvc-primary-60b405-predictor-b697f8fbf-pgxzp\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:24.428049 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.428025 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbkzx\" (UniqueName: \"kubernetes.io/projected/a1d8cf45-3a36-432f-a475-ea780ee1551b-kube-api-access-wbkzx\") pod \"isvc-primary-60b405-predictor-b697f8fbf-pgxzp\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:24.840383 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.840347 2565 generic.go:358] "Generic (PLEG): container finished" podID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerID="c6e99f4c748f496764c7f756404b82ab69b6ea56d86d3c4c1f5cb18286364f3c" 
exitCode=2 Apr 23 14:01:24.840597 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.840422 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" event={"ID":"9ecce91f-1b1d-4017-b4e9-b60876459339","Type":"ContainerDied","Data":"c6e99f4c748f496764c7f756404b82ab69b6ea56d86d3c4c1f5cb18286364f3c"} Apr 23 14:01:24.923140 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.923105 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1d8cf45-3a36-432f-a475-ea780ee1551b-proxy-tls\") pod \"isvc-primary-60b405-predictor-b697f8fbf-pgxzp\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:24.925433 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:24.925403 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1d8cf45-3a36-432f-a475-ea780ee1551b-proxy-tls\") pod \"isvc-primary-60b405-predictor-b697f8fbf-pgxzp\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:25.099534 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:25.099447 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:25.222313 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:25.222288 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp"] Apr 23 14:01:25.224898 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:01:25.224868 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1d8cf45_3a36_432f_a475_ea780ee1551b.slice/crio-d7e38ffd7a65c6abdf82ab09c4b9deaef1654e6f6d2cea24e9c3ff5bfe4d7f0d WatchSource:0}: Error finding container d7e38ffd7a65c6abdf82ab09c4b9deaef1654e6f6d2cea24e9c3ff5bfe4d7f0d: Status 404 returned error can't find the container with id d7e38ffd7a65c6abdf82ab09c4b9deaef1654e6f6d2cea24e9c3ff5bfe4d7f0d Apr 23 14:01:25.559992 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:25.559953 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused" Apr 23 14:01:25.565707 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:25.565673 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:01:25.844784 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:25.844692 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" 
event={"ID":"a1d8cf45-3a36-432f-a475-ea780ee1551b","Type":"ContainerStarted","Data":"ca0e61bf176148a41198ac079a787c1bcb952e22a6e6e35b7965c661ebce26ac"} Apr 23 14:01:25.844784 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:25.844730 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" event={"ID":"a1d8cf45-3a36-432f-a475-ea780ee1551b","Type":"ContainerStarted","Data":"d7e38ffd7a65c6abdf82ab09c4b9deaef1654e6f6d2cea24e9c3ff5bfe4d7f0d"} Apr 23 14:01:27.619144 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.619123 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 14:01:27.747029 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.746929 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ecce91f-1b1d-4017-b4e9-b60876459339-proxy-tls\") pod \"9ecce91f-1b1d-4017-b4e9-b60876459339\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " Apr 23 14:01:27.747029 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.746981 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9q2m\" (UniqueName: \"kubernetes.io/projected/9ecce91f-1b1d-4017-b4e9-b60876459339-kube-api-access-g9q2m\") pod \"9ecce91f-1b1d-4017-b4e9-b60876459339\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " Apr 23 14:01:27.747314 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.747050 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ecce91f-1b1d-4017-b4e9-b60876459339-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"9ecce91f-1b1d-4017-b4e9-b60876459339\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " Apr 23 14:01:27.747314 ip-10-0-137-187 kubenswrapper[2565]: I0423 
14:01:27.747080 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ecce91f-1b1d-4017-b4e9-b60876459339-kserve-provision-location\") pod \"9ecce91f-1b1d-4017-b4e9-b60876459339\" (UID: \"9ecce91f-1b1d-4017-b4e9-b60876459339\") " Apr 23 14:01:27.747455 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.747425 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ecce91f-1b1d-4017-b4e9-b60876459339-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9ecce91f-1b1d-4017-b4e9-b60876459339" (UID: "9ecce91f-1b1d-4017-b4e9-b60876459339"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:01:27.747506 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.747436 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ecce91f-1b1d-4017-b4e9-b60876459339-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "9ecce91f-1b1d-4017-b4e9-b60876459339" (UID: "9ecce91f-1b1d-4017-b4e9-b60876459339"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:01:27.749150 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.749121 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ecce91f-1b1d-4017-b4e9-b60876459339-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9ecce91f-1b1d-4017-b4e9-b60876459339" (UID: "9ecce91f-1b1d-4017-b4e9-b60876459339"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:01:27.749255 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.749128 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ecce91f-1b1d-4017-b4e9-b60876459339-kube-api-access-g9q2m" (OuterVolumeSpecName: "kube-api-access-g9q2m") pod "9ecce91f-1b1d-4017-b4e9-b60876459339" (UID: "9ecce91f-1b1d-4017-b4e9-b60876459339"). InnerVolumeSpecName "kube-api-access-g9q2m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:01:27.848276 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.848245 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ecce91f-1b1d-4017-b4e9-b60876459339-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:01:27.848276 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.848273 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ecce91f-1b1d-4017-b4e9-b60876459339-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:01:27.848458 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.848288 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ecce91f-1b1d-4017-b4e9-b60876459339-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:01:27.848458 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.848302 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g9q2m\" (UniqueName: \"kubernetes.io/projected/9ecce91f-1b1d-4017-b4e9-b60876459339-kube-api-access-g9q2m\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:01:27.852662 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.852636 2565 generic.go:358] "Generic (PLEG): container 
finished" podID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerID="546bdb4e5dfa789d846e5a14032ed3710f5517adba65eeb1ca8206c80a292204" exitCode=0 Apr 23 14:01:27.852777 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.852718 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" Apr 23 14:01:27.852777 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.852716 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" event={"ID":"9ecce91f-1b1d-4017-b4e9-b60876459339","Type":"ContainerDied","Data":"546bdb4e5dfa789d846e5a14032ed3710f5517adba65eeb1ca8206c80a292204"} Apr 23 14:01:27.852777 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.852772 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72" event={"ID":"9ecce91f-1b1d-4017-b4e9-b60876459339","Type":"ContainerDied","Data":"a58c91e2e721b8a3b5f72ab7ecf22fdac84c86baa0804d37eb6920eae96f36cd"} Apr 23 14:01:27.852892 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.852792 2565 scope.go:117] "RemoveContainer" containerID="c6e99f4c748f496764c7f756404b82ab69b6ea56d86d3c4c1f5cb18286364f3c" Apr 23 14:01:27.861494 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.861476 2565 scope.go:117] "RemoveContainer" containerID="546bdb4e5dfa789d846e5a14032ed3710f5517adba65eeb1ca8206c80a292204" Apr 23 14:01:27.868284 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.868270 2565 scope.go:117] "RemoveContainer" containerID="79f62f497e7c71b820c3a56a163495958132e25499f9964ce6a26b45ab3c0ba8" Apr 23 14:01:27.870922 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.870900 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72"] Apr 23 14:01:27.875153 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.875129 2565 
scope.go:117] "RemoveContainer" containerID="c6e99f4c748f496764c7f756404b82ab69b6ea56d86d3c4c1f5cb18286364f3c" Apr 23 14:01:27.875429 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:01:27.875401 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6e99f4c748f496764c7f756404b82ab69b6ea56d86d3c4c1f5cb18286364f3c\": container with ID starting with c6e99f4c748f496764c7f756404b82ab69b6ea56d86d3c4c1f5cb18286364f3c not found: ID does not exist" containerID="c6e99f4c748f496764c7f756404b82ab69b6ea56d86d3c4c1f5cb18286364f3c" Apr 23 14:01:27.875501 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.875430 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e99f4c748f496764c7f756404b82ab69b6ea56d86d3c4c1f5cb18286364f3c"} err="failed to get container status \"c6e99f4c748f496764c7f756404b82ab69b6ea56d86d3c4c1f5cb18286364f3c\": rpc error: code = NotFound desc = could not find container \"c6e99f4c748f496764c7f756404b82ab69b6ea56d86d3c4c1f5cb18286364f3c\": container with ID starting with c6e99f4c748f496764c7f756404b82ab69b6ea56d86d3c4c1f5cb18286364f3c not found: ID does not exist" Apr 23 14:01:27.875501 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.875448 2565 scope.go:117] "RemoveContainer" containerID="546bdb4e5dfa789d846e5a14032ed3710f5517adba65eeb1ca8206c80a292204" Apr 23 14:01:27.876059 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:01:27.875883 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"546bdb4e5dfa789d846e5a14032ed3710f5517adba65eeb1ca8206c80a292204\": container with ID starting with 546bdb4e5dfa789d846e5a14032ed3710f5517adba65eeb1ca8206c80a292204 not found: ID does not exist" containerID="546bdb4e5dfa789d846e5a14032ed3710f5517adba65eeb1ca8206c80a292204" Apr 23 14:01:27.876059 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.875914 2565 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"546bdb4e5dfa789d846e5a14032ed3710f5517adba65eeb1ca8206c80a292204"} err="failed to get container status \"546bdb4e5dfa789d846e5a14032ed3710f5517adba65eeb1ca8206c80a292204\": rpc error: code = NotFound desc = could not find container \"546bdb4e5dfa789d846e5a14032ed3710f5517adba65eeb1ca8206c80a292204\": container with ID starting with 546bdb4e5dfa789d846e5a14032ed3710f5517adba65eeb1ca8206c80a292204 not found: ID does not exist" Apr 23 14:01:27.876059 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.875935 2565 scope.go:117] "RemoveContainer" containerID="79f62f497e7c71b820c3a56a163495958132e25499f9964ce6a26b45ab3c0ba8" Apr 23 14:01:27.876280 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:01:27.876260 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79f62f497e7c71b820c3a56a163495958132e25499f9964ce6a26b45ab3c0ba8\": container with ID starting with 79f62f497e7c71b820c3a56a163495958132e25499f9964ce6a26b45ab3c0ba8 not found: ID does not exist" containerID="79f62f497e7c71b820c3a56a163495958132e25499f9964ce6a26b45ab3c0ba8" Apr 23 14:01:27.876331 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.876288 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79f62f497e7c71b820c3a56a163495958132e25499f9964ce6a26b45ab3c0ba8"} err="failed to get container status \"79f62f497e7c71b820c3a56a163495958132e25499f9964ce6a26b45ab3c0ba8\": rpc error: code = NotFound desc = could not find container \"79f62f497e7c71b820c3a56a163495958132e25499f9964ce6a26b45ab3c0ba8\": container with ID starting with 79f62f497e7c71b820c3a56a163495958132e25499f9964ce6a26b45ab3c0ba8 not found: ID does not exist" Apr 23 14:01:27.877524 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:27.877505 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-8px72"] Apr 23 14:01:28.857549 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:28.857517 2565 generic.go:358] "Generic (PLEG): container finished" podID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerID="ca0e61bf176148a41198ac079a787c1bcb952e22a6e6e35b7965c661ebce26ac" exitCode=0 Apr 23 14:01:28.857887 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:28.857592 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" event={"ID":"a1d8cf45-3a36-432f-a475-ea780ee1551b","Type":"ContainerDied","Data":"ca0e61bf176148a41198ac079a787c1bcb952e22a6e6e35b7965c661ebce26ac"} Apr 23 14:01:29.841829 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:29.841798 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" path="/var/lib/kubelet/pods/9ecce91f-1b1d-4017-b4e9-b60876459339/volumes" Apr 23 14:01:29.848873 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:29.848847 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:01:29.851985 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:29.851958 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:01:29.862217 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:29.862190 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" event={"ID":"a1d8cf45-3a36-432f-a475-ea780ee1551b","Type":"ContainerStarted","Data":"a90c4068c64ff0650b00241d1a856f5414f2e48156cb1eb152bc70fe1b6081d0"} Apr 23 14:01:29.862559 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:29.862222 2565 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" event={"ID":"a1d8cf45-3a36-432f-a475-ea780ee1551b","Type":"ContainerStarted","Data":"3b5b7da47715ab263114fb6662b23dc2e2695a699fede92b7ca00d685ae95ecf"} Apr 23 14:01:29.862559 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:29.862434 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:29.881725 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:29.881599 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" podStartSLOduration=5.881584308 podStartE2EDuration="5.881584308s" podCreationTimestamp="2026-04-23 14:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:01:29.881425817 +0000 UTC m=+1800.619969794" watchObservedRunningTime="2026-04-23 14:01:29.881584308 +0000 UTC m=+1800.620128272" Apr 23 14:01:30.865600 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:30.865571 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:30.866630 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:30.866597 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 23 14:01:31.869050 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:31.869005 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 23 14:01:36.873633 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:36.873603 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:01:36.874210 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:36.874185 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 23 14:01:46.874567 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:46.874522 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 23 14:01:56.874958 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:01:56.874917 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 23 14:02:06.874391 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:06.874348 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 23 14:02:16.874587 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:16.874551 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 23 14:02:26.875015 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:26.874974 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 23 14:02:36.875528 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:36.875500 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:02:44.285294 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.285256 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq"] Apr 23 14:02:44.287723 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.285738 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kube-rbac-proxy" Apr 23 14:02:44.287723 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.285754 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kube-rbac-proxy" Apr 23 14:02:44.287723 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.285790 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kserve-container" Apr 23 14:02:44.287723 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.285797 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kserve-container" Apr 23 14:02:44.287723 ip-10-0-137-187 
kubenswrapper[2565]: I0423 14:02:44.285806 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="storage-initializer" Apr 23 14:02:44.287723 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.285812 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="storage-initializer" Apr 23 14:02:44.287723 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.285876 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kube-rbac-proxy" Apr 23 14:02:44.287723 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.285886 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ecce91f-1b1d-4017-b4e9-b60876459339" containerName="kserve-container" Apr 23 14:02:44.288884 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.288864 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.291644 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.291613 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-60b405-predictor-serving-cert\"" Apr 23 14:02:44.291793 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.291647 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-60b405-dockercfg-nfv9z\"" Apr 23 14:02:44.291793 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.291676 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-60b405-kube-rbac-proxy-sar-config\"" Apr 23 14:02:44.291793 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.291743 2565 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-60b405\"" Apr 23 14:02:44.292086 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.292068 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 14:02:44.300963 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.300945 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq"] Apr 23 14:02:44.389353 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.389312 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95xzz\" (UniqueName: \"kubernetes.io/projected/b8a2fddf-2b57-4512-bf66-0e002570b224-kube-api-access-95xzz\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.389353 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.389360 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b8a2fddf-2b57-4512-bf66-0e002570b224-cabundle-cert\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.389600 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.389435 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8a2fddf-2b57-4512-bf66-0e002570b224-proxy-tls\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.389600 ip-10-0-137-187 kubenswrapper[2565]: I0423 
14:02:44.389522 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-60b405-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8a2fddf-2b57-4512-bf66-0e002570b224-isvc-secondary-60b405-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.389600 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.389564 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8a2fddf-2b57-4512-bf66-0e002570b224-kserve-provision-location\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.490084 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.490043 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8a2fddf-2b57-4512-bf66-0e002570b224-proxy-tls\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.490084 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.490084 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-60b405-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8a2fddf-2b57-4512-bf66-0e002570b224-isvc-secondary-60b405-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.490307 ip-10-0-137-187 
kubenswrapper[2565]: I0423 14:02:44.490112 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8a2fddf-2b57-4512-bf66-0e002570b224-kserve-provision-location\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.490307 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.490184 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95xzz\" (UniqueName: \"kubernetes.io/projected/b8a2fddf-2b57-4512-bf66-0e002570b224-kube-api-access-95xzz\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.490307 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:02:44.490199 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-secondary-60b405-predictor-serving-cert: secret "isvc-secondary-60b405-predictor-serving-cert" not found Apr 23 14:02:44.490307 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:02:44.490283 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8a2fddf-2b57-4512-bf66-0e002570b224-proxy-tls podName:b8a2fddf-2b57-4512-bf66-0e002570b224 nodeName:}" failed. No retries permitted until 2026-04-23 14:02:44.990261904 +0000 UTC m=+1875.728805863 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b8a2fddf-2b57-4512-bf66-0e002570b224-proxy-tls") pod "isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" (UID: "b8a2fddf-2b57-4512-bf66-0e002570b224") : secret "isvc-secondary-60b405-predictor-serving-cert" not found Apr 23 14:02:44.490307 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.490279 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b8a2fddf-2b57-4512-bf66-0e002570b224-cabundle-cert\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.490554 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.490532 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8a2fddf-2b57-4512-bf66-0e002570b224-kserve-provision-location\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.490852 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.490833 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-60b405-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8a2fddf-2b57-4512-bf66-0e002570b224-isvc-secondary-60b405-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.490898 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.490850 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/b8a2fddf-2b57-4512-bf66-0e002570b224-cabundle-cert\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.501890 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.501863 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95xzz\" (UniqueName: \"kubernetes.io/projected/b8a2fddf-2b57-4512-bf66-0e002570b224-kube-api-access-95xzz\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.995514 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.995474 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8a2fddf-2b57-4512-bf66-0e002570b224-proxy-tls\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:44.997679 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:44.997652 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8a2fddf-2b57-4512-bf66-0e002570b224-proxy-tls\") pod \"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:45.198859 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:45.198825 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:02:45.324459 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:45.324434 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq"] Apr 23 14:02:45.326592 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:02:45.326567 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8a2fddf_2b57_4512_bf66_0e002570b224.slice/crio-878e3de60c848fcf175609660ee740ea68b7e8e10c3eec8536e99a51be4b97e1 WatchSource:0}: Error finding container 878e3de60c848fcf175609660ee740ea68b7e8e10c3eec8536e99a51be4b97e1: Status 404 returned error can't find the container with id 878e3de60c848fcf175609660ee740ea68b7e8e10c3eec8536e99a51be4b97e1 Apr 23 14:02:46.111394 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:46.111357 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" event={"ID":"b8a2fddf-2b57-4512-bf66-0e002570b224","Type":"ContainerStarted","Data":"6bfe1c8e86e1a59b80c3a604f48c69ce780c117dd2693574521a90735baf318b"} Apr 23 14:02:46.111394 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:46.111396 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" event={"ID":"b8a2fddf-2b57-4512-bf66-0e002570b224","Type":"ContainerStarted","Data":"878e3de60c848fcf175609660ee740ea68b7e8e10c3eec8536e99a51be4b97e1"} Apr 23 14:02:49.121715 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:49.121635 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq_b8a2fddf-2b57-4512-bf66-0e002570b224/storage-initializer/0.log" Apr 23 14:02:49.121715 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:49.121679 2565 generic.go:358] "Generic (PLEG): 
container finished" podID="b8a2fddf-2b57-4512-bf66-0e002570b224" containerID="6bfe1c8e86e1a59b80c3a604f48c69ce780c117dd2693574521a90735baf318b" exitCode=1 Apr 23 14:02:49.122144 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:49.121752 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" event={"ID":"b8a2fddf-2b57-4512-bf66-0e002570b224","Type":"ContainerDied","Data":"6bfe1c8e86e1a59b80c3a604f48c69ce780c117dd2693574521a90735baf318b"} Apr 23 14:02:50.125914 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:50.125879 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq_b8a2fddf-2b57-4512-bf66-0e002570b224/storage-initializer/0.log" Apr 23 14:02:50.126382 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:50.125982 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" event={"ID":"b8a2fddf-2b57-4512-bf66-0e002570b224","Type":"ContainerStarted","Data":"14d7badc940cd798619ae68531df02aedae20f203504c78465677cb58bc185b8"} Apr 23 14:02:54.140103 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:54.140054 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq_b8a2fddf-2b57-4512-bf66-0e002570b224/storage-initializer/1.log" Apr 23 14:02:54.140486 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:54.140399 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq_b8a2fddf-2b57-4512-bf66-0e002570b224/storage-initializer/0.log" Apr 23 14:02:54.140486 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:54.140429 2565 generic.go:358] "Generic (PLEG): container finished" podID="b8a2fddf-2b57-4512-bf66-0e002570b224" containerID="14d7badc940cd798619ae68531df02aedae20f203504c78465677cb58bc185b8" exitCode=1 Apr 23 
14:02:54.140567 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:54.140517 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" event={"ID":"b8a2fddf-2b57-4512-bf66-0e002570b224","Type":"ContainerDied","Data":"14d7badc940cd798619ae68531df02aedae20f203504c78465677cb58bc185b8"} Apr 23 14:02:54.140567 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:54.140563 2565 scope.go:117] "RemoveContainer" containerID="6bfe1c8e86e1a59b80c3a604f48c69ce780c117dd2693574521a90735baf318b" Apr 23 14:02:54.141041 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:54.141018 2565 scope.go:117] "RemoveContainer" containerID="6bfe1c8e86e1a59b80c3a604f48c69ce780c117dd2693574521a90735baf318b" Apr 23 14:02:54.151044 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:02:54.151017 2565 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq_kserve-ci-e2e-test_b8a2fddf-2b57-4512-bf66-0e002570b224_0 in pod sandbox 878e3de60c848fcf175609660ee740ea68b7e8e10c3eec8536e99a51be4b97e1 from index: no such id: '6bfe1c8e86e1a59b80c3a604f48c69ce780c117dd2693574521a90735baf318b'" containerID="6bfe1c8e86e1a59b80c3a604f48c69ce780c117dd2693574521a90735baf318b" Apr 23 14:02:54.151106 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:02:54.151066 2565 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq_kserve-ci-e2e-test_b8a2fddf-2b57-4512-bf66-0e002570b224_0 in pod sandbox 878e3de60c848fcf175609660ee740ea68b7e8e10c3eec8536e99a51be4b97e1 from index: no such id: '6bfe1c8e86e1a59b80c3a604f48c69ce780c117dd2693574521a90735baf318b'; Skipping pod 
\"isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq_kserve-ci-e2e-test(b8a2fddf-2b57-4512-bf66-0e002570b224)\"" logger="UnhandledError" Apr 23 14:02:54.152392 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:02:54.152370 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq_kserve-ci-e2e-test(b8a2fddf-2b57-4512-bf66-0e002570b224)\"" pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" podUID="b8a2fddf-2b57-4512-bf66-0e002570b224" Apr 23 14:02:55.145038 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:02:55.145007 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq_b8a2fddf-2b57-4512-bf66-0e002570b224/storage-initializer/1.log" Apr 23 14:03:00.320809 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.320747 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp"] Apr 23 14:03:00.321245 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.321109 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kserve-container" containerID="cri-o://3b5b7da47715ab263114fb6662b23dc2e2695a699fede92b7ca00d685ae95ecf" gracePeriod=30 Apr 23 14:03:00.321245 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.321138 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kube-rbac-proxy" containerID="cri-o://a90c4068c64ff0650b00241d1a856f5414f2e48156cb1eb152bc70fe1b6081d0" gracePeriod=30 Apr 23 14:03:00.392989 ip-10-0-137-187 
kubenswrapper[2565]: I0423 14:03:00.392955 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq"] Apr 23 14:03:00.512143 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.512113 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx"] Apr 23 14:03:00.521539 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.521502 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.523837 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.523817 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq_b8a2fddf-2b57-4512-bf66-0e002570b224/storage-initializer/1.log" Apr 23 14:03:00.523974 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.523870 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:03:00.525059 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.525037 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-a0e2f1-kube-rbac-proxy-sar-config\"" Apr 23 14:03:00.525446 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.525410 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-a0e2f1\"" Apr 23 14:03:00.525959 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.525907 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-a0e2f1-predictor-serving-cert\"" Apr 23 14:03:00.526195 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.526171 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-a0e2f1-dockercfg-rn2kp\"" Apr 23 14:03:00.529088 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.529067 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx"] Apr 23 14:03:00.633975 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.633877 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b8a2fddf-2b57-4512-bf66-0e002570b224-cabundle-cert\") pod \"b8a2fddf-2b57-4512-bf66-0e002570b224\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " Apr 23 14:03:00.633975 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.633932 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-60b405-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8a2fddf-2b57-4512-bf66-0e002570b224-isvc-secondary-60b405-kube-rbac-proxy-sar-config\") pod \"b8a2fddf-2b57-4512-bf66-0e002570b224\" (UID: 
\"b8a2fddf-2b57-4512-bf66-0e002570b224\") " Apr 23 14:03:00.634206 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.634058 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95xzz\" (UniqueName: \"kubernetes.io/projected/b8a2fddf-2b57-4512-bf66-0e002570b224-kube-api-access-95xzz\") pod \"b8a2fddf-2b57-4512-bf66-0e002570b224\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " Apr 23 14:03:00.634206 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.634196 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8a2fddf-2b57-4512-bf66-0e002570b224-proxy-tls\") pod \"b8a2fddf-2b57-4512-bf66-0e002570b224\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " Apr 23 14:03:00.634307 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.634235 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8a2fddf-2b57-4512-bf66-0e002570b224-kserve-provision-location\") pod \"b8a2fddf-2b57-4512-bf66-0e002570b224\" (UID: \"b8a2fddf-2b57-4512-bf66-0e002570b224\") " Apr 23 14:03:00.634362 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.634302 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8a2fddf-2b57-4512-bf66-0e002570b224-isvc-secondary-60b405-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-60b405-kube-rbac-proxy-sar-config") pod "b8a2fddf-2b57-4512-bf66-0e002570b224" (UID: "b8a2fddf-2b57-4512-bf66-0e002570b224"). InnerVolumeSpecName "isvc-secondary-60b405-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:03:00.634362 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.634307 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8a2fddf-2b57-4512-bf66-0e002570b224-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "b8a2fddf-2b57-4512-bf66-0e002570b224" (UID: "b8a2fddf-2b57-4512-bf66-0e002570b224"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:03:00.634471 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.634365 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-proxy-tls\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.634569 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.634543 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a2fddf-2b57-4512-bf66-0e002570b224-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b8a2fddf-2b57-4512-bf66-0e002570b224" (UID: "b8a2fddf-2b57-4512-bf66-0e002570b224"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:03:00.634706 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.634630 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x9k7\" (UniqueName: \"kubernetes.io/projected/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-kube-api-access-9x9k7\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.634706 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.634669 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-kserve-provision-location\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.634849 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.634704 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-cabundle-cert\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.634849 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.634830 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-a0e2f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-isvc-init-fail-a0e2f1-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " 
pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.634946 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.634902 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8a2fddf-2b57-4512-bf66-0e002570b224-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:03:00.634946 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.634920 2565 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b8a2fddf-2b57-4512-bf66-0e002570b224-cabundle-cert\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:03:00.634946 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.634936 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-60b405-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8a2fddf-2b57-4512-bf66-0e002570b224-isvc-secondary-60b405-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:03:00.636380 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.636354 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a2fddf-2b57-4512-bf66-0e002570b224-kube-api-access-95xzz" (OuterVolumeSpecName: "kube-api-access-95xzz") pod "b8a2fddf-2b57-4512-bf66-0e002570b224" (UID: "b8a2fddf-2b57-4512-bf66-0e002570b224"). InnerVolumeSpecName "kube-api-access-95xzz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:03:00.636380 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.636370 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a2fddf-2b57-4512-bf66-0e002570b224-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b8a2fddf-2b57-4512-bf66-0e002570b224" (UID: "b8a2fddf-2b57-4512-bf66-0e002570b224"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:03:00.735924 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.735877 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-a0e2f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-isvc-init-fail-a0e2f1-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.735924 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.735930 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-proxy-tls\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.736191 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.736052 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x9k7\" (UniqueName: \"kubernetes.io/projected/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-kube-api-access-9x9k7\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.736191 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.736124 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-kserve-provision-location\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.736191 
ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.736168 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-cabundle-cert\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.736319 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.736238 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8a2fddf-2b57-4512-bf66-0e002570b224-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:03:00.736319 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.736254 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-95xzz\" (UniqueName: \"kubernetes.io/projected/b8a2fddf-2b57-4512-bf66-0e002570b224-kube-api-access-95xzz\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:03:00.736555 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.736520 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-kserve-provision-location\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.736786 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.736744 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-a0e2f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-isvc-init-fail-a0e2f1-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " 
pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.736978 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.736957 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-cabundle-cert\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.738568 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.738550 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-proxy-tls\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.744849 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.744829 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x9k7\" (UniqueName: \"kubernetes.io/projected/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-kube-api-access-9x9k7\") pod \"isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.832955 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.832912 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:00.958827 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:00.958799 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx"] Apr 23 14:03:00.960735 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:03:00.960707 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde9395a1_f7d6_44c5_8967_fc5c16c11cc4.slice/crio-6eb078f1c6d845022a416eae2475ce9d769a7be642b92d9e5f09bae82937ab86 WatchSource:0}: Error finding container 6eb078f1c6d845022a416eae2475ce9d769a7be642b92d9e5f09bae82937ab86: Status 404 returned error can't find the container with id 6eb078f1c6d845022a416eae2475ce9d769a7be642b92d9e5f09bae82937ab86 Apr 23 14:03:01.168694 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:01.168602 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" event={"ID":"de9395a1-f7d6-44c5-8967-fc5c16c11cc4","Type":"ContainerStarted","Data":"be9ec2c5c375025cddfe3c1974e4fd5c316c50a77f82b16c4e707deecba1ca17"} Apr 23 14:03:01.168694 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:01.168644 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" event={"ID":"de9395a1-f7d6-44c5-8967-fc5c16c11cc4","Type":"ContainerStarted","Data":"6eb078f1c6d845022a416eae2475ce9d769a7be642b92d9e5f09bae82937ab86"} Apr 23 14:03:01.169753 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:01.169731 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq_b8a2fddf-2b57-4512-bf66-0e002570b224/storage-initializer/1.log" Apr 23 14:03:01.169898 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:01.169869 2565 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" event={"ID":"b8a2fddf-2b57-4512-bf66-0e002570b224","Type":"ContainerDied","Data":"878e3de60c848fcf175609660ee740ea68b7e8e10c3eec8536e99a51be4b97e1"} Apr 23 14:03:01.169974 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:01.169916 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq" Apr 23 14:03:01.169974 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:01.169925 2565 scope.go:117] "RemoveContainer" containerID="14d7badc940cd798619ae68531df02aedae20f203504c78465677cb58bc185b8" Apr 23 14:03:01.171977 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:01.171950 2565 generic.go:358] "Generic (PLEG): container finished" podID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerID="a90c4068c64ff0650b00241d1a856f5414f2e48156cb1eb152bc70fe1b6081d0" exitCode=2 Apr 23 14:03:01.172098 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:01.171988 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" event={"ID":"a1d8cf45-3a36-432f-a475-ea780ee1551b","Type":"ContainerDied","Data":"a90c4068c64ff0650b00241d1a856f5414f2e48156cb1eb152bc70fe1b6081d0"} Apr 23 14:03:01.230084 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:01.230050 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq"] Apr 23 14:03:01.234431 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:01.234401 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-60b405-predictor-685bf5b5fd-zsvqq"] Apr 23 14:03:01.844420 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:01.844378 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8a2fddf-2b57-4512-bf66-0e002570b224" path="/var/lib/kubelet/pods/b8a2fddf-2b57-4512-bf66-0e002570b224/volumes" 
Apr 23 14:03:01.869176 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:01.869134 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.35:8643/healthz\": dial tcp 10.132.0.35:8643: connect: connection refused" Apr 23 14:03:04.768360 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:04.768337 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:03:04.874459 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:04.874355 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1d8cf45-3a36-432f-a475-ea780ee1551b-proxy-tls\") pod \"a1d8cf45-3a36-432f-a475-ea780ee1551b\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " Apr 23 14:03:04.874459 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:04.874426 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1d8cf45-3a36-432f-a475-ea780ee1551b-kserve-provision-location\") pod \"a1d8cf45-3a36-432f-a475-ea780ee1551b\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " Apr 23 14:03:04.874663 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:04.874487 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-60b405-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a1d8cf45-3a36-432f-a475-ea780ee1551b-isvc-primary-60b405-kube-rbac-proxy-sar-config\") pod \"a1d8cf45-3a36-432f-a475-ea780ee1551b\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " Apr 23 14:03:04.874663 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:04.874522 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-wbkzx\" (UniqueName: \"kubernetes.io/projected/a1d8cf45-3a36-432f-a475-ea780ee1551b-kube-api-access-wbkzx\") pod \"a1d8cf45-3a36-432f-a475-ea780ee1551b\" (UID: \"a1d8cf45-3a36-432f-a475-ea780ee1551b\") " Apr 23 14:03:04.874853 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:04.874822 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1d8cf45-3a36-432f-a475-ea780ee1551b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a1d8cf45-3a36-432f-a475-ea780ee1551b" (UID: "a1d8cf45-3a36-432f-a475-ea780ee1551b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:03:04.874971 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:04.874847 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1d8cf45-3a36-432f-a475-ea780ee1551b-isvc-primary-60b405-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-60b405-kube-rbac-proxy-sar-config") pod "a1d8cf45-3a36-432f-a475-ea780ee1551b" (UID: "a1d8cf45-3a36-432f-a475-ea780ee1551b"). InnerVolumeSpecName "isvc-primary-60b405-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:03:04.876519 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:04.876492 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d8cf45-3a36-432f-a475-ea780ee1551b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a1d8cf45-3a36-432f-a475-ea780ee1551b" (UID: "a1d8cf45-3a36-432f-a475-ea780ee1551b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:03:04.876607 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:04.876569 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1d8cf45-3a36-432f-a475-ea780ee1551b-kube-api-access-wbkzx" (OuterVolumeSpecName: "kube-api-access-wbkzx") pod "a1d8cf45-3a36-432f-a475-ea780ee1551b" (UID: "a1d8cf45-3a36-432f-a475-ea780ee1551b"). InnerVolumeSpecName "kube-api-access-wbkzx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:03:04.975598 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:04.975562 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1d8cf45-3a36-432f-a475-ea780ee1551b-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:03:04.975598 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:04.975595 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1d8cf45-3a36-432f-a475-ea780ee1551b-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:03:04.975828 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:04.975610 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-60b405-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a1d8cf45-3a36-432f-a475-ea780ee1551b-isvc-primary-60b405-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:03:04.975828 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:04.975626 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wbkzx\" (UniqueName: \"kubernetes.io/projected/a1d8cf45-3a36-432f-a475-ea780ee1551b-kube-api-access-wbkzx\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:03:05.186574 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.186541 2565 generic.go:358] "Generic (PLEG): container 
finished" podID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerID="3b5b7da47715ab263114fb6662b23dc2e2695a699fede92b7ca00d685ae95ecf" exitCode=0 Apr 23 14:03:05.186778 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.186602 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" event={"ID":"a1d8cf45-3a36-432f-a475-ea780ee1551b","Type":"ContainerDied","Data":"3b5b7da47715ab263114fb6662b23dc2e2695a699fede92b7ca00d685ae95ecf"} Apr 23 14:03:05.186778 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.186633 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" event={"ID":"a1d8cf45-3a36-432f-a475-ea780ee1551b","Type":"ContainerDied","Data":"d7e38ffd7a65c6abdf82ab09c4b9deaef1654e6f6d2cea24e9c3ff5bfe4d7f0d"} Apr 23 14:03:05.186778 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.186643 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp" Apr 23 14:03:05.186778 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.186648 2565 scope.go:117] "RemoveContainer" containerID="a90c4068c64ff0650b00241d1a856f5414f2e48156cb1eb152bc70fe1b6081d0" Apr 23 14:03:05.195170 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.195153 2565 scope.go:117] "RemoveContainer" containerID="3b5b7da47715ab263114fb6662b23dc2e2695a699fede92b7ca00d685ae95ecf" Apr 23 14:03:05.202288 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.202269 2565 scope.go:117] "RemoveContainer" containerID="ca0e61bf176148a41198ac079a787c1bcb952e22a6e6e35b7965c661ebce26ac" Apr 23 14:03:05.209258 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.209238 2565 scope.go:117] "RemoveContainer" containerID="a90c4068c64ff0650b00241d1a856f5414f2e48156cb1eb152bc70fe1b6081d0" Apr 23 14:03:05.209488 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:03:05.209461 2565 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a90c4068c64ff0650b00241d1a856f5414f2e48156cb1eb152bc70fe1b6081d0\": container with ID starting with a90c4068c64ff0650b00241d1a856f5414f2e48156cb1eb152bc70fe1b6081d0 not found: ID does not exist" containerID="a90c4068c64ff0650b00241d1a856f5414f2e48156cb1eb152bc70fe1b6081d0" Apr 23 14:03:05.209546 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.209495 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a90c4068c64ff0650b00241d1a856f5414f2e48156cb1eb152bc70fe1b6081d0"} err="failed to get container status \"a90c4068c64ff0650b00241d1a856f5414f2e48156cb1eb152bc70fe1b6081d0\": rpc error: code = NotFound desc = could not find container \"a90c4068c64ff0650b00241d1a856f5414f2e48156cb1eb152bc70fe1b6081d0\": container with ID starting with a90c4068c64ff0650b00241d1a856f5414f2e48156cb1eb152bc70fe1b6081d0 not found: ID does not exist" Apr 23 14:03:05.209546 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.209528 2565 scope.go:117] "RemoveContainer" containerID="3b5b7da47715ab263114fb6662b23dc2e2695a699fede92b7ca00d685ae95ecf" Apr 23 14:03:05.209788 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:03:05.209753 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5b7da47715ab263114fb6662b23dc2e2695a699fede92b7ca00d685ae95ecf\": container with ID starting with 3b5b7da47715ab263114fb6662b23dc2e2695a699fede92b7ca00d685ae95ecf not found: ID does not exist" containerID="3b5b7da47715ab263114fb6662b23dc2e2695a699fede92b7ca00d685ae95ecf" Apr 23 14:03:05.209836 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.209796 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5b7da47715ab263114fb6662b23dc2e2695a699fede92b7ca00d685ae95ecf"} err="failed to get container status 
\"3b5b7da47715ab263114fb6662b23dc2e2695a699fede92b7ca00d685ae95ecf\": rpc error: code = NotFound desc = could not find container \"3b5b7da47715ab263114fb6662b23dc2e2695a699fede92b7ca00d685ae95ecf\": container with ID starting with 3b5b7da47715ab263114fb6662b23dc2e2695a699fede92b7ca00d685ae95ecf not found: ID does not exist" Apr 23 14:03:05.209836 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.209813 2565 scope.go:117] "RemoveContainer" containerID="ca0e61bf176148a41198ac079a787c1bcb952e22a6e6e35b7965c661ebce26ac" Apr 23 14:03:05.210048 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:03:05.210031 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca0e61bf176148a41198ac079a787c1bcb952e22a6e6e35b7965c661ebce26ac\": container with ID starting with ca0e61bf176148a41198ac079a787c1bcb952e22a6e6e35b7965c661ebce26ac not found: ID does not exist" containerID="ca0e61bf176148a41198ac079a787c1bcb952e22a6e6e35b7965c661ebce26ac" Apr 23 14:03:05.210094 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.210053 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca0e61bf176148a41198ac079a787c1bcb952e22a6e6e35b7965c661ebce26ac"} err="failed to get container status \"ca0e61bf176148a41198ac079a787c1bcb952e22a6e6e35b7965c661ebce26ac\": rpc error: code = NotFound desc = could not find container \"ca0e61bf176148a41198ac079a787c1bcb952e22a6e6e35b7965c661ebce26ac\": container with ID starting with ca0e61bf176148a41198ac079a787c1bcb952e22a6e6e35b7965c661ebce26ac not found: ID does not exist" Apr 23 14:03:05.215604 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.215578 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp"] Apr 23 14:03:05.217240 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.217223 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-primary-60b405-predictor-b697f8fbf-pgxzp"] Apr 23 14:03:05.841776 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:05.841733 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" path="/var/lib/kubelet/pods/a1d8cf45-3a36-432f-a475-ea780ee1551b/volumes" Apr 23 14:03:08.199170 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:08.199141 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx_de9395a1-f7d6-44c5-8967-fc5c16c11cc4/storage-initializer/0.log" Apr 23 14:03:08.199822 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:08.199182 2565 generic.go:358] "Generic (PLEG): container finished" podID="de9395a1-f7d6-44c5-8967-fc5c16c11cc4" containerID="be9ec2c5c375025cddfe3c1974e4fd5c316c50a77f82b16c4e707deecba1ca17" exitCode=1 Apr 23 14:03:08.199822 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:08.199263 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" event={"ID":"de9395a1-f7d6-44c5-8967-fc5c16c11cc4","Type":"ContainerDied","Data":"be9ec2c5c375025cddfe3c1974e4fd5c316c50a77f82b16c4e707deecba1ca17"} Apr 23 14:03:09.204580 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:09.204553 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx_de9395a1-f7d6-44c5-8967-fc5c16c11cc4/storage-initializer/0.log" Apr 23 14:03:09.204973 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:09.204632 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" event={"ID":"de9395a1-f7d6-44c5-8967-fc5c16c11cc4","Type":"ContainerStarted","Data":"3b2bc2bf50af5c056618607dc6cf2e6a42cfd98c0ed8c85688aa66c57878aa67"} Apr 23 14:03:10.540180 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:10.540148 2565 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx"] Apr 23 14:03:10.540555 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:10.540463 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" podUID="de9395a1-f7d6-44c5-8967-fc5c16c11cc4" containerName="storage-initializer" containerID="cri-o://3b2bc2bf50af5c056618607dc6cf2e6a42cfd98c0ed8c85688aa66c57878aa67" gracePeriod=30 Apr 23 14:03:12.586561 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.586538 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx_de9395a1-f7d6-44c5-8967-fc5c16c11cc4/storage-initializer/1.log" Apr 23 14:03:12.587010 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.586994 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx_de9395a1-f7d6-44c5-8967-fc5c16c11cc4/storage-initializer/0.log" Apr 23 14:03:12.587073 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.587062 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:12.638853 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.638817 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-proxy-tls\") pod \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " Apr 23 14:03:12.638853 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.638854 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-kserve-provision-location\") pod \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " Apr 23 14:03:12.639097 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.638875 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x9k7\" (UniqueName: \"kubernetes.io/projected/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-kube-api-access-9x9k7\") pod \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " Apr 23 14:03:12.639097 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.638909 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-a0e2f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-isvc-init-fail-a0e2f1-kube-rbac-proxy-sar-config\") pod \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " Apr 23 14:03:12.639097 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.638928 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-cabundle-cert\") pod 
\"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\" (UID: \"de9395a1-f7d6-44c5-8967-fc5c16c11cc4\") " Apr 23 14:03:12.639247 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.639198 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "de9395a1-f7d6-44c5-8967-fc5c16c11cc4" (UID: "de9395a1-f7d6-44c5-8967-fc5c16c11cc4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:03:12.639361 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.639332 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-isvc-init-fail-a0e2f1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-a0e2f1-kube-rbac-proxy-sar-config") pod "de9395a1-f7d6-44c5-8967-fc5c16c11cc4" (UID: "de9395a1-f7d6-44c5-8967-fc5c16c11cc4"). InnerVolumeSpecName "isvc-init-fail-a0e2f1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:03:12.639361 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.639350 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "de9395a1-f7d6-44c5-8967-fc5c16c11cc4" (UID: "de9395a1-f7d6-44c5-8967-fc5c16c11cc4"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:03:12.641091 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.641069 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "de9395a1-f7d6-44c5-8967-fc5c16c11cc4" (UID: "de9395a1-f7d6-44c5-8967-fc5c16c11cc4"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:03:12.641189 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.641130 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-kube-api-access-9x9k7" (OuterVolumeSpecName: "kube-api-access-9x9k7") pod "de9395a1-f7d6-44c5-8967-fc5c16c11cc4" (UID: "de9395a1-f7d6-44c5-8967-fc5c16c11cc4"). InnerVolumeSpecName "kube-api-access-9x9k7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:03:12.739801 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.739693 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:03:12.739801 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.739723 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:03:12.739801 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.739735 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9x9k7\" (UniqueName: \"kubernetes.io/projected/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-kube-api-access-9x9k7\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:03:12.739801 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.739745 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-a0e2f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-isvc-init-fail-a0e2f1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:03:12.739801 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:12.739772 2565 reconciler_common.go:299] "Volume 
detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/de9395a1-f7d6-44c5-8967-fc5c16c11cc4-cabundle-cert\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:03:13.220253 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.220226 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx_de9395a1-f7d6-44c5-8967-fc5c16c11cc4/storage-initializer/1.log" Apr 23 14:03:13.220578 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.220563 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx_de9395a1-f7d6-44c5-8967-fc5c16c11cc4/storage-initializer/0.log" Apr 23 14:03:13.220631 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.220599 2565 generic.go:358] "Generic (PLEG): container finished" podID="de9395a1-f7d6-44c5-8967-fc5c16c11cc4" containerID="3b2bc2bf50af5c056618607dc6cf2e6a42cfd98c0ed8c85688aa66c57878aa67" exitCode=1 Apr 23 14:03:13.220680 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.220666 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" event={"ID":"de9395a1-f7d6-44c5-8967-fc5c16c11cc4","Type":"ContainerDied","Data":"3b2bc2bf50af5c056618607dc6cf2e6a42cfd98c0ed8c85688aa66c57878aa67"} Apr 23 14:03:13.220716 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.220682 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" Apr 23 14:03:13.220716 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.220696 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx" event={"ID":"de9395a1-f7d6-44c5-8967-fc5c16c11cc4","Type":"ContainerDied","Data":"6eb078f1c6d845022a416eae2475ce9d769a7be642b92d9e5f09bae82937ab86"} Apr 23 14:03:13.220716 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.220711 2565 scope.go:117] "RemoveContainer" containerID="3b2bc2bf50af5c056618607dc6cf2e6a42cfd98c0ed8c85688aa66c57878aa67" Apr 23 14:03:13.229604 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.229581 2565 scope.go:117] "RemoveContainer" containerID="be9ec2c5c375025cddfe3c1974e4fd5c316c50a77f82b16c4e707deecba1ca17" Apr 23 14:03:13.236528 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.236509 2565 scope.go:117] "RemoveContainer" containerID="3b2bc2bf50af5c056618607dc6cf2e6a42cfd98c0ed8c85688aa66c57878aa67" Apr 23 14:03:13.236795 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:03:13.236773 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2bc2bf50af5c056618607dc6cf2e6a42cfd98c0ed8c85688aa66c57878aa67\": container with ID starting with 3b2bc2bf50af5c056618607dc6cf2e6a42cfd98c0ed8c85688aa66c57878aa67 not found: ID does not exist" containerID="3b2bc2bf50af5c056618607dc6cf2e6a42cfd98c0ed8c85688aa66c57878aa67" Apr 23 14:03:13.236840 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.236804 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2bc2bf50af5c056618607dc6cf2e6a42cfd98c0ed8c85688aa66c57878aa67"} err="failed to get container status \"3b2bc2bf50af5c056618607dc6cf2e6a42cfd98c0ed8c85688aa66c57878aa67\": rpc error: code = NotFound desc = could not find container 
\"3b2bc2bf50af5c056618607dc6cf2e6a42cfd98c0ed8c85688aa66c57878aa67\": container with ID starting with 3b2bc2bf50af5c056618607dc6cf2e6a42cfd98c0ed8c85688aa66c57878aa67 not found: ID does not exist" Apr 23 14:03:13.236840 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.236823 2565 scope.go:117] "RemoveContainer" containerID="be9ec2c5c375025cddfe3c1974e4fd5c316c50a77f82b16c4e707deecba1ca17" Apr 23 14:03:13.237056 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:03:13.237037 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be9ec2c5c375025cddfe3c1974e4fd5c316c50a77f82b16c4e707deecba1ca17\": container with ID starting with be9ec2c5c375025cddfe3c1974e4fd5c316c50a77f82b16c4e707deecba1ca17 not found: ID does not exist" containerID="be9ec2c5c375025cddfe3c1974e4fd5c316c50a77f82b16c4e707deecba1ca17" Apr 23 14:03:13.237094 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.237063 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9ec2c5c375025cddfe3c1974e4fd5c316c50a77f82b16c4e707deecba1ca17"} err="failed to get container status \"be9ec2c5c375025cddfe3c1974e4fd5c316c50a77f82b16c4e707deecba1ca17\": rpc error: code = NotFound desc = could not find container \"be9ec2c5c375025cddfe3c1974e4fd5c316c50a77f82b16c4e707deecba1ca17\": container with ID starting with be9ec2c5c375025cddfe3c1974e4fd5c316c50a77f82b16c4e707deecba1ca17 not found: ID does not exist" Apr 23 14:03:13.260320 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.260293 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx"] Apr 23 14:03:13.265048 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.265016 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a0e2f1-predictor-5559d5df4d-4h9sx"] Apr 23 14:03:13.840304 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:03:13.840274 
2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de9395a1-f7d6-44c5-8967-fc5c16c11cc4" path="/var/lib/kubelet/pods/de9395a1-f7d6-44c5-8967-fc5c16c11cc4/volumes" Apr 23 14:06:29.869780 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:06:29.869737 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:06:29.873830 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:06:29.873804 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:11:29.889964 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:11:29.889933 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:11:29.903183 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:11:29.903158 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:12:34.891547 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.891509 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z"] Apr 23 14:12:34.891992 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.891900 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8a2fddf-2b57-4512-bf66-0e002570b224" containerName="storage-initializer" Apr 23 14:12:34.891992 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.891912 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a2fddf-2b57-4512-bf66-0e002570b224" containerName="storage-initializer" Apr 23 14:12:34.891992 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.891926 2565 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de9395a1-f7d6-44c5-8967-fc5c16c11cc4" containerName="storage-initializer" Apr 23 14:12:34.891992 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.891931 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="de9395a1-f7d6-44c5-8967-fc5c16c11cc4" containerName="storage-initializer" Apr 23 14:12:34.891992 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.891938 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de9395a1-f7d6-44c5-8967-fc5c16c11cc4" containerName="storage-initializer" Apr 23 14:12:34.891992 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.891943 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="de9395a1-f7d6-44c5-8967-fc5c16c11cc4" containerName="storage-initializer" Apr 23 14:12:34.891992 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.891952 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kserve-container" Apr 23 14:12:34.891992 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.891962 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kserve-container" Apr 23 14:12:34.891992 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.891973 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kube-rbac-proxy" Apr 23 14:12:34.891992 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.891979 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kube-rbac-proxy" Apr 23 14:12:34.891992 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.891987 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="storage-initializer" Apr 23 14:12:34.891992 
ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.891992 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="storage-initializer" Apr 23 14:12:34.892389 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.892043 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8a2fddf-2b57-4512-bf66-0e002570b224" containerName="storage-initializer" Apr 23 14:12:34.892389 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.892053 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="de9395a1-f7d6-44c5-8967-fc5c16c11cc4" containerName="storage-initializer" Apr 23 14:12:34.892389 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.892059 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kube-rbac-proxy" Apr 23 14:12:34.892389 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.892067 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1d8cf45-3a36-432f-a475-ea780ee1551b" containerName="kserve-container" Apr 23 14:12:34.892389 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.892073 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8a2fddf-2b57-4512-bf66-0e002570b224" containerName="storage-initializer" Apr 23 14:12:34.892389 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.892137 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8a2fddf-2b57-4512-bf66-0e002570b224" containerName="storage-initializer" Apr 23 14:12:34.892389 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.892144 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a2fddf-2b57-4512-bf66-0e002570b224" containerName="storage-initializer" Apr 23 14:12:34.892389 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.892198 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="de9395a1-f7d6-44c5-8967-fc5c16c11cc4" containerName="storage-initializer" 
Apr 23 14:12:34.895181 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.895163 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:34.898121 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.898096 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\"" Apr 23 14:12:34.898236 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.898139 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 14:12:34.898236 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.898162 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 14:12:34.899458 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.899441 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\"" Apr 23 14:12:34.899528 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.899512 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 14:12:34.906369 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.906345 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z"] Apr 23 14:12:34.967169 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.967134 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6h77\" (UniqueName: \"kubernetes.io/projected/55e61ed0-3e7c-4154-99e6-e436c42b916d-kube-api-access-l6h77\") pod \"isvc-sklearn-predictor-759d546688-cwt2z\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 
14:12:34.967384 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.967177 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55e61ed0-3e7c-4154-99e6-e436c42b916d-proxy-tls\") pod \"isvc-sklearn-predictor-759d546688-cwt2z\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:34.967384 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.967202 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55e61ed0-3e7c-4154-99e6-e436c42b916d-kserve-provision-location\") pod \"isvc-sklearn-predictor-759d546688-cwt2z\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:34.967384 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:34.967261 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/55e61ed0-3e7c-4154-99e6-e436c42b916d-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-759d546688-cwt2z\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:35.068257 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:35.068222 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6h77\" (UniqueName: \"kubernetes.io/projected/55e61ed0-3e7c-4154-99e6-e436c42b916d-kube-api-access-l6h77\") pod \"isvc-sklearn-predictor-759d546688-cwt2z\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:35.068458 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:35.068283 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55e61ed0-3e7c-4154-99e6-e436c42b916d-proxy-tls\") pod \"isvc-sklearn-predictor-759d546688-cwt2z\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:35.068458 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:35.068317 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55e61ed0-3e7c-4154-99e6-e436c42b916d-kserve-provision-location\") pod \"isvc-sklearn-predictor-759d546688-cwt2z\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:35.068458 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:35.068341 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/55e61ed0-3e7c-4154-99e6-e436c42b916d-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-759d546688-cwt2z\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:35.068772 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:35.068734 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55e61ed0-3e7c-4154-99e6-e436c42b916d-kserve-provision-location\") pod \"isvc-sklearn-predictor-759d546688-cwt2z\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:35.069089 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:35.069055 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/55e61ed0-3e7c-4154-99e6-e436c42b916d-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-759d546688-cwt2z\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:35.070678 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:35.070657 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55e61ed0-3e7c-4154-99e6-e436c42b916d-proxy-tls\") pod \"isvc-sklearn-predictor-759d546688-cwt2z\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:35.076812 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:35.076790 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6h77\" (UniqueName: \"kubernetes.io/projected/55e61ed0-3e7c-4154-99e6-e436c42b916d-kube-api-access-l6h77\") pod \"isvc-sklearn-predictor-759d546688-cwt2z\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:35.205922 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:35.205889 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:35.326497 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:35.326453 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z"] Apr 23 14:12:35.328701 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:12:35.328673 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55e61ed0_3e7c_4154_99e6_e436c42b916d.slice/crio-fccdbd96e09b0732c783aa7a9981507179a2d227233931084ba0571f90629179 WatchSource:0}: Error finding container fccdbd96e09b0732c783aa7a9981507179a2d227233931084ba0571f90629179: Status 404 returned error can't find the container with id fccdbd96e09b0732c783aa7a9981507179a2d227233931084ba0571f90629179 Apr 23 14:12:35.330525 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:35.330506 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 14:12:36.018503 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:36.018468 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" event={"ID":"55e61ed0-3e7c-4154-99e6-e436c42b916d","Type":"ContainerStarted","Data":"3a8dc9ddf97f318e690ec76745ec1517f162c2c4a3e0f2d94bdd142d44eabd7f"} Apr 23 14:12:36.018503 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:36.018508 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" event={"ID":"55e61ed0-3e7c-4154-99e6-e436c42b916d","Type":"ContainerStarted","Data":"fccdbd96e09b0732c783aa7a9981507179a2d227233931084ba0571f90629179"} Apr 23 14:12:39.029322 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:39.029292 2565 generic.go:358] "Generic (PLEG): container finished" podID="55e61ed0-3e7c-4154-99e6-e436c42b916d" 
containerID="3a8dc9ddf97f318e690ec76745ec1517f162c2c4a3e0f2d94bdd142d44eabd7f" exitCode=0 Apr 23 14:12:39.029631 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:39.029361 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" event={"ID":"55e61ed0-3e7c-4154-99e6-e436c42b916d","Type":"ContainerDied","Data":"3a8dc9ddf97f318e690ec76745ec1517f162c2c4a3e0f2d94bdd142d44eabd7f"} Apr 23 14:12:40.035112 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:40.035074 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" event={"ID":"55e61ed0-3e7c-4154-99e6-e436c42b916d","Type":"ContainerStarted","Data":"fa86c8eb38b1112bb0805b79f40af3664006c6b8127a434ccf2d8c89b2c5d7a4"} Apr 23 14:12:40.035112 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:40.035114 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" event={"ID":"55e61ed0-3e7c-4154-99e6-e436c42b916d","Type":"ContainerStarted","Data":"f28700c2a54706c0c1bc09747a83c5eada1f322f9e2193fea17b47c833a587f8"} Apr 23 14:12:40.035678 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:40.035303 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:40.066191 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:40.066141 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" podStartSLOduration=6.06612655 podStartE2EDuration="6.06612655s" podCreationTimestamp="2026-04-23 14:12:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:12:40.064274844 +0000 UTC m=+2470.802818810" watchObservedRunningTime="2026-04-23 14:12:40.06612655 +0000 UTC m=+2470.804670516" Apr 23 
14:12:41.038238 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:41.038200 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:41.039672 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:41.039643 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 14:12:42.041309 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:42.041271 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 14:12:47.046435 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:47.046403 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:12:47.046957 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:47.046932 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 14:12:57.047317 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:12:57.047276 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 14:13:07.047131 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:07.047085 
2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 14:13:17.047452 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:17.047405 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 14:13:27.047886 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:27.047799 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 14:13:37.047882 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:37.047844 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 14:13:47.047647 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:47.047618 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:13:55.010284 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:55.010240 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z"] Apr 23 14:13:55.010801 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:55.010660 2565 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kserve-container" containerID="cri-o://f28700c2a54706c0c1bc09747a83c5eada1f322f9e2193fea17b47c833a587f8" gracePeriod=30 Apr 23 14:13:55.011121 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:55.010994 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kube-rbac-proxy" containerID="cri-o://fa86c8eb38b1112bb0805b79f40af3664006c6b8127a434ccf2d8c89b2c5d7a4" gracePeriod=30 Apr 23 14:13:55.286073 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:55.285985 2565 generic.go:358] "Generic (PLEG): container finished" podID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerID="fa86c8eb38b1112bb0805b79f40af3664006c6b8127a434ccf2d8c89b2c5d7a4" exitCode=2 Apr 23 14:13:55.286073 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:55.286060 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" event={"ID":"55e61ed0-3e7c-4154-99e6-e436c42b916d","Type":"ContainerDied","Data":"fa86c8eb38b1112bb0805b79f40af3664006c6b8127a434ccf2d8c89b2c5d7a4"} Apr 23 14:13:57.042343 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:57.042302 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 23 14:13:57.047727 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:57.047701 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 14:13:59.259187 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.259159 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:13:59.300786 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.300689 2565 generic.go:358] "Generic (PLEG): container finished" podID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerID="f28700c2a54706c0c1bc09747a83c5eada1f322f9e2193fea17b47c833a587f8" exitCode=0 Apr 23 14:13:59.300944 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.300784 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" Apr 23 14:13:59.300944 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.300784 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" event={"ID":"55e61ed0-3e7c-4154-99e6-e436c42b916d","Type":"ContainerDied","Data":"f28700c2a54706c0c1bc09747a83c5eada1f322f9e2193fea17b47c833a587f8"} Apr 23 14:13:59.300944 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.300893 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z" event={"ID":"55e61ed0-3e7c-4154-99e6-e436c42b916d","Type":"ContainerDied","Data":"fccdbd96e09b0732c783aa7a9981507179a2d227233931084ba0571f90629179"} Apr 23 14:13:59.300944 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.300909 2565 scope.go:117] "RemoveContainer" containerID="fa86c8eb38b1112bb0805b79f40af3664006c6b8127a434ccf2d8c89b2c5d7a4" Apr 23 14:13:59.308851 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.308792 2565 scope.go:117] "RemoveContainer" containerID="f28700c2a54706c0c1bc09747a83c5eada1f322f9e2193fea17b47c833a587f8" Apr 23 14:13:59.315682 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.315667 2565 scope.go:117] 
"RemoveContainer" containerID="3a8dc9ddf97f318e690ec76745ec1517f162c2c4a3e0f2d94bdd142d44eabd7f" Apr 23 14:13:59.322308 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.322293 2565 scope.go:117] "RemoveContainer" containerID="fa86c8eb38b1112bb0805b79f40af3664006c6b8127a434ccf2d8c89b2c5d7a4" Apr 23 14:13:59.322540 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:13:59.322523 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa86c8eb38b1112bb0805b79f40af3664006c6b8127a434ccf2d8c89b2c5d7a4\": container with ID starting with fa86c8eb38b1112bb0805b79f40af3664006c6b8127a434ccf2d8c89b2c5d7a4 not found: ID does not exist" containerID="fa86c8eb38b1112bb0805b79f40af3664006c6b8127a434ccf2d8c89b2c5d7a4" Apr 23 14:13:59.322596 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.322546 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa86c8eb38b1112bb0805b79f40af3664006c6b8127a434ccf2d8c89b2c5d7a4"} err="failed to get container status \"fa86c8eb38b1112bb0805b79f40af3664006c6b8127a434ccf2d8c89b2c5d7a4\": rpc error: code = NotFound desc = could not find container \"fa86c8eb38b1112bb0805b79f40af3664006c6b8127a434ccf2d8c89b2c5d7a4\": container with ID starting with fa86c8eb38b1112bb0805b79f40af3664006c6b8127a434ccf2d8c89b2c5d7a4 not found: ID does not exist" Apr 23 14:13:59.322596 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.322563 2565 scope.go:117] "RemoveContainer" containerID="f28700c2a54706c0c1bc09747a83c5eada1f322f9e2193fea17b47c833a587f8" Apr 23 14:13:59.322840 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:13:59.322754 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f28700c2a54706c0c1bc09747a83c5eada1f322f9e2193fea17b47c833a587f8\": container with ID starting with f28700c2a54706c0c1bc09747a83c5eada1f322f9e2193fea17b47c833a587f8 not found: ID does not exist" 
containerID="f28700c2a54706c0c1bc09747a83c5eada1f322f9e2193fea17b47c833a587f8" Apr 23 14:13:59.322840 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.322797 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f28700c2a54706c0c1bc09747a83c5eada1f322f9e2193fea17b47c833a587f8"} err="failed to get container status \"f28700c2a54706c0c1bc09747a83c5eada1f322f9e2193fea17b47c833a587f8\": rpc error: code = NotFound desc = could not find container \"f28700c2a54706c0c1bc09747a83c5eada1f322f9e2193fea17b47c833a587f8\": container with ID starting with f28700c2a54706c0c1bc09747a83c5eada1f322f9e2193fea17b47c833a587f8 not found: ID does not exist" Apr 23 14:13:59.322840 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.322809 2565 scope.go:117] "RemoveContainer" containerID="3a8dc9ddf97f318e690ec76745ec1517f162c2c4a3e0f2d94bdd142d44eabd7f" Apr 23 14:13:59.323033 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:13:59.323014 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a8dc9ddf97f318e690ec76745ec1517f162c2c4a3e0f2d94bdd142d44eabd7f\": container with ID starting with 3a8dc9ddf97f318e690ec76745ec1517f162c2c4a3e0f2d94bdd142d44eabd7f not found: ID does not exist" containerID="3a8dc9ddf97f318e690ec76745ec1517f162c2c4a3e0f2d94bdd142d44eabd7f" Apr 23 14:13:59.323071 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.323038 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8dc9ddf97f318e690ec76745ec1517f162c2c4a3e0f2d94bdd142d44eabd7f"} err="failed to get container status \"3a8dc9ddf97f318e690ec76745ec1517f162c2c4a3e0f2d94bdd142d44eabd7f\": rpc error: code = NotFound desc = could not find container \"3a8dc9ddf97f318e690ec76745ec1517f162c2c4a3e0f2d94bdd142d44eabd7f\": container with ID starting with 3a8dc9ddf97f318e690ec76745ec1517f162c2c4a3e0f2d94bdd142d44eabd7f not found: ID does not exist" Apr 23 
14:13:59.381975 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.381952 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55e61ed0-3e7c-4154-99e6-e436c42b916d-proxy-tls\") pod \"55e61ed0-3e7c-4154-99e6-e436c42b916d\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") "
Apr 23 14:13:59.382090 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.381997 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6h77\" (UniqueName: \"kubernetes.io/projected/55e61ed0-3e7c-4154-99e6-e436c42b916d-kube-api-access-l6h77\") pod \"55e61ed0-3e7c-4154-99e6-e436c42b916d\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") "
Apr 23 14:13:59.382090 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.382019 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55e61ed0-3e7c-4154-99e6-e436c42b916d-kserve-provision-location\") pod \"55e61ed0-3e7c-4154-99e6-e436c42b916d\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") "
Apr 23 14:13:59.382090 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.382069 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/55e61ed0-3e7c-4154-99e6-e436c42b916d-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"55e61ed0-3e7c-4154-99e6-e436c42b916d\" (UID: \"55e61ed0-3e7c-4154-99e6-e436c42b916d\") "
Apr 23 14:13:59.382357 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.382328 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55e61ed0-3e7c-4154-99e6-e436c42b916d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "55e61ed0-3e7c-4154-99e6-e436c42b916d" (UID: "55e61ed0-3e7c-4154-99e6-e436c42b916d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 14:13:59.382529 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.382507 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e61ed0-3e7c-4154-99e6-e436c42b916d-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "55e61ed0-3e7c-4154-99e6-e436c42b916d" (UID: "55e61ed0-3e7c-4154-99e6-e436c42b916d"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 14:13:59.384074 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.384055 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e61ed0-3e7c-4154-99e6-e436c42b916d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "55e61ed0-3e7c-4154-99e6-e436c42b916d" (UID: "55e61ed0-3e7c-4154-99e6-e436c42b916d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 14:13:59.384186 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.384166 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e61ed0-3e7c-4154-99e6-e436c42b916d-kube-api-access-l6h77" (OuterVolumeSpecName: "kube-api-access-l6h77") pod "55e61ed0-3e7c-4154-99e6-e436c42b916d" (UID: "55e61ed0-3e7c-4154-99e6-e436c42b916d"). InnerVolumeSpecName "kube-api-access-l6h77". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 14:13:59.482793 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.482744 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l6h77\" (UniqueName: \"kubernetes.io/projected/55e61ed0-3e7c-4154-99e6-e436c42b916d-kube-api-access-l6h77\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:13:59.482793 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.482790 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55e61ed0-3e7c-4154-99e6-e436c42b916d-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:13:59.482793 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.482802 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/55e61ed0-3e7c-4154-99e6-e436c42b916d-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:13:59.483039 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.482812 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55e61ed0-3e7c-4154-99e6-e436c42b916d-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:13:59.629301 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.629267 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z"]
Apr 23 14:13:59.632769 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.632721 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-759d546688-cwt2z"]
Apr 23 14:13:59.839868 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:13:59.839836 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" path="/var/lib/kubelet/pods/55e61ed0-3e7c-4154-99e6-e436c42b916d/volumes"
Apr 23 14:14:55.243812 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.243713 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"]
Apr 23 14:14:55.244209 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.244075 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kserve-container"
Apr 23 14:14:55.244209 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.244087 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kserve-container"
Apr 23 14:14:55.244209 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.244101 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kube-rbac-proxy"
Apr 23 14:14:55.244209 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.244110 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kube-rbac-proxy"
Apr 23 14:14:55.244209 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.244124 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="storage-initializer"
Apr 23 14:14:55.244209 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.244131 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="storage-initializer"
Apr 23 14:14:55.244209 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.244180 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kserve-container"
Apr 23 14:14:55.244209 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.244194 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="55e61ed0-3e7c-4154-99e6-e436c42b916d" containerName="kube-rbac-proxy"
Apr 23 14:14:55.247383 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.247363 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:14:55.250207 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.250185 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\""
Apr 23 14:14:55.250327 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.250189 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\""
Apr 23 14:14:55.250327 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.250187 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 23 14:14:55.250638 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.250609 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 23 14:14:55.250836 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.250818 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\""
Apr 23 14:14:55.259522 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.259498 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"]
Apr 23 14:14:55.316545 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.316513 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6489\" (UniqueName: \"kubernetes.io/projected/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-kube-api-access-m6489\") pod \"isvc-sklearn-runtime-predictor-65764ccccd-rl7nc\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:14:55.316751 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.316561 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65764ccccd-rl7nc\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:14:55.316751 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.316704 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65764ccccd-rl7nc\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:14:55.316751 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.316753 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65764ccccd-rl7nc\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:14:55.417852 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.417808 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65764ccccd-rl7nc\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:14:55.418042 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.417868 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65764ccccd-rl7nc\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:14:55.418042 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.417902 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6489\" (UniqueName: \"kubernetes.io/projected/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-kube-api-access-m6489\") pod \"isvc-sklearn-runtime-predictor-65764ccccd-rl7nc\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:14:55.418042 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.417940 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65764ccccd-rl7nc\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:14:55.418367 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.418345 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65764ccccd-rl7nc\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:14:55.418552 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.418533 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65764ccccd-rl7nc\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:14:55.420278 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.420260 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65764ccccd-rl7nc\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:14:55.429042 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.429018 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6489\" (UniqueName: \"kubernetes.io/projected/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-kube-api-access-m6489\") pod \"isvc-sklearn-runtime-predictor-65764ccccd-rl7nc\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:14:55.557954 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.557851 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:14:55.685880 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:55.685854 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"]
Apr 23 14:14:55.688363 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:14:55.688320 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4457cc9_5847_41ce_8b63_7a0f26ca3db5.slice/crio-02377bfcc40e309141ca7d0c02ad1ec463750e452a187afb911403b474e44ec3 WatchSource:0}: Error finding container 02377bfcc40e309141ca7d0c02ad1ec463750e452a187afb911403b474e44ec3: Status 404 returned error can't find the container with id 02377bfcc40e309141ca7d0c02ad1ec463750e452a187afb911403b474e44ec3
Apr 23 14:14:56.495091 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:56.495054 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc" event={"ID":"b4457cc9-5847-41ce-8b63-7a0f26ca3db5","Type":"ContainerStarted","Data":"94e422fec06d7a3bcbfec4ed7649ca5f6d4c18c24b3c21972ef653fbe10a5637"}
Apr 23 14:14:56.495091 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:14:56.495091 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc" event={"ID":"b4457cc9-5847-41ce-8b63-7a0f26ca3db5","Type":"ContainerStarted","Data":"02377bfcc40e309141ca7d0c02ad1ec463750e452a187afb911403b474e44ec3"}
Apr 23 14:15:01.513464 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:01.513426 2565 generic.go:358] "Generic (PLEG): container finished" podID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerID="94e422fec06d7a3bcbfec4ed7649ca5f6d4c18c24b3c21972ef653fbe10a5637" exitCode=0
Apr 23 14:15:01.513955 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:01.513507 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc" event={"ID":"b4457cc9-5847-41ce-8b63-7a0f26ca3db5","Type":"ContainerDied","Data":"94e422fec06d7a3bcbfec4ed7649ca5f6d4c18c24b3c21972ef653fbe10a5637"}
Apr 23 14:15:02.519169 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:02.519135 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc" event={"ID":"b4457cc9-5847-41ce-8b63-7a0f26ca3db5","Type":"ContainerStarted","Data":"b604a7631cc38553c5db52206c1d18be047a8cbcf2720c5b2cf07adc78905e0c"}
Apr 23 14:15:02.519550 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:02.519179 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc" event={"ID":"b4457cc9-5847-41ce-8b63-7a0f26ca3db5","Type":"ContainerStarted","Data":"6c7e12ffd7c71afc300d83fa56c2bf8835d2ea090096af73b67209386bdedfb9"}
Apr 23 14:15:02.519550 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:02.519462 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:15:02.519644 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:02.519614 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:15:02.521190 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:02.521162 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc" podUID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 23 14:15:02.542592 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:02.542533 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc" podStartSLOduration=7.542518278 podStartE2EDuration="7.542518278s" podCreationTimestamp="2026-04-23 14:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:15:02.540548472 +0000 UTC m=+2613.279092440" watchObservedRunningTime="2026-04-23 14:15:02.542518278 +0000 UTC m=+2613.281062315"
Apr 23 14:15:03.522846 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:03.522805 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc" podUID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 23 14:15:08.528916 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:08.528878 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:15:08.529444 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:08.529408 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc" podUID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 23 14:15:18.529940 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:18.529910 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:15:32.178751 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:32.178709 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-65764ccccd-rl7nc_b4457cc9-5847-41ce-8b63-7a0f26ca3db5/kserve-container/0.log"
Apr 23 14:15:32.437086 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:32.436980 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"]
Apr 23 14:15:32.437367 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:32.437318 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc" podUID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerName="kserve-container" containerID="cri-o://6c7e12ffd7c71afc300d83fa56c2bf8835d2ea090096af73b67209386bdedfb9" gracePeriod=30
Apr 23 14:15:32.437433 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:32.437369 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc" podUID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerName="kube-rbac-proxy" containerID="cri-o://b604a7631cc38553c5db52206c1d18be047a8cbcf2720c5b2cf07adc78905e0c" gracePeriod=30
Apr 23 14:15:32.624774 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:32.624735 2565 generic.go:358] "Generic (PLEG): container finished" podID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerID="b604a7631cc38553c5db52206c1d18be047a8cbcf2720c5b2cf07adc78905e0c" exitCode=2
Apr 23 14:15:32.624966 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:32.624809 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc" event={"ID":"b4457cc9-5847-41ce-8b63-7a0f26ca3db5","Type":"ContainerDied","Data":"b604a7631cc38553c5db52206c1d18be047a8cbcf2720c5b2cf07adc78905e0c"}
Apr 23 14:15:33.387056 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.387032 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:15:33.454617 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.454519 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") "
Apr 23 14:15:33.454617 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.454573 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-kserve-provision-location\") pod \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") "
Apr 23 14:15:33.454617 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.454616 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6489\" (UniqueName: \"kubernetes.io/projected/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-kube-api-access-m6489\") pod \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") "
Apr 23 14:15:33.454932 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.454684 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-proxy-tls\") pod \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\" (UID: \"b4457cc9-5847-41ce-8b63-7a0f26ca3db5\") "
Apr 23 14:15:33.454932 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.454915 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "b4457cc9-5847-41ce-8b63-7a0f26ca3db5" (UID: "b4457cc9-5847-41ce-8b63-7a0f26ca3db5"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 14:15:33.456877 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.456844 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-kube-api-access-m6489" (OuterVolumeSpecName: "kube-api-access-m6489") pod "b4457cc9-5847-41ce-8b63-7a0f26ca3db5" (UID: "b4457cc9-5847-41ce-8b63-7a0f26ca3db5"). InnerVolumeSpecName "kube-api-access-m6489". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 14:15:33.456877 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.456865 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b4457cc9-5847-41ce-8b63-7a0f26ca3db5" (UID: "b4457cc9-5847-41ce-8b63-7a0f26ca3db5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 14:15:33.486283 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.486241 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b4457cc9-5847-41ce-8b63-7a0f26ca3db5" (UID: "b4457cc9-5847-41ce-8b63-7a0f26ca3db5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 14:15:33.555364 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.555329 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m6489\" (UniqueName: \"kubernetes.io/projected/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-kube-api-access-m6489\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:15:33.555364 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.555360 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:15:33.555524 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.555372 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:15:33.555524 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.555387 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4457cc9-5847-41ce-8b63-7a0f26ca3db5-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:15:33.630282 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.630250 2565 generic.go:358] "Generic (PLEG): container finished" podID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerID="6c7e12ffd7c71afc300d83fa56c2bf8835d2ea090096af73b67209386bdedfb9" exitCode=0
Apr 23 14:15:33.630454 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.630325 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"
Apr 23 14:15:33.630454 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.630325 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc" event={"ID":"b4457cc9-5847-41ce-8b63-7a0f26ca3db5","Type":"ContainerDied","Data":"6c7e12ffd7c71afc300d83fa56c2bf8835d2ea090096af73b67209386bdedfb9"}
Apr 23 14:15:33.630454 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.630426 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc" event={"ID":"b4457cc9-5847-41ce-8b63-7a0f26ca3db5","Type":"ContainerDied","Data":"02377bfcc40e309141ca7d0c02ad1ec463750e452a187afb911403b474e44ec3"}
Apr 23 14:15:33.630607 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.630445 2565 scope.go:117] "RemoveContainer" containerID="b604a7631cc38553c5db52206c1d18be047a8cbcf2720c5b2cf07adc78905e0c"
Apr 23 14:15:33.638740 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.638723 2565 scope.go:117] "RemoveContainer" containerID="6c7e12ffd7c71afc300d83fa56c2bf8835d2ea090096af73b67209386bdedfb9"
Apr 23 14:15:33.646584 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.646567 2565 scope.go:117] "RemoveContainer" containerID="94e422fec06d7a3bcbfec4ed7649ca5f6d4c18c24b3c21972ef653fbe10a5637"
Apr 23 14:15:33.654132 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.654000 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"]
Apr 23 14:15:33.654197 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.654140 2565 scope.go:117] "RemoveContainer" containerID="b604a7631cc38553c5db52206c1d18be047a8cbcf2720c5b2cf07adc78905e0c"
Apr 23 14:15:33.654430 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:15:33.654411 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b604a7631cc38553c5db52206c1d18be047a8cbcf2720c5b2cf07adc78905e0c\": container with ID starting with b604a7631cc38553c5db52206c1d18be047a8cbcf2720c5b2cf07adc78905e0c not found: ID does not exist" containerID="b604a7631cc38553c5db52206c1d18be047a8cbcf2720c5b2cf07adc78905e0c"
Apr 23 14:15:33.654473 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.654439 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b604a7631cc38553c5db52206c1d18be047a8cbcf2720c5b2cf07adc78905e0c"} err="failed to get container status \"b604a7631cc38553c5db52206c1d18be047a8cbcf2720c5b2cf07adc78905e0c\": rpc error: code = NotFound desc = could not find container \"b604a7631cc38553c5db52206c1d18be047a8cbcf2720c5b2cf07adc78905e0c\": container with ID starting with b604a7631cc38553c5db52206c1d18be047a8cbcf2720c5b2cf07adc78905e0c not found: ID does not exist"
Apr 23 14:15:33.654473 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.654457 2565 scope.go:117] "RemoveContainer" containerID="6c7e12ffd7c71afc300d83fa56c2bf8835d2ea090096af73b67209386bdedfb9"
Apr 23 14:15:33.654710 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:15:33.654690 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c7e12ffd7c71afc300d83fa56c2bf8835d2ea090096af73b67209386bdedfb9\": container with ID starting with 6c7e12ffd7c71afc300d83fa56c2bf8835d2ea090096af73b67209386bdedfb9 not found: ID does not exist" containerID="6c7e12ffd7c71afc300d83fa56c2bf8835d2ea090096af73b67209386bdedfb9"
Apr 23 14:15:33.654867 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.654717 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c7e12ffd7c71afc300d83fa56c2bf8835d2ea090096af73b67209386bdedfb9"} err="failed to get container status \"6c7e12ffd7c71afc300d83fa56c2bf8835d2ea090096af73b67209386bdedfb9\": rpc error: code = NotFound desc = could not find container \"6c7e12ffd7c71afc300d83fa56c2bf8835d2ea090096af73b67209386bdedfb9\": container with ID starting with 6c7e12ffd7c71afc300d83fa56c2bf8835d2ea090096af73b67209386bdedfb9 not found: ID does not exist"
Apr 23 14:15:33.654867 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.654734 2565 scope.go:117] "RemoveContainer" containerID="94e422fec06d7a3bcbfec4ed7649ca5f6d4c18c24b3c21972ef653fbe10a5637"
Apr 23 14:15:33.654999 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:15:33.654980 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e422fec06d7a3bcbfec4ed7649ca5f6d4c18c24b3c21972ef653fbe10a5637\": container with ID starting with 94e422fec06d7a3bcbfec4ed7649ca5f6d4c18c24b3c21972ef653fbe10a5637 not found: ID does not exist" containerID="94e422fec06d7a3bcbfec4ed7649ca5f6d4c18c24b3c21972ef653fbe10a5637"
Apr 23 14:15:33.655046 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.655005 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e422fec06d7a3bcbfec4ed7649ca5f6d4c18c24b3c21972ef653fbe10a5637"} err="failed to get container status \"94e422fec06d7a3bcbfec4ed7649ca5f6d4c18c24b3c21972ef653fbe10a5637\": rpc error: code = NotFound desc = could not find container \"94e422fec06d7a3bcbfec4ed7649ca5f6d4c18c24b3c21972ef653fbe10a5637\": container with ID starting with 94e422fec06d7a3bcbfec4ed7649ca5f6d4c18c24b3c21972ef653fbe10a5637 not found: ID does not exist"
Apr 23 14:15:33.657213 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.657192 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65764ccccd-rl7nc"]
Apr 23 14:15:33.841648 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:15:33.841565 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" path="/var/lib/kubelet/pods/b4457cc9-5847-41ce-8b63-7a0f26ca3db5/volumes"
Apr 23 14:16:29.915371 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:29.915295 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log"
Apr 23 14:16:29.924153 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:29.924126 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log"
Apr 23 14:16:32.575862 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.575825 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll"]
Apr 23 14:16:32.576281 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.576152 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerName="storage-initializer"
Apr 23 14:16:32.576281 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.576167 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerName="storage-initializer"
Apr 23 14:16:32.576281 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.576185 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerName="kserve-container"
Apr 23 14:16:32.576281 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.576191 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerName="kserve-container"
Apr 23 14:16:32.576281 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.576201 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerName="kube-rbac-proxy"
Apr 23 14:16:32.576281 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.576207 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerName="kube-rbac-proxy"
Apr 23 14:16:32.576281 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.576262 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerName="kube-rbac-proxy"
Apr 23 14:16:32.576281 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.576271 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4457cc9-5847-41ce-8b63-7a0f26ca3db5" containerName="kserve-container"
Apr 23 14:16:32.580695 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.580674 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll"
Apr 23 14:16:32.583518 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.583489 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\""
Apr 23 14:16:32.583653 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.583495 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\""
Apr 23 14:16:32.583653 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.583507 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 23 14:16:32.583833 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.583819 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 23 14:16:32.584946 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.584929 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\""
Apr 23 14:16:32.593991 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.593965 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll"]
Apr 23 14:16:32.728118 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.728079 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b100e97-f1c7-4541-8195-3ded4e7208ea-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-6d65c564d6-29jll\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll"
Apr 23 14:16:32.728308 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.728174 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pvww\" (UniqueName: \"kubernetes.io/projected/4b100e97-f1c7-4541-8195-3ded4e7208ea-kube-api-access-2pvww\") pod \"isvc-sklearn-v2-predictor-6d65c564d6-29jll\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll"
Apr 23 14:16:32.728308 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.728199 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b100e97-f1c7-4541-8195-3ded4e7208ea-proxy-tls\") pod \"isvc-sklearn-v2-predictor-6d65c564d6-29jll\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll"
Apr 23 14:16:32.728308 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.728225 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4b100e97-f1c7-4541-8195-3ded4e7208ea-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-6d65c564d6-29jll\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") "
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" Apr 23 14:16:32.829700 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.829609 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pvww\" (UniqueName: \"kubernetes.io/projected/4b100e97-f1c7-4541-8195-3ded4e7208ea-kube-api-access-2pvww\") pod \"isvc-sklearn-v2-predictor-6d65c564d6-29jll\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" Apr 23 14:16:32.829700 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.829663 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b100e97-f1c7-4541-8195-3ded4e7208ea-proxy-tls\") pod \"isvc-sklearn-v2-predictor-6d65c564d6-29jll\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" Apr 23 14:16:32.829700 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.829693 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4b100e97-f1c7-4541-8195-3ded4e7208ea-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-6d65c564d6-29jll\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" Apr 23 14:16:32.830007 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.829794 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b100e97-f1c7-4541-8195-3ded4e7208ea-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-6d65c564d6-29jll\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" Apr 23 14:16:32.830007 ip-10-0-137-187 kubenswrapper[2565]: E0423 
14:16:32.829828 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-predictor-serving-cert: secret "isvc-sklearn-v2-predictor-serving-cert" not found Apr 23 14:16:32.830007 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:16:32.829924 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b100e97-f1c7-4541-8195-3ded4e7208ea-proxy-tls podName:4b100e97-f1c7-4541-8195-3ded4e7208ea nodeName:}" failed. No retries permitted until 2026-04-23 14:16:33.329901257 +0000 UTC m=+2704.068445216 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4b100e97-f1c7-4541-8195-3ded4e7208ea-proxy-tls") pod "isvc-sklearn-v2-predictor-6d65c564d6-29jll" (UID: "4b100e97-f1c7-4541-8195-3ded4e7208ea") : secret "isvc-sklearn-v2-predictor-serving-cert" not found Apr 23 14:16:32.830309 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.830279 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b100e97-f1c7-4541-8195-3ded4e7208ea-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-6d65c564d6-29jll\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" Apr 23 14:16:32.830504 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.830480 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4b100e97-f1c7-4541-8195-3ded4e7208ea-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-6d65c564d6-29jll\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" Apr 23 14:16:32.840822 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:32.840797 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pvww\" 
(UniqueName: \"kubernetes.io/projected/4b100e97-f1c7-4541-8195-3ded4e7208ea-kube-api-access-2pvww\") pod \"isvc-sklearn-v2-predictor-6d65c564d6-29jll\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" Apr 23 14:16:33.333104 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:33.333067 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b100e97-f1c7-4541-8195-3ded4e7208ea-proxy-tls\") pod \"isvc-sklearn-v2-predictor-6d65c564d6-29jll\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" Apr 23 14:16:33.335594 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:33.335572 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b100e97-f1c7-4541-8195-3ded4e7208ea-proxy-tls\") pod \"isvc-sklearn-v2-predictor-6d65c564d6-29jll\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" Apr 23 14:16:33.491133 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:33.491095 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" Apr 23 14:16:33.613280 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:33.613258 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll"] Apr 23 14:16:33.615770 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:16:33.615727 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b100e97_f1c7_4541_8195_3ded4e7208ea.slice/crio-426bcfefe3db20ba56dd18be514d2743eea9488c54e613db57e3af81280b0dde WatchSource:0}: Error finding container 426bcfefe3db20ba56dd18be514d2743eea9488c54e613db57e3af81280b0dde: Status 404 returned error can't find the container with id 426bcfefe3db20ba56dd18be514d2743eea9488c54e613db57e3af81280b0dde Apr 23 14:16:33.840523 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:33.840484 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" event={"ID":"4b100e97-f1c7-4541-8195-3ded4e7208ea","Type":"ContainerStarted","Data":"398ddf2d599b35b86429c6eee56014c5dfc22b14dcd055ffc434695b6fe56830"} Apr 23 14:16:33.840686 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:33.840534 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" event={"ID":"4b100e97-f1c7-4541-8195-3ded4e7208ea","Type":"ContainerStarted","Data":"426bcfefe3db20ba56dd18be514d2743eea9488c54e613db57e3af81280b0dde"} Apr 23 14:16:37.850781 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:37.850714 2565 generic.go:358] "Generic (PLEG): container finished" podID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerID="398ddf2d599b35b86429c6eee56014c5dfc22b14dcd055ffc434695b6fe56830" exitCode=0 Apr 23 14:16:37.850781 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:37.850752 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" event={"ID":"4b100e97-f1c7-4541-8195-3ded4e7208ea","Type":"ContainerDied","Data":"398ddf2d599b35b86429c6eee56014c5dfc22b14dcd055ffc434695b6fe56830"} Apr 23 14:16:38.858745 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:38.858703 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" event={"ID":"4b100e97-f1c7-4541-8195-3ded4e7208ea","Type":"ContainerStarted","Data":"9859af5b6a38bf52db7137888e5caf29c6b78d086e7946a13049bb130c99b5f5"} Apr 23 14:16:38.858745 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:38.858747 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" event={"ID":"4b100e97-f1c7-4541-8195-3ded4e7208ea","Type":"ContainerStarted","Data":"013d8344760b3bc6b89b1cf78e8f529d277f938990cfe591b228672a4c77ffc6"} Apr 23 14:16:38.859205 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:38.859077 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" Apr 23 14:16:38.879330 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:38.879275 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" podStartSLOduration=6.879261267 podStartE2EDuration="6.879261267s" podCreationTimestamp="2026-04-23 14:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:16:38.876705932 +0000 UTC m=+2709.615249897" watchObservedRunningTime="2026-04-23 14:16:38.879261267 +0000 UTC m=+2709.617805233" Apr 23 14:16:39.861789 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:39.861732 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" 
Apr 23 14:16:39.862980 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:39.862954 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 23 14:16:40.864908 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:40.864865 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 23 14:16:45.868961 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:45.868929 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" Apr 23 14:16:45.869541 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:45.869513 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 23 14:16:55.869576 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:16:55.869533 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 23 14:17:05.869927 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:05.869888 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 23 14:17:15.869550 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:15.869507 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 23 14:17:25.869488 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:25.869448 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 23 14:17:35.870276 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:35.870240 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 23 14:17:45.870419 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:45.870390 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" Apr 23 14:17:52.722159 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.722077 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll"] Apr 23 14:17:52.722699 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.722544 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kserve-container" 
containerID="cri-o://013d8344760b3bc6b89b1cf78e8f529d277f938990cfe591b228672a4c77ffc6" gracePeriod=30 Apr 23 14:17:52.722699 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.722629 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kube-rbac-proxy" containerID="cri-o://9859af5b6a38bf52db7137888e5caf29c6b78d086e7946a13049bb130c99b5f5" gracePeriod=30 Apr 23 14:17:52.808680 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.808643 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt"] Apr 23 14:17:52.812541 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.812522 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:52.815289 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.815266 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\"" Apr 23 14:17:52.815412 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.815310 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\"" Apr 23 14:17:52.822502 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.822482 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt"] Apr 23 14:17:52.892523 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.892488 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpmxk\" (UniqueName: \"kubernetes.io/projected/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-kube-api-access-dpmxk\") pod \"isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt\" 
(UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:52.892687 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.892532 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:52.892687 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.892584 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:52.892687 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.892623 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:52.993336 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.993251 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:52.993336 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.993308 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpmxk\" (UniqueName: \"kubernetes.io/projected/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-kube-api-access-dpmxk\") pod \"isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:52.993548 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.993345 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:52.993548 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.993387 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:52.993548 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:17:52.993404 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-serving-cert: secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found Apr 23 14:17:52.993548 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:17:52.993475 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-proxy-tls podName:9e2c1596-af9b-4bf5-8dfa-7809f66fed63 nodeName:}" failed. No retries permitted until 2026-04-23 14:17:53.493457939 +0000 UTC m=+2784.232001883 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-proxy-tls") pod "isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" (UID: "9e2c1596-af9b-4bf5-8dfa-7809f66fed63") : secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found Apr 23 14:17:52.993823 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.993807 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:52.994048 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:52.994030 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:53.002402 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:53.002384 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpmxk\" (UniqueName: \"kubernetes.io/projected/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-kube-api-access-dpmxk\") pod \"isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:53.094590 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:53.094558 2565 generic.go:358] "Generic (PLEG): container finished" podID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerID="9859af5b6a38bf52db7137888e5caf29c6b78d086e7946a13049bb130c99b5f5" exitCode=2 Apr 23 14:17:53.094751 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:53.094634 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" event={"ID":"4b100e97-f1c7-4541-8195-3ded4e7208ea","Type":"ContainerDied","Data":"9859af5b6a38bf52db7137888e5caf29c6b78d086e7946a13049bb130c99b5f5"} Apr 23 14:17:53.496503 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:53.496456 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:53.498864 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:53.498844 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:53.723307 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:53.723257 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" Apr 23 14:17:53.847847 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:53.847820 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt"] Apr 23 14:17:53.850552 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:17:53.850528 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e2c1596_af9b_4bf5_8dfa_7809f66fed63.slice/crio-f61d8b55e604746019817929bc3c3a968e78dc4ebe8ace5200bd4f2386aed583 WatchSource:0}: Error finding container f61d8b55e604746019817929bc3c3a968e78dc4ebe8ace5200bd4f2386aed583: Status 404 returned error can't find the container with id f61d8b55e604746019817929bc3c3a968e78dc4ebe8ace5200bd4f2386aed583 Apr 23 14:17:53.852190 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:53.852175 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 14:17:54.099500 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:54.099410 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" event={"ID":"9e2c1596-af9b-4bf5-8dfa-7809f66fed63","Type":"ContainerStarted","Data":"c6538dae52ce9e9a5678e19df0ef153eb2b503d67334346662c428ff8f0ab04e"} Apr 23 14:17:54.099500 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:54.099451 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" event={"ID":"9e2c1596-af9b-4bf5-8dfa-7809f66fed63","Type":"ContainerStarted","Data":"f61d8b55e604746019817929bc3c3a968e78dc4ebe8ace5200bd4f2386aed583"} Apr 23 14:17:55.866148 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:55.866102 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" 
podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused"
Apr 23 14:17:55.870427 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:55.870391    2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 23 14:17:57.259184 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:57.259160    2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll"
Apr 23 14:17:57.328436 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:57.328327    2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4b100e97-f1c7-4541-8195-3ded4e7208ea-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"4b100e97-f1c7-4541-8195-3ded4e7208ea\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") "
Apr 23 14:17:57.328436 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:57.328403    2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b100e97-f1c7-4541-8195-3ded4e7208ea-kserve-provision-location\") pod \"4b100e97-f1c7-4541-8195-3ded4e7208ea\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") "
Apr 23 14:17:57.328436 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:57.328438    2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pvww\" (UniqueName: \"kubernetes.io/projected/4b100e97-f1c7-4541-8195-3ded4e7208ea-kube-api-access-2pvww\") pod \"4b100e97-f1c7-4541-8195-3ded4e7208ea\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") "
Apr 23 14:17:57.328725 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:57.328485    2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b100e97-f1c7-4541-8195-3ded4e7208ea-proxy-tls\") pod \"4b100e97-f1c7-4541-8195-3ded4e7208ea\" (UID: \"4b100e97-f1c7-4541-8195-3ded4e7208ea\") "
Apr 23 14:17:57.328725 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:57.328685    2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b100e97-f1c7-4541-8195-3ded4e7208ea-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod "4b100e97-f1c7-4541-8195-3ded4e7208ea" (UID: "4b100e97-f1c7-4541-8195-3ded4e7208ea"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 14:17:57.328825 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:57.328722    2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b100e97-f1c7-4541-8195-3ded4e7208ea-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4b100e97-f1c7-4541-8195-3ded4e7208ea" (UID: "4b100e97-f1c7-4541-8195-3ded4e7208ea"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 14:17:57.330594 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:57.330573    2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b100e97-f1c7-4541-8195-3ded4e7208ea-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4b100e97-f1c7-4541-8195-3ded4e7208ea" (UID: "4b100e97-f1c7-4541-8195-3ded4e7208ea"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 14:17:57.330680 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:57.330570    2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b100e97-f1c7-4541-8195-3ded4e7208ea-kube-api-access-2pvww" (OuterVolumeSpecName: "kube-api-access-2pvww") pod "4b100e97-f1c7-4541-8195-3ded4e7208ea" (UID: "4b100e97-f1c7-4541-8195-3ded4e7208ea"). InnerVolumeSpecName "kube-api-access-2pvww". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 14:17:57.429680 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:57.429646    2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b100e97-f1c7-4541-8195-3ded4e7208ea-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:17:57.429680 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:57.429676    2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2pvww\" (UniqueName: \"kubernetes.io/projected/4b100e97-f1c7-4541-8195-3ded4e7208ea-kube-api-access-2pvww\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:17:57.429680 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:57.429685    2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b100e97-f1c7-4541-8195-3ded4e7208ea-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:17:57.429951 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:57.429696    2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4b100e97-f1c7-4541-8195-3ded4e7208ea-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:17:58.115101 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.114999    2565 generic.go:358] "Generic (PLEG): container finished" podID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerID="c6538dae52ce9e9a5678e19df0ef153eb2b503d67334346662c428ff8f0ab04e" exitCode=0
Apr 23 14:17:58.115101 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.115072    2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" event={"ID":"9e2c1596-af9b-4bf5-8dfa-7809f66fed63","Type":"ContainerDied","Data":"c6538dae52ce9e9a5678e19df0ef153eb2b503d67334346662c428ff8f0ab04e"}
Apr 23 14:17:58.116748 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.116718    2565 generic.go:358] "Generic (PLEG): container finished" podID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerID="013d8344760b3bc6b89b1cf78e8f529d277f938990cfe591b228672a4c77ffc6" exitCode=0
Apr 23 14:17:58.116874 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.116804    2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll"
Apr 23 14:17:58.116874 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.116803    2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" event={"ID":"4b100e97-f1c7-4541-8195-3ded4e7208ea","Type":"ContainerDied","Data":"013d8344760b3bc6b89b1cf78e8f529d277f938990cfe591b228672a4c77ffc6"}
Apr 23 14:17:58.116874 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.116838    2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll" event={"ID":"4b100e97-f1c7-4541-8195-3ded4e7208ea","Type":"ContainerDied","Data":"426bcfefe3db20ba56dd18be514d2743eea9488c54e613db57e3af81280b0dde"}
Apr 23 14:17:58.116874 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.116855    2565 scope.go:117] "RemoveContainer" containerID="9859af5b6a38bf52db7137888e5caf29c6b78d086e7946a13049bb130c99b5f5"
Apr 23 14:17:58.127597 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.127578    2565 scope.go:117] "RemoveContainer" containerID="013d8344760b3bc6b89b1cf78e8f529d277f938990cfe591b228672a4c77ffc6"
Apr 23 14:17:58.143192 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.143051    2565 scope.go:117] "RemoveContainer" containerID="398ddf2d599b35b86429c6eee56014c5dfc22b14dcd055ffc434695b6fe56830"
Apr 23 14:17:58.149009 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.148983    2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll"]
Apr 23 14:17:58.154582 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.154559    2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6d65c564d6-29jll"]
Apr 23 14:17:58.157473 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.157455    2565 scope.go:117] "RemoveContainer" containerID="9859af5b6a38bf52db7137888e5caf29c6b78d086e7946a13049bb130c99b5f5"
Apr 23 14:17:58.157798 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:17:58.157752    2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9859af5b6a38bf52db7137888e5caf29c6b78d086e7946a13049bb130c99b5f5\": container with ID starting with 9859af5b6a38bf52db7137888e5caf29c6b78d086e7946a13049bb130c99b5f5 not found: ID does not exist" containerID="9859af5b6a38bf52db7137888e5caf29c6b78d086e7946a13049bb130c99b5f5"
Apr 23 14:17:58.157890 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.157812    2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9859af5b6a38bf52db7137888e5caf29c6b78d086e7946a13049bb130c99b5f5"} err="failed to get container status \"9859af5b6a38bf52db7137888e5caf29c6b78d086e7946a13049bb130c99b5f5\": rpc error: code = NotFound desc = could not find container \"9859af5b6a38bf52db7137888e5caf29c6b78d086e7946a13049bb130c99b5f5\": container with ID starting with 9859af5b6a38bf52db7137888e5caf29c6b78d086e7946a13049bb130c99b5f5 not found: ID does not exist"
Apr 23 14:17:58.157890 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.157840    2565 scope.go:117] "RemoveContainer" containerID="013d8344760b3bc6b89b1cf78e8f529d277f938990cfe591b228672a4c77ffc6"
Apr 23 14:17:58.158127 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:17:58.158110    2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"013d8344760b3bc6b89b1cf78e8f529d277f938990cfe591b228672a4c77ffc6\": container with ID starting with 013d8344760b3bc6b89b1cf78e8f529d277f938990cfe591b228672a4c77ffc6 not found: ID does not exist" containerID="013d8344760b3bc6b89b1cf78e8f529d277f938990cfe591b228672a4c77ffc6"
Apr 23 14:17:58.158180 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.158141    2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013d8344760b3bc6b89b1cf78e8f529d277f938990cfe591b228672a4c77ffc6"} err="failed to get container status \"013d8344760b3bc6b89b1cf78e8f529d277f938990cfe591b228672a4c77ffc6\": rpc error: code = NotFound desc = could not find container \"013d8344760b3bc6b89b1cf78e8f529d277f938990cfe591b228672a4c77ffc6\": container with ID starting with 013d8344760b3bc6b89b1cf78e8f529d277f938990cfe591b228672a4c77ffc6 not found: ID does not exist"
Apr 23 14:17:58.158180 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.158158    2565 scope.go:117] "RemoveContainer" containerID="398ddf2d599b35b86429c6eee56014c5dfc22b14dcd055ffc434695b6fe56830"
Apr 23 14:17:58.158354 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:17:58.158337    2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"398ddf2d599b35b86429c6eee56014c5dfc22b14dcd055ffc434695b6fe56830\": container with ID starting with 398ddf2d599b35b86429c6eee56014c5dfc22b14dcd055ffc434695b6fe56830 not found: ID does not exist" containerID="398ddf2d599b35b86429c6eee56014c5dfc22b14dcd055ffc434695b6fe56830"
Apr 23 14:17:58.158409 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:58.158361    2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"398ddf2d599b35b86429c6eee56014c5dfc22b14dcd055ffc434695b6fe56830"} err="failed to get container status \"398ddf2d599b35b86429c6eee56014c5dfc22b14dcd055ffc434695b6fe56830\": rpc error: code = NotFound desc = could not find container \"398ddf2d599b35b86429c6eee56014c5dfc22b14dcd055ffc434695b6fe56830\": container with ID starting with 398ddf2d599b35b86429c6eee56014c5dfc22b14dcd055ffc434695b6fe56830 not found: ID does not exist"
Apr 23 14:17:59.123453 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:59.123413    2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" event={"ID":"9e2c1596-af9b-4bf5-8dfa-7809f66fed63","Type":"ContainerStarted","Data":"7b06443383efd7594d7ff6ce9da81e2815837867562a34e3743b502bbe0e3f5d"}
Apr 23 14:17:59.123453 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:59.123454    2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" event={"ID":"9e2c1596-af9b-4bf5-8dfa-7809f66fed63","Type":"ContainerStarted","Data":"0ed72d39829cd730a3ee72a6970bdb9ac6c7c445eb4cc045a062acb6791db630"}
Apr 23 14:17:59.123953 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:59.123779    2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt"
Apr 23 14:17:59.123953 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:59.123927    2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt"
Apr 23 14:17:59.125235 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:59.125213    2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 23 14:17:59.144362 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:59.144311    2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" podStartSLOduration=7.144296635 podStartE2EDuration="7.144296635s" podCreationTimestamp="2026-04-23 14:17:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:17:59.141678493 +0000 UTC m=+2789.880222458" watchObservedRunningTime="2026-04-23 14:17:59.144296635 +0000 UTC m=+2789.882840601"
Apr 23 14:17:59.842100 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:17:59.842066    2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" path="/var/lib/kubelet/pods/4b100e97-f1c7-4541-8195-3ded4e7208ea/volumes"
Apr 23 14:18:00.128881 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:18:00.128783    2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 23 14:18:05.133487 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:18:05.133448    2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt"
Apr 23 14:18:05.134089 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:18:05.134061    2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 23 14:18:15.134272 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:18:15.134225    2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 23 14:18:25.134558 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:18:25.134517    2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 23 14:18:35.134662 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:18:35.134624    2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 23 14:18:45.134012 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:18:45.133968    2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 23 14:18:55.134295 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:18:55.134254    2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 23 14:19:05.134936 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:05.134908    2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt"
Apr 23 14:19:12.942063 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:12.942029    2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt"]
Apr 23 14:19:12.942744 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:12.942688    2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kserve-container" containerID="cri-o://0ed72d39829cd730a3ee72a6970bdb9ac6c7c445eb4cc045a062acb6791db630" gracePeriod=30
Apr 23 14:19:12.942900 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:12.942719    2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kube-rbac-proxy" containerID="cri-o://7b06443383efd7594d7ff6ce9da81e2815837867562a34e3743b502bbe0e3f5d" gracePeriod=30
Apr 23 14:19:13.375532 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:13.375443    2565 generic.go:358] "Generic (PLEG): container finished" podID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerID="7b06443383efd7594d7ff6ce9da81e2815837867562a34e3743b502bbe0e3f5d" exitCode=2
Apr 23 14:19:13.375532 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:13.375510    2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" event={"ID":"9e2c1596-af9b-4bf5-8dfa-7809f66fed63","Type":"ContainerDied","Data":"7b06443383efd7594d7ff6ce9da81e2815837867562a34e3743b502bbe0e3f5d"}
Apr 23 14:19:15.129156 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:15.129115    2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.41:8643/healthz\": dial tcp 10.132.0.41:8643: connect: connection refused"
Apr 23 14:19:15.134908 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:15.134873    2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 23 14:19:17.278060 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.278035    2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt"
Apr 23 14:19:17.387862 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.387831    2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpmxk\" (UniqueName: \"kubernetes.io/projected/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-kube-api-access-dpmxk\") pod \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") "
Apr 23 14:19:17.388030 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.387878    2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") "
Apr 23 14:19:17.388030 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.387906    2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-kserve-provision-location\") pod \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") "
Apr 23 14:19:17.388030 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.387970    2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-proxy-tls\") pod \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\" (UID: \"9e2c1596-af9b-4bf5-8dfa-7809f66fed63\") "
Apr 23 14:19:17.388254 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.388230    2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9e2c1596-af9b-4bf5-8dfa-7809f66fed63" (UID: "9e2c1596-af9b-4bf5-8dfa-7809f66fed63"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 14:19:17.388444 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.388402    2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "9e2c1596-af9b-4bf5-8dfa-7809f66fed63" (UID: "9e2c1596-af9b-4bf5-8dfa-7809f66fed63"). InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 14:19:17.390106 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.390080    2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-kube-api-access-dpmxk" (OuterVolumeSpecName: "kube-api-access-dpmxk") pod "9e2c1596-af9b-4bf5-8dfa-7809f66fed63" (UID: "9e2c1596-af9b-4bf5-8dfa-7809f66fed63"). InnerVolumeSpecName "kube-api-access-dpmxk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 14:19:17.390198 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.390128    2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9e2c1596-af9b-4bf5-8dfa-7809f66fed63" (UID: "9e2c1596-af9b-4bf5-8dfa-7809f66fed63"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 14:19:17.394002 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.393941    2565 generic.go:358] "Generic (PLEG): container finished" podID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerID="0ed72d39829cd730a3ee72a6970bdb9ac6c7c445eb4cc045a062acb6791db630" exitCode=0
Apr 23 14:19:17.394086 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.394002    2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" event={"ID":"9e2c1596-af9b-4bf5-8dfa-7809f66fed63","Type":"ContainerDied","Data":"0ed72d39829cd730a3ee72a6970bdb9ac6c7c445eb4cc045a062acb6791db630"}
Apr 23 14:19:17.394086 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.394031    2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt"
Apr 23 14:19:17.394086 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.394038    2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt" event={"ID":"9e2c1596-af9b-4bf5-8dfa-7809f66fed63","Type":"ContainerDied","Data":"f61d8b55e604746019817929bc3c3a968e78dc4ebe8ace5200bd4f2386aed583"}
Apr 23 14:19:17.394086 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.394058    2565 scope.go:117] "RemoveContainer" containerID="7b06443383efd7594d7ff6ce9da81e2815837867562a34e3743b502bbe0e3f5d"
Apr 23 14:19:17.405726 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.405712    2565 scope.go:117] "RemoveContainer" containerID="0ed72d39829cd730a3ee72a6970bdb9ac6c7c445eb4cc045a062acb6791db630"
Apr 23 14:19:17.412813 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.412795    2565 scope.go:117] "RemoveContainer" containerID="c6538dae52ce9e9a5678e19df0ef153eb2b503d67334346662c428ff8f0ab04e"
Apr 23 14:19:17.419238 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.419215    2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt"]
Apr 23 14:19:17.420370 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.420349    2565 scope.go:117] "RemoveContainer" containerID="7b06443383efd7594d7ff6ce9da81e2815837867562a34e3743b502bbe0e3f5d"
Apr 23 14:19:17.420655 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:19:17.420627    2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b06443383efd7594d7ff6ce9da81e2815837867562a34e3743b502bbe0e3f5d\": container with ID starting with 7b06443383efd7594d7ff6ce9da81e2815837867562a34e3743b502bbe0e3f5d not found: ID does not exist" containerID="7b06443383efd7594d7ff6ce9da81e2815837867562a34e3743b502bbe0e3f5d"
Apr 23 14:19:17.420746 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.420676    2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b06443383efd7594d7ff6ce9da81e2815837867562a34e3743b502bbe0e3f5d"} err="failed to get container status \"7b06443383efd7594d7ff6ce9da81e2815837867562a34e3743b502bbe0e3f5d\": rpc error: code = NotFound desc = could not find container \"7b06443383efd7594d7ff6ce9da81e2815837867562a34e3743b502bbe0e3f5d\": container with ID starting with 7b06443383efd7594d7ff6ce9da81e2815837867562a34e3743b502bbe0e3f5d not found: ID does not exist"
Apr 23 14:19:17.420746 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.420703    2565 scope.go:117] "RemoveContainer" containerID="0ed72d39829cd730a3ee72a6970bdb9ac6c7c445eb4cc045a062acb6791db630"
Apr 23 14:19:17.421043 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:19:17.421004    2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed72d39829cd730a3ee72a6970bdb9ac6c7c445eb4cc045a062acb6791db630\": container with ID starting with 0ed72d39829cd730a3ee72a6970bdb9ac6c7c445eb4cc045a062acb6791db630 not found: ID does not exist" containerID="0ed72d39829cd730a3ee72a6970bdb9ac6c7c445eb4cc045a062acb6791db630"
Apr 23 14:19:17.421043 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.421032    2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed72d39829cd730a3ee72a6970bdb9ac6c7c445eb4cc045a062acb6791db630"} err="failed to get container status \"0ed72d39829cd730a3ee72a6970bdb9ac6c7c445eb4cc045a062acb6791db630\": rpc error: code = NotFound desc = could not find container \"0ed72d39829cd730a3ee72a6970bdb9ac6c7c445eb4cc045a062acb6791db630\": container with ID starting with 0ed72d39829cd730a3ee72a6970bdb9ac6c7c445eb4cc045a062acb6791db630 not found: ID does not exist"
Apr 23 14:19:17.421286 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.421056    2565 scope.go:117] "RemoveContainer" containerID="c6538dae52ce9e9a5678e19df0ef153eb2b503d67334346662c428ff8f0ab04e"
Apr 23 14:19:17.421436 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:19:17.421416    2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6538dae52ce9e9a5678e19df0ef153eb2b503d67334346662c428ff8f0ab04e\": container with ID starting with c6538dae52ce9e9a5678e19df0ef153eb2b503d67334346662c428ff8f0ab04e not found: ID does not exist" containerID="c6538dae52ce9e9a5678e19df0ef153eb2b503d67334346662c428ff8f0ab04e"
Apr 23 14:19:17.421495 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.421447    2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6538dae52ce9e9a5678e19df0ef153eb2b503d67334346662c428ff8f0ab04e"} err="failed to get container status \"c6538dae52ce9e9a5678e19df0ef153eb2b503d67334346662c428ff8f0ab04e\": rpc error: code = NotFound desc = could not find container \"c6538dae52ce9e9a5678e19df0ef153eb2b503d67334346662c428ff8f0ab04e\": container with ID starting with c6538dae52ce9e9a5678e19df0ef153eb2b503d67334346662c428ff8f0ab04e not found: ID does not exist"
Apr 23 14:19:17.422693 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.422675    2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6b4bf45459-t6gnt"]
Apr 23 14:19:17.489134 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.489091    2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:19:17.489134 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.489127    2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:19:17.489134 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.489140    2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:19:17.489379 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.489149    2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dpmxk\" (UniqueName: \"kubernetes.io/projected/9e2c1596-af9b-4bf5-8dfa-7809f66fed63-kube-api-access-dpmxk\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:19:17.840151 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:19:17.840117    2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" path="/var/lib/kubelet/pods/9e2c1596-af9b-4bf5-8dfa-7809f66fed63/volumes"
Apr 23 14:20:04.367371 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367337    2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87"]
Apr 23 14:20:04.367904 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367695    2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="storage-initializer"
Apr 23 14:20:04.367904 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367710    2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="storage-initializer"
Apr 23 14:20:04.367904 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367729    2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kube-rbac-proxy"
Apr 23 14:20:04.367904 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367737    2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kube-rbac-proxy"
Apr 23 14:20:04.367904 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367748    2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kube-rbac-proxy"
Apr 23 14:20:04.367904 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367772    2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kube-rbac-proxy"
Apr 23 14:20:04.367904 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367784    2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kserve-container"
Apr 23 14:20:04.367904 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367792    2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kserve-container"
Apr 23 14:20:04.367904 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367807    2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kserve-container"
Apr 23 14:20:04.367904 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367815    2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kserve-container"
Apr 23 14:20:04.367904 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367825    2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="storage-initializer"
Apr 23 14:20:04.367904 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367832    2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="storage-initializer"
Apr 23 14:20:04.368322 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367913    2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kube-rbac-proxy"
Apr 23 14:20:04.368322 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367925    2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b100e97-f1c7-4541-8195-3ded4e7208ea" containerName="kserve-container"
Apr 23 14:20:04.368322 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367935    2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kserve-container"
Apr 23 14:20:04.368322 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.367945    2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e2c1596-af9b-4bf5-8dfa-7809f66fed63" containerName="kube-rbac-proxy"
Apr 23 14:20:04.371148 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.371131    2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87"
Apr 23 14:20:04.373863 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.373836    2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\""
Apr 23 14:20:04.374187 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.374167    2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\""
Apr 23 14:20:04.374281 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.374170    2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\""
Apr 23 14:20:04.374687 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.374672    2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 23 14:20:04.375383 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.375370    2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 23 14:20:04.384059 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.384038    2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87"]
Apr 23 14:20:04.458428 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.458394    2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-r8d87\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87"
Apr 23 14:20:04.458604
ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.458433 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc2cq\" (UniqueName: \"kubernetes.io/projected/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-kube-api-access-cc2cq\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-r8d87\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:04.458604 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.458469 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-r8d87\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:04.458604 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.458550 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-r8d87\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:04.559344 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.559318 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-r8d87\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:04.559526 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.559377 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-r8d87\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:04.559526 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.559406 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc2cq\" (UniqueName: \"kubernetes.io/projected/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-kube-api-access-cc2cq\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-r8d87\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:04.559526 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.559429 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-r8d87\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:04.559526 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:20:04.559468 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-serving-cert: secret "isvc-tensorflow-runtime-predictor-serving-cert" not found Apr 23 14:20:04.559709 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:20:04.559536 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-proxy-tls podName:7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca nodeName:}" failed. 
No retries permitted until 2026-04-23 14:20:05.059515544 +0000 UTC m=+2915.798059492 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-proxy-tls") pod "isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" (UID: "7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca") : secret "isvc-tensorflow-runtime-predictor-serving-cert" not found Apr 23 14:20:04.559902 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.559880 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-r8d87\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:04.560125 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.560109 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-r8d87\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:04.568726 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:04.568696 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc2cq\" (UniqueName: \"kubernetes.io/projected/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-kube-api-access-cc2cq\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-r8d87\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:05.063893 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:05.063851 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-r8d87\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:05.066218 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:05.066195 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-r8d87\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:05.282714 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:05.282676 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:05.402900 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:05.402877 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87"] Apr 23 14:20:05.404993 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:20:05.404963 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dea03d0_03be_4e4b_a2b6_80aa2a3c80ca.slice/crio-2a97e0e94fadecd56c139796eed0f903f244d650b2aa1045c1c98e159845d6a6 WatchSource:0}: Error finding container 2a97e0e94fadecd56c139796eed0f903f244d650b2aa1045c1c98e159845d6a6: Status 404 returned error can't find the container with id 2a97e0e94fadecd56c139796eed0f903f244d650b2aa1045c1c98e159845d6a6 Apr 23 14:20:05.547734 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:05.547698 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" 
event={"ID":"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca","Type":"ContainerStarted","Data":"dbabbaf1bd92363b38c3136c1e1594ffd0112a082d54f80f7d500deb92e58fd0"} Apr 23 14:20:05.547734 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:05.547737 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" event={"ID":"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca","Type":"ContainerStarted","Data":"2a97e0e94fadecd56c139796eed0f903f244d650b2aa1045c1c98e159845d6a6"} Apr 23 14:20:10.566766 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:10.566731 2565 generic.go:358] "Generic (PLEG): container finished" podID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerID="dbabbaf1bd92363b38c3136c1e1594ffd0112a082d54f80f7d500deb92e58fd0" exitCode=0 Apr 23 14:20:10.567188 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:10.566809 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" event={"ID":"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca","Type":"ContainerDied","Data":"dbabbaf1bd92363b38c3136c1e1594ffd0112a082d54f80f7d500deb92e58fd0"} Apr 23 14:20:15.590523 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:15.590485 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" event={"ID":"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca","Type":"ContainerStarted","Data":"2ff67d0d26d7557ab5463fab4a67208d2f4c19ff374c746da2ad72a1b16063da"} Apr 23 14:20:15.590523 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:15.590524 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" event={"ID":"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca","Type":"ContainerStarted","Data":"6c7801fa0d12a275922c7daf6a3893cdafa9128f6bea3aaaabac8e0dd2679d8f"} Apr 23 14:20:15.591086 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:15.590752 2565 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:15.609306 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:15.609255 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" podStartSLOduration=7.23448748 podStartE2EDuration="11.609242467s" podCreationTimestamp="2026-04-23 14:20:04 +0000 UTC" firstStartedPulling="2026-04-23 14:20:10.56799108 +0000 UTC m=+2921.306535025" lastFinishedPulling="2026-04-23 14:20:14.942746064 +0000 UTC m=+2925.681290012" observedRunningTime="2026-04-23 14:20:15.608051492 +0000 UTC m=+2926.346595513" watchObservedRunningTime="2026-04-23 14:20:15.609242467 +0000 UTC m=+2926.347786479" Apr 23 14:20:16.593827 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:16.593795 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:16.595165 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:16.595133 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 23 14:20:17.596916 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:17.596877 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 23 14:20:22.601472 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:22.601445 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:22.601949 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:22.601924 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 23 14:20:32.602329 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:32.602301 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:20:45.273981 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:45.273943 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87"] Apr 23 14:20:45.274519 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:45.274270 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kserve-container" containerID="cri-o://6c7801fa0d12a275922c7daf6a3893cdafa9128f6bea3aaaabac8e0dd2679d8f" gracePeriod=30 Apr 23 14:20:45.274519 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:45.274309 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kube-rbac-proxy" containerID="cri-o://2ff67d0d26d7557ab5463fab4a67208d2f4c19ff374c746da2ad72a1b16063da" gracePeriod=30 Apr 23 14:20:45.686340 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:45.686308 2565 generic.go:358] "Generic (PLEG): container finished" podID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerID="2ff67d0d26d7557ab5463fab4a67208d2f4c19ff374c746da2ad72a1b16063da" exitCode=2 
Apr 23 14:20:45.686520 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:45.686368 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" event={"ID":"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca","Type":"ContainerDied","Data":"2ff67d0d26d7557ab5463fab4a67208d2f4c19ff374c746da2ad72a1b16063da"} Apr 23 14:20:47.597090 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:47.597039 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused" Apr 23 14:20:52.597825 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:52.597720 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused" Apr 23 14:20:57.597599 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:57.597558 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused" Apr 23 14:20:57.598063 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:20:57.597715 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:21:02.597932 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:02.597883 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused" Apr 23 14:21:07.597602 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:07.597557 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused" Apr 23 14:21:12.597341 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:12.597297 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused" Apr 23 14:21:15.793416 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:15.793384 2565 generic.go:358] "Generic (PLEG): container finished" podID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerID="6c7801fa0d12a275922c7daf6a3893cdafa9128f6bea3aaaabac8e0dd2679d8f" exitCode=137 Apr 23 14:21:15.793723 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:15.793430 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" event={"ID":"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca","Type":"ContainerDied","Data":"6c7801fa0d12a275922c7daf6a3893cdafa9128f6bea3aaaabac8e0dd2679d8f"} Apr 23 14:21:15.919540 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:15.919513 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:21:15.942944 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:15.942908 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-kserve-provision-location\") pod \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " Apr 23 14:21:15.943087 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:15.942949 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-proxy-tls\") pod \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " Apr 23 14:21:15.943087 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:15.942986 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc2cq\" (UniqueName: \"kubernetes.io/projected/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-kube-api-access-cc2cq\") pod \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " Apr 23 14:21:15.943087 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:15.943044 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\" (UID: \"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca\") " Apr 23 14:21:15.943775 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:15.943731 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" (UID: "7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:21:15.945338 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:15.945309 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" (UID: "7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:21:15.945527 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:15.945499 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-kube-api-access-cc2cq" (OuterVolumeSpecName: "kube-api-access-cc2cq") pod "7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" (UID: "7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca"). InnerVolumeSpecName "kube-api-access-cc2cq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:21:15.954240 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:15.954212 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" (UID: "7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:21:16.044520 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:16.044440 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:21:16.044520 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:16.044467 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:21:16.044520 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:16.044480 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:21:16.044520 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:16.044493 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cc2cq\" (UniqueName: \"kubernetes.io/projected/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca-kube-api-access-cc2cq\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:21:16.798026 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:16.797985 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" event={"ID":"7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca","Type":"ContainerDied","Data":"2a97e0e94fadecd56c139796eed0f903f244d650b2aa1045c1c98e159845d6a6"} Apr 23 14:21:16.798531 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:16.798042 2565 scope.go:117] "RemoveContainer" containerID="2ff67d0d26d7557ab5463fab4a67208d2f4c19ff374c746da2ad72a1b16063da" Apr 23 14:21:16.798531 ip-10-0-137-187 
kubenswrapper[2565]: I0423 14:21:16.798062 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87" Apr 23 14:21:16.806021 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:16.806002 2565 scope.go:117] "RemoveContainer" containerID="6c7801fa0d12a275922c7daf6a3893cdafa9128f6bea3aaaabac8e0dd2679d8f" Apr 23 14:21:16.813373 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:16.813351 2565 scope.go:117] "RemoveContainer" containerID="dbabbaf1bd92363b38c3136c1e1594ffd0112a082d54f80f7d500deb92e58fd0" Apr 23 14:21:16.820943 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:16.820922 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87"] Apr 23 14:21:16.824731 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:16.824711 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-r8d87"] Apr 23 14:21:17.840222 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:17.840188 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" path="/var/lib/kubelet/pods/7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca/volumes" Apr 23 14:21:29.935990 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:29.935956 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:21:29.945724 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:21:29.945697 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:22:57.644159 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.644081 2565 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5"] Apr 23 14:22:57.644617 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.644439 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kserve-container" Apr 23 14:22:57.644617 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.644451 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kserve-container" Apr 23 14:22:57.644617 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.644460 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="storage-initializer" Apr 23 14:22:57.644617 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.644465 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="storage-initializer" Apr 23 14:22:57.644617 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.644475 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kube-rbac-proxy" Apr 23 14:22:57.644617 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.644482 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kube-rbac-proxy" Apr 23 14:22:57.644617 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.644536 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kserve-container" Apr 23 14:22:57.644617 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.644544 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7dea03d0-03be-4e4b-a2b6-80aa2a3c80ca" containerName="kube-rbac-proxy" Apr 23 14:22:57.647702 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.647686 2565 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:57.650413 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.650393 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\"" Apr 23 14:22:57.650967 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.650950 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 14:22:57.651896 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.651877 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 14:22:57.651992 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.651880 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 14:22:57.651992 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.651880 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\"" Apr 23 14:22:57.655746 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.655726 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5"] Apr 23 14:22:57.769497 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.769460 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-xjcl5\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:57.769658 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.769502 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5dvp\" (UniqueName: \"kubernetes.io/projected/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-kube-api-access-c5dvp\") pod \"isvc-xgboost-predictor-8689c4cfcc-xjcl5\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:57.769658 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.769527 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-xjcl5\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:57.769658 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.769606 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-xjcl5\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:57.870913 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.870880 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-xjcl5\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:57.871080 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.871045 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-xjcl5\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:57.871132 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.871090 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dvp\" (UniqueName: \"kubernetes.io/projected/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-kube-api-access-c5dvp\") pod \"isvc-xgboost-predictor-8689c4cfcc-xjcl5\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:57.871132 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.871122 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-xjcl5\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:57.871294 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:22:57.871274 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-predictor-serving-cert: secret "isvc-xgboost-predictor-serving-cert" not found Apr 23 14:22:57.871365 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.871310 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-xjcl5\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:57.871365 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:22:57.871344 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-proxy-tls podName:2a6ebd8c-8e47-47a8-82d7-e7361aab3c02 nodeName:}" failed. No retries permitted until 2026-04-23 14:22:58.371325382 +0000 UTC m=+3089.109869329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-proxy-tls") pod "isvc-xgboost-predictor-8689c4cfcc-xjcl5" (UID: "2a6ebd8c-8e47-47a8-82d7-e7361aab3c02") : secret "isvc-xgboost-predictor-serving-cert" not found Apr 23 14:22:57.871597 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.871577 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-xjcl5\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:57.881083 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:57.881060 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5dvp\" (UniqueName: \"kubernetes.io/projected/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-kube-api-access-c5dvp\") pod \"isvc-xgboost-predictor-8689c4cfcc-xjcl5\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:58.375416 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:58.375378 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-xjcl5\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:58.377779 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:58.377746 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-xjcl5\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:58.558203 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:58.558165 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:22:58.677680 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:58.677618 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5"] Apr 23 14:22:58.680371 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:22:58.680344 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a6ebd8c_8e47_47a8_82d7_e7361aab3c02.slice/crio-c4997803d45552b6b141db76690ccdc759d09fb7f025a98b67e51375d21cc821 WatchSource:0}: Error finding container c4997803d45552b6b141db76690ccdc759d09fb7f025a98b67e51375d21cc821: Status 404 returned error can't find the container with id c4997803d45552b6b141db76690ccdc759d09fb7f025a98b67e51375d21cc821 Apr 23 14:22:58.682517 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:58.682503 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 14:22:59.132079 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:59.132043 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" event={"ID":"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02","Type":"ContainerStarted","Data":"c2acb1c49dee31bf6b3d98c21dc7ecb838a5c55d82955b98bd35ec9a5e9e4e46"} Apr 23 14:22:59.132244 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:22:59.132083 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" event={"ID":"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02","Type":"ContainerStarted","Data":"c4997803d45552b6b141db76690ccdc759d09fb7f025a98b67e51375d21cc821"} Apr 23 14:23:03.151255 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:23:03.151223 2565 generic.go:358] "Generic (PLEG): container finished" podID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerID="c2acb1c49dee31bf6b3d98c21dc7ecb838a5c55d82955b98bd35ec9a5e9e4e46" exitCode=0 Apr 23 14:23:03.151632 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:23:03.151280 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" event={"ID":"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02","Type":"ContainerDied","Data":"c2acb1c49dee31bf6b3d98c21dc7ecb838a5c55d82955b98bd35ec9a5e9e4e46"} Apr 23 14:23:22.223711 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:23:22.223674 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" event={"ID":"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02","Type":"ContainerStarted","Data":"fa274e82537014a75cd0374d3e0f8e6e9cb30ede53ae7af6772f29e7378969f4"} Apr 23 14:23:22.223711 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:23:22.223714 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" event={"ID":"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02","Type":"ContainerStarted","Data":"bd44877c2b9f3793442deca8caf8c19d835a4e9a2da94fed540dd4f7077061de"} Apr 23 14:23:22.224231 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:23:22.224005 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:23:22.224231 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:23:22.224104 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:23:22.225264 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:23:22.225242 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 23 14:23:22.245921 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:23:22.245832 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" podStartSLOduration=6.405261799 podStartE2EDuration="25.245804899s" podCreationTimestamp="2026-04-23 14:22:57 +0000 UTC" firstStartedPulling="2026-04-23 14:23:03.152523943 +0000 UTC m=+3093.891067887" lastFinishedPulling="2026-04-23 14:23:21.993067038 +0000 UTC m=+3112.731610987" observedRunningTime="2026-04-23 14:23:22.24462095 +0000 UTC m=+3112.983164925" watchObservedRunningTime="2026-04-23 14:23:22.245804899 +0000 UTC m=+3112.984348856" Apr 23 14:23:23.226820 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:23:23.226781 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 23 14:23:28.231812 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:23:28.231783 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:23:28.232360 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:23:28.232329 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 23 14:23:38.233084 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:23:38.233040 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 23 14:23:48.232447 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:23:48.232408 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 23 14:23:58.232406 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:23:58.232317 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 23 14:24:08.232505 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:08.232460 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 23 14:24:18.232690 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:18.232650 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 23 14:24:28.232911 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:28.232879 2565 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:24:37.768062 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:37.768026 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5"] Apr 23 14:24:37.768581 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:37.768420 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kserve-container" containerID="cri-o://bd44877c2b9f3793442deca8caf8c19d835a4e9a2da94fed540dd4f7077061de" gracePeriod=30 Apr 23 14:24:37.768581 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:37.768510 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kube-rbac-proxy" containerID="cri-o://fa274e82537014a75cd0374d3e0f8e6e9cb30ede53ae7af6772f29e7378969f4" gracePeriod=30 Apr 23 14:24:38.228114 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:38.228068 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.43:8643/healthz\": dial tcp 10.132.0.43:8643: connect: connection refused" Apr 23 14:24:38.232457 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:38.232431 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 23 14:24:38.468154 ip-10-0-137-187 kubenswrapper[2565]: I0423 
14:24:38.468112 2565 generic.go:358] "Generic (PLEG): container finished" podID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerID="fa274e82537014a75cd0374d3e0f8e6e9cb30ede53ae7af6772f29e7378969f4" exitCode=2 Apr 23 14:24:38.468330 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:38.468185 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" event={"ID":"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02","Type":"ContainerDied","Data":"fa274e82537014a75cd0374d3e0f8e6e9cb30ede53ae7af6772f29e7378969f4"} Apr 23 14:24:41.480816 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.480755 2565 generic.go:358] "Generic (PLEG): container finished" podID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerID="bd44877c2b9f3793442deca8caf8c19d835a4e9a2da94fed540dd4f7077061de" exitCode=0 Apr 23 14:24:41.480816 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.480798 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" event={"ID":"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02","Type":"ContainerDied","Data":"bd44877c2b9f3793442deca8caf8c19d835a4e9a2da94fed540dd4f7077061de"} Apr 23 14:24:41.514746 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.514721 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:24:41.516988 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.516967 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-kserve-provision-location\") pod \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " Apr 23 14:24:41.517046 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.517027 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " Apr 23 14:24:41.517083 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.517055 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5dvp\" (UniqueName: \"kubernetes.io/projected/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-kube-api-access-c5dvp\") pod \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " Apr 23 14:24:41.517083 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.517077 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-proxy-tls\") pod \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\" (UID: \"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02\") " Apr 23 14:24:41.517318 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.517294 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" 
(UID: "2a6ebd8c-8e47-47a8-82d7-e7361aab3c02"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:24:41.517440 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.517409 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" (UID: "2a6ebd8c-8e47-47a8-82d7-e7361aab3c02"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:24:41.519154 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.519132 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-kube-api-access-c5dvp" (OuterVolumeSpecName: "kube-api-access-c5dvp") pod "2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" (UID: "2a6ebd8c-8e47-47a8-82d7-e7361aab3c02"). InnerVolumeSpecName "kube-api-access-c5dvp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:24:41.519154 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.519136 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" (UID: "2a6ebd8c-8e47-47a8-82d7-e7361aab3c02"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:24:41.617657 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.617563 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:24:41.617657 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.617599 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c5dvp\" (UniqueName: \"kubernetes.io/projected/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-kube-api-access-c5dvp\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:24:41.617657 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.617610 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:24:41.617657 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:41.617619 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:24:42.485932 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:42.485904 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" Apr 23 14:24:42.486371 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:42.485900 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5" event={"ID":"2a6ebd8c-8e47-47a8-82d7-e7361aab3c02","Type":"ContainerDied","Data":"c4997803d45552b6b141db76690ccdc759d09fb7f025a98b67e51375d21cc821"} Apr 23 14:24:42.486371 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:42.486030 2565 scope.go:117] "RemoveContainer" containerID="fa274e82537014a75cd0374d3e0f8e6e9cb30ede53ae7af6772f29e7378969f4" Apr 23 14:24:42.493773 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:42.493735 2565 scope.go:117] "RemoveContainer" containerID="bd44877c2b9f3793442deca8caf8c19d835a4e9a2da94fed540dd4f7077061de" Apr 23 14:24:42.500674 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:42.500653 2565 scope.go:117] "RemoveContainer" containerID="c2acb1c49dee31bf6b3d98c21dc7ecb838a5c55d82955b98bd35ec9a5e9e4e46" Apr 23 14:24:42.505109 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:42.505088 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5"] Apr 23 14:24:42.509542 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:42.509522 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-xjcl5"] Apr 23 14:24:43.842689 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:24:43.842651 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" path="/var/lib/kubelet/pods/2a6ebd8c-8e47-47a8-82d7-e7361aab3c02/volumes" Apr 23 14:26:29.961935 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:26:29.961901 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:26:29.970344 
ip-10-0-137-187 kubenswrapper[2565]: I0423 14:26:29.970321 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:30:18.697212 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.697176 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn"] Apr 23 14:30:18.698137 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.697527 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kserve-container" Apr 23 14:30:18.698137 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.697541 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kserve-container" Apr 23 14:30:18.698137 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.697554 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="storage-initializer" Apr 23 14:30:18.698137 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.697560 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="storage-initializer" Apr 23 14:30:18.698137 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.697567 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kube-rbac-proxy" Apr 23 14:30:18.698137 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.697572 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kube-rbac-proxy" Apr 23 14:30:18.698137 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.697620 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" 
containerName="kserve-container" Apr 23 14:30:18.698137 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.697630 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a6ebd8c-8e47-47a8-82d7-e7361aab3c02" containerName="kube-rbac-proxy" Apr 23 14:30:18.700749 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.700731 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:18.703323 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.703304 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\"" Apr 23 14:30:18.703392 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.703321 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 14:30:18.703614 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.703595 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 23 14:30:18.703695 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.703644 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\"" Apr 23 14:30:18.703695 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.703660 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 14:30:18.703695 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.703675 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 14:30:18.711688 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.711668 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn"] Apr 23 
14:30:18.750741 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.750710 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7be50eb8-b001-45e9-96c0-5e4fab78d57b-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-5d9949bc59-68dwn\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:18.750923 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.750835 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7be50eb8-b001-45e9-96c0-5e4fab78d57b-proxy-tls\") pod \"isvc-sklearn-s3-predictor-5d9949bc59-68dwn\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:18.750923 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.750867 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kssf\" (UniqueName: \"kubernetes.io/projected/7be50eb8-b001-45e9-96c0-5e4fab78d57b-kube-api-access-8kssf\") pod \"isvc-sklearn-s3-predictor-5d9949bc59-68dwn\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:18.750923 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.750912 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7be50eb8-b001-45e9-96c0-5e4fab78d57b-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-5d9949bc59-68dwn\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:18.852030 ip-10-0-137-187 kubenswrapper[2565]: I0423 
14:30:18.851999 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7be50eb8-b001-45e9-96c0-5e4fab78d57b-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-5d9949bc59-68dwn\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:18.852204 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.852043 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7be50eb8-b001-45e9-96c0-5e4fab78d57b-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-5d9949bc59-68dwn\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:18.852204 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.852095 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7be50eb8-b001-45e9-96c0-5e4fab78d57b-proxy-tls\") pod \"isvc-sklearn-s3-predictor-5d9949bc59-68dwn\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:18.852204 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.852111 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kssf\" (UniqueName: \"kubernetes.io/projected/7be50eb8-b001-45e9-96c0-5e4fab78d57b-kube-api-access-8kssf\") pod \"isvc-sklearn-s3-predictor-5d9949bc59-68dwn\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:18.852204 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:30:18.852199 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-predictor-serving-cert: secret 
"isvc-sklearn-s3-predictor-serving-cert" not found Apr 23 14:30:18.852471 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:30:18.852255 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7be50eb8-b001-45e9-96c0-5e4fab78d57b-proxy-tls podName:7be50eb8-b001-45e9-96c0-5e4fab78d57b nodeName:}" failed. No retries permitted until 2026-04-23 14:30:19.352233284 +0000 UTC m=+3530.090777230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7be50eb8-b001-45e9-96c0-5e4fab78d57b-proxy-tls") pod "isvc-sklearn-s3-predictor-5d9949bc59-68dwn" (UID: "7be50eb8-b001-45e9-96c0-5e4fab78d57b") : secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 23 14:30:18.852471 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.852427 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7be50eb8-b001-45e9-96c0-5e4fab78d57b-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-5d9949bc59-68dwn\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:18.852709 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.852690 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7be50eb8-b001-45e9-96c0-5e4fab78d57b-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-5d9949bc59-68dwn\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:18.862978 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:18.862954 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kssf\" (UniqueName: \"kubernetes.io/projected/7be50eb8-b001-45e9-96c0-5e4fab78d57b-kube-api-access-8kssf\") pod 
\"isvc-sklearn-s3-predictor-5d9949bc59-68dwn\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:19.355341 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:19.355303 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7be50eb8-b001-45e9-96c0-5e4fab78d57b-proxy-tls\") pod \"isvc-sklearn-s3-predictor-5d9949bc59-68dwn\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:19.357629 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:19.357600 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7be50eb8-b001-45e9-96c0-5e4fab78d57b-proxy-tls\") pod \"isvc-sklearn-s3-predictor-5d9949bc59-68dwn\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:19.610940 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:19.610859 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:19.733873 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:19.733851 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn"] Apr 23 14:30:19.735822 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:30:19.735789 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7be50eb8_b001_45e9_96c0_5e4fab78d57b.slice/crio-23d47dc8655c44294cf96410acc4cac556946d951e45a1cdc87a7cd292eb4661 WatchSource:0}: Error finding container 23d47dc8655c44294cf96410acc4cac556946d951e45a1cdc87a7cd292eb4661: Status 404 returned error can't find the container with id 23d47dc8655c44294cf96410acc4cac556946d951e45a1cdc87a7cd292eb4661 Apr 23 14:30:19.738037 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:19.738022 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 14:30:20.568359 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:20.568324 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" event={"ID":"7be50eb8-b001-45e9-96c0-5e4fab78d57b","Type":"ContainerStarted","Data":"1c78d99f2ed873e8cb9dd593b56525032f8e0c368d8ab8409c4272f759bd6f2c"} Apr 23 14:30:20.568359 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:20.568361 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" event={"ID":"7be50eb8-b001-45e9-96c0-5e4fab78d57b","Type":"ContainerStarted","Data":"23d47dc8655c44294cf96410acc4cac556946d951e45a1cdc87a7cd292eb4661"} Apr 23 14:30:21.573238 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:21.573199 2565 generic.go:358] "Generic (PLEG): container finished" podID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" 
containerID="1c78d99f2ed873e8cb9dd593b56525032f8e0c368d8ab8409c4272f759bd6f2c" exitCode=0 Apr 23 14:30:21.573619 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:21.573277 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" event={"ID":"7be50eb8-b001-45e9-96c0-5e4fab78d57b","Type":"ContainerDied","Data":"1c78d99f2ed873e8cb9dd593b56525032f8e0c368d8ab8409c4272f759bd6f2c"} Apr 23 14:30:22.578432 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:22.578398 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" event={"ID":"7be50eb8-b001-45e9-96c0-5e4fab78d57b","Type":"ContainerStarted","Data":"4edf6058c3d1d1dcad0b370bf1bb59897855b5d5d8b77f57e97a6789abeabcf1"} Apr 23 14:30:22.578432 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:22.578433 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" event={"ID":"7be50eb8-b001-45e9-96c0-5e4fab78d57b","Type":"ContainerStarted","Data":"6ae1349d1a075716b44348ddcf5d76fe57976d1319e20d7d3590544f1d6fdf76"} Apr 23 14:30:22.578857 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:22.578571 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:22.599834 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:22.599791 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" podStartSLOduration=4.599779765 podStartE2EDuration="4.599779765s" podCreationTimestamp="2026-04-23 14:30:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:30:22.598928738 +0000 UTC m=+3533.337472707" watchObservedRunningTime="2026-04-23 14:30:22.599779765 +0000 UTC m=+3533.338323730" 
Apr 23 14:30:23.581451 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:23.581419 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:23.582860 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:23.582833 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 14:30:24.584159 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:24.584116 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 14:30:29.589307 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:29.589275 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:30:29.589793 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:29.589751 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 14:30:39.590699 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:39.590661 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 14:30:49.590028 ip-10-0-137-187 kubenswrapper[2565]: 
I0423 14:30:49.589990 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 14:30:59.590367 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:30:59.590323 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 14:31:09.590253 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:09.590213 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 14:31:19.590823 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:19.590719 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 14:31:29.590719 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:29.590686 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:31:29.986552 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:29.986522 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:31:29.994325 ip-10-0-137-187 kubenswrapper[2565]: I0423 
14:31:29.994301 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:31:38.814249 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:38.814211 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn"] Apr 23 14:31:38.814779 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:38.814518 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kserve-container" containerID="cri-o://6ae1349d1a075716b44348ddcf5d76fe57976d1319e20d7d3590544f1d6fdf76" gracePeriod=30 Apr 23 14:31:38.814779 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:38.814559 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kube-rbac-proxy" containerID="cri-o://4edf6058c3d1d1dcad0b370bf1bb59897855b5d5d8b77f57e97a6789abeabcf1" gracePeriod=30 Apr 23 14:31:38.904688 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:38.904657 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn"] Apr 23 14:31:38.908115 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:38.908100 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:38.910852 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:38.910830 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 14:31:38.910947 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:38.910848 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\"" Apr 23 14:31:38.910947 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:38.910874 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\"" Apr 23 14:31:38.918520 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:38.918497 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn"] Apr 23 14:31:38.995010 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:38.994987 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcc95e51-eda4-4ce7-832b-655e13495988-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:38.995112 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:38.995016 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/bcc95e51-eda4-4ce7-832b-655e13495988-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:38.995112 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:38.995085 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bcc95e51-eda4-4ce7-832b-655e13495988-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:38.995193 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:38.995145 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bcc95e51-eda4-4ce7-832b-655e13495988-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:38.995193 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:38.995177 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzxxx\" (UniqueName: \"kubernetes.io/projected/bcc95e51-eda4-4ce7-832b-655e13495988-kube-api-access-nzxxx\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:39.096042 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.095980 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/bcc95e51-eda4-4ce7-832b-655e13495988-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:39.096042 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.096009 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bcc95e51-eda4-4ce7-832b-655e13495988-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:39.096042 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.096038 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzxxx\" (UniqueName: \"kubernetes.io/projected/bcc95e51-eda4-4ce7-832b-655e13495988-kube-api-access-nzxxx\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:39.096263 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.096083 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcc95e51-eda4-4ce7-832b-655e13495988-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:39.096263 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.096102 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/bcc95e51-eda4-4ce7-832b-655e13495988-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:39.096263 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:31:39.096130 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-global-pass-predictor-serving-cert" not found Apr 23 14:31:39.096263 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:31:39.096187 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcc95e51-eda4-4ce7-832b-655e13495988-proxy-tls podName:bcc95e51-eda4-4ce7-832b-655e13495988 nodeName:}" failed. No retries permitted until 2026-04-23 14:31:39.596171082 +0000 UTC m=+3610.334715026 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/bcc95e51-eda4-4ce7-832b-655e13495988-proxy-tls") pod "isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" (UID: "bcc95e51-eda4-4ce7-832b-655e13495988") : secret "isvc-sklearn-s3-tls-global-pass-predictor-serving-cert" not found Apr 23 14:31:39.096526 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.096501 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcc95e51-eda4-4ce7-832b-655e13495988-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:39.096785 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.096739 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/bcc95e51-eda4-4ce7-832b-655e13495988-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:39.096857 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.096753 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bcc95e51-eda4-4ce7-832b-655e13495988-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:39.105049 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.105027 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzxxx\" (UniqueName: \"kubernetes.io/projected/bcc95e51-eda4-4ce7-832b-655e13495988-kube-api-access-nzxxx\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:39.585283 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.585245 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.44:8643/healthz\": dial tcp 10.132.0.44:8643: connect: connection refused" Apr 23 14:31:39.590653 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.590631 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 14:31:39.600364 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.600341 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bcc95e51-eda4-4ce7-832b-655e13495988-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:39.602685 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.602663 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bcc95e51-eda4-4ce7-832b-655e13495988-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:39.817540 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.817512 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:39.825406 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.825380 2565 generic.go:358] "Generic (PLEG): container finished" podID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerID="4edf6058c3d1d1dcad0b370bf1bb59897855b5d5d8b77f57e97a6789abeabcf1" exitCode=2 Apr 23 14:31:39.825518 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.825442 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" event={"ID":"7be50eb8-b001-45e9-96c0-5e4fab78d57b","Type":"ContainerDied","Data":"4edf6058c3d1d1dcad0b370bf1bb59897855b5d5d8b77f57e97a6789abeabcf1"} Apr 23 14:31:39.965649 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:39.965566 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn"] Apr 23 14:31:39.968454 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:31:39.968426 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc95e51_eda4_4ce7_832b_655e13495988.slice/crio-7184455e31d6496cf23f27ea05b6edbb453d7038d0c5b8d023ab347d8f42e354 WatchSource:0}: Error finding container 7184455e31d6496cf23f27ea05b6edbb453d7038d0c5b8d023ab347d8f42e354: Status 404 returned error can't find the container with id 7184455e31d6496cf23f27ea05b6edbb453d7038d0c5b8d023ab347d8f42e354 Apr 23 14:31:40.830442 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:40.830403 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" event={"ID":"bcc95e51-eda4-4ce7-832b-655e13495988","Type":"ContainerStarted","Data":"3ff3c0a5a371b9f1f68f7c0a8f87f70851368a095c0f63640f1cb23749d0d29e"} Apr 23 14:31:40.830442 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:40.830444 2565 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" event={"ID":"bcc95e51-eda4-4ce7-832b-655e13495988","Type":"ContainerStarted","Data":"7184455e31d6496cf23f27ea05b6edbb453d7038d0c5b8d023ab347d8f42e354"} Apr 23 14:31:41.836008 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:41.835974 2565 generic.go:358] "Generic (PLEG): container finished" podID="bcc95e51-eda4-4ce7-832b-655e13495988" containerID="3ff3c0a5a371b9f1f68f7c0a8f87f70851368a095c0f63640f1cb23749d0d29e" exitCode=0 Apr 23 14:31:41.839511 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:41.839481 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" event={"ID":"bcc95e51-eda4-4ce7-832b-655e13495988","Type":"ContainerDied","Data":"3ff3c0a5a371b9f1f68f7c0a8f87f70851368a095c0f63640f1cb23749d0d29e"} Apr 23 14:31:42.842098 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.842068 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" event={"ID":"bcc95e51-eda4-4ce7-832b-655e13495988","Type":"ContainerStarted","Data":"dff2562c7b0a78d1ad6ed8620369b7f513b1f7306fe9115a9696ba0ce27381fb"} Apr 23 14:31:42.842455 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.842110 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" event={"ID":"bcc95e51-eda4-4ce7-832b-655e13495988","Type":"ContainerStarted","Data":"fae82ab122b558e325c0f1bcd9fe8647219d113b6e10b1485cfbe50177aeff55"} Apr 23 14:31:42.842455 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.842266 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:42.842455 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.842286 2565 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" Apr 23 14:31:42.843483 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.843446 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 23 14:31:42.844299 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.844276 2565 generic.go:358] "Generic (PLEG): container finished" podID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerID="6ae1349d1a075716b44348ddcf5d76fe57976d1319e20d7d3590544f1d6fdf76" exitCode=0 Apr 23 14:31:42.844377 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.844319 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" event={"ID":"7be50eb8-b001-45e9-96c0-5e4fab78d57b","Type":"ContainerDied","Data":"6ae1349d1a075716b44348ddcf5d76fe57976d1319e20d7d3590544f1d6fdf76"} Apr 23 14:31:42.854224 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.854207 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" Apr 23 14:31:42.866579 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.866543 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" podStartSLOduration=4.866531934 podStartE2EDuration="4.866531934s" podCreationTimestamp="2026-04-23 14:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:31:42.864582833 +0000 UTC m=+3613.603126798" watchObservedRunningTime="2026-04-23 14:31:42.866531934 +0000 UTC m=+3613.605075900" Apr 23 14:31:42.931111 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.931089 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kssf\" (UniqueName: \"kubernetes.io/projected/7be50eb8-b001-45e9-96c0-5e4fab78d57b-kube-api-access-8kssf\") pod \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " Apr 23 14:31:42.931204 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.931150 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7be50eb8-b001-45e9-96c0-5e4fab78d57b-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " Apr 23 14:31:42.931204 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.931181 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7be50eb8-b001-45e9-96c0-5e4fab78d57b-proxy-tls\") pod \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " Apr 23 14:31:42.931342 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.931252 2565 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7be50eb8-b001-45e9-96c0-5e4fab78d57b-kserve-provision-location\") pod \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\" (UID: \"7be50eb8-b001-45e9-96c0-5e4fab78d57b\") " Apr 23 14:31:42.931572 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.931545 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7be50eb8-b001-45e9-96c0-5e4fab78d57b-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") pod "7be50eb8-b001-45e9-96c0-5e4fab78d57b" (UID: "7be50eb8-b001-45e9-96c0-5e4fab78d57b"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:31:42.931675 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.931597 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7be50eb8-b001-45e9-96c0-5e4fab78d57b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7be50eb8-b001-45e9-96c0-5e4fab78d57b" (UID: "7be50eb8-b001-45e9-96c0-5e4fab78d57b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 14:31:42.931743 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.931721 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7be50eb8-b001-45e9-96c0-5e4fab78d57b-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:31:42.931838 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.931740 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7be50eb8-b001-45e9-96c0-5e4fab78d57b-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:31:42.933246 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.933221 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7be50eb8-b001-45e9-96c0-5e4fab78d57b-kube-api-access-8kssf" (OuterVolumeSpecName: "kube-api-access-8kssf") pod "7be50eb8-b001-45e9-96c0-5e4fab78d57b" (UID: "7be50eb8-b001-45e9-96c0-5e4fab78d57b"). InnerVolumeSpecName "kube-api-access-8kssf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 14:31:42.933414 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:42.933395 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be50eb8-b001-45e9-96c0-5e4fab78d57b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7be50eb8-b001-45e9-96c0-5e4fab78d57b" (UID: "7be50eb8-b001-45e9-96c0-5e4fab78d57b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 14:31:43.032211 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:43.032185 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7be50eb8-b001-45e9-96c0-5e4fab78d57b-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:31:43.032211 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:43.032208 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8kssf\" (UniqueName: \"kubernetes.io/projected/7be50eb8-b001-45e9-96c0-5e4fab78d57b-kube-api-access-8kssf\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:31:43.848989 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:43.848967 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn"
Apr 23 14:31:43.849370 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:43.848963 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn" event={"ID":"7be50eb8-b001-45e9-96c0-5e4fab78d57b","Type":"ContainerDied","Data":"23d47dc8655c44294cf96410acc4cac556946d951e45a1cdc87a7cd292eb4661"}
Apr 23 14:31:43.849370 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:43.849076 2565 scope.go:117] "RemoveContainer" containerID="4edf6058c3d1d1dcad0b370bf1bb59897855b5d5d8b77f57e97a6789abeabcf1"
Apr 23 14:31:43.849726 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:43.849695 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 23 14:31:43.856903 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:43.856886 2565 scope.go:117] "RemoveContainer" containerID="6ae1349d1a075716b44348ddcf5d76fe57976d1319e20d7d3590544f1d6fdf76"
Apr 23 14:31:43.863683 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:43.863667 2565 scope.go:117] "RemoveContainer" containerID="1c78d99f2ed873e8cb9dd593b56525032f8e0c368d8ab8409c4272f759bd6f2c"
Apr 23 14:31:43.868035 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:43.868016 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn"]
Apr 23 14:31:43.871815 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:43.871796 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5d9949bc59-68dwn"]
Apr 23 14:31:45.841273 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:45.841231 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" path="/var/lib/kubelet/pods/7be50eb8-b001-45e9-96c0-5e4fab78d57b/volumes"
Apr 23 14:31:48.853646 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:48.853619 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn"
Apr 23 14:31:48.854247 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:48.854212 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 23 14:31:58.854952 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:31:58.854917 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 23 14:32:08.854637 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:32:08.854586 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 23 14:32:18.854991 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:32:18.854949 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 23 14:32:28.854312 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:32:28.854273 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 23 14:32:38.854207 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:32:38.854172 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 23 14:32:48.854940 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:32:48.854906 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn"
Apr 23 14:32:59.023256 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:32:59.023217 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn"]
Apr 23 14:32:59.023750 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:32:59.023695 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kserve-container" containerID="cri-o://fae82ab122b558e325c0f1bcd9fe8647219d113b6e10b1485cfbe50177aeff55" gracePeriod=30
Apr 23 14:32:59.023867 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:32:59.023792 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kube-rbac-proxy" containerID="cri-o://dff2562c7b0a78d1ad6ed8620369b7f513b1f7306fe9115a9696ba0ce27381fb" gracePeriod=30
Apr 23 14:33:00.043358 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.043319 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"]
Apr 23 14:33:00.043752 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.043658 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kserve-container"
Apr 23 14:33:00.043752 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.043669 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kserve-container"
Apr 23 14:33:00.043752 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.043678 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="storage-initializer"
Apr 23 14:33:00.043752 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.043684 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="storage-initializer"
Apr 23 14:33:00.043752 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.043694 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kube-rbac-proxy"
Apr 23 14:33:00.043752 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.043700 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kube-rbac-proxy"
Apr 23 14:33:00.043752 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.043773 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kube-rbac-proxy"
Apr 23 14:33:00.044015 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.043785 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7be50eb8-b001-45e9-96c0-5e4fab78d57b" containerName="kserve-container"
Apr 23 14:33:00.046986 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.046969 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:00.050679 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.050654 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\""
Apr 23 14:33:00.050679 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.050669 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\""
Apr 23 14:33:00.061197 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.061171 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"]
Apr 23 14:33:00.104142 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.104107 2565 generic.go:358] "Generic (PLEG): container finished" podID="bcc95e51-eda4-4ce7-832b-655e13495988" containerID="dff2562c7b0a78d1ad6ed8620369b7f513b1f7306fe9115a9696ba0ce27381fb" exitCode=2
Apr 23 14:33:00.104333 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.104139 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" event={"ID":"bcc95e51-eda4-4ce7-832b-655e13495988","Type":"ContainerDied","Data":"dff2562c7b0a78d1ad6ed8620369b7f513b1f7306fe9115a9696ba0ce27381fb"}
Apr 23 14:33:00.151135 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.151097 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7c6437b7-859c-4262-9bf0-4a43c225af01-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:00.151135 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.151142 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt7sz\" (UniqueName: \"kubernetes.io/projected/7c6437b7-859c-4262-9bf0-4a43c225af01-kube-api-access-xt7sz\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:00.151415 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.151212 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c6437b7-859c-4262-9bf0-4a43c225af01-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:00.151415 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.151301 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c6437b7-859c-4262-9bf0-4a43c225af01-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:00.252663 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.252616 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7c6437b7-859c-4262-9bf0-4a43c225af01-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:00.252663 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.252667 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt7sz\" (UniqueName: \"kubernetes.io/projected/7c6437b7-859c-4262-9bf0-4a43c225af01-kube-api-access-xt7sz\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:00.252998 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.252705 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c6437b7-859c-4262-9bf0-4a43c225af01-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:00.252998 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.252753 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c6437b7-859c-4262-9bf0-4a43c225af01-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:00.252998 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:33:00.252933 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found
Apr 23 14:33:00.252998 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:33:00.252996 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c6437b7-859c-4262-9bf0-4a43c225af01-proxy-tls podName:7c6437b7-859c-4262-9bf0-4a43c225af01 nodeName:}" failed. No retries permitted until 2026-04-23 14:33:00.752973908 +0000 UTC m=+3691.491517864 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7c6437b7-859c-4262-9bf0-4a43c225af01-proxy-tls") pod "isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk" (UID: "7c6437b7-859c-4262-9bf0-4a43c225af01") : secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found
Apr 23 14:33:00.253235 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.253143 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c6437b7-859c-4262-9bf0-4a43c225af01-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:00.253361 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.253342 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7c6437b7-859c-4262-9bf0-4a43c225af01-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:00.262463 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.262436 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt7sz\" (UniqueName: \"kubernetes.io/projected/7c6437b7-859c-4262-9bf0-4a43c225af01-kube-api-access-xt7sz\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:00.758602 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.758568 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c6437b7-859c-4262-9bf0-4a43c225af01-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:00.761166 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.761130 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c6437b7-859c-4262-9bf0-4a43c225af01-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:00.956878 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:00.956832 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"
Apr 23 14:33:01.090617 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:01.090583 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"]
Apr 23 14:33:01.093003 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:33:01.092969 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c6437b7_859c_4262_9bf0_4a43c225af01.slice/crio-047dfdc38c5388ebd7723fff1ad43377da8f46dbc69c5c9de6a3fd87306e069f WatchSource:0}: Error finding container 047dfdc38c5388ebd7723fff1ad43377da8f46dbc69c5c9de6a3fd87306e069f: Status 404 returned error can't find the container with id 047dfdc38c5388ebd7723fff1ad43377da8f46dbc69c5c9de6a3fd87306e069f
Apr 23 14:33:01.109300 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:01.109270 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk" event={"ID":"7c6437b7-859c-4262-9bf0-4a43c225af01","Type":"ContainerStarted","Data":"047dfdc38c5388ebd7723fff1ad43377da8f46dbc69c5c9de6a3fd87306e069f"}
Apr 23 14:33:02.114021 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:02.113977 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk" event={"ID":"7c6437b7-859c-4262-9bf0-4a43c225af01","Type":"ContainerStarted","Data":"d5cb4003ccd17922cda3b8879f4a37f1baf8a8358f83a4b7cd537d5a15594d90"}
Apr 23 14:33:03.671061 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.671037 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn"
Apr 23 14:33:03.783741 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.783650 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bcc95e51-eda4-4ce7-832b-655e13495988-proxy-tls\") pod \"bcc95e51-eda4-4ce7-832b-655e13495988\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") "
Apr 23 14:33:03.783741 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.783694 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcc95e51-eda4-4ce7-832b-655e13495988-kserve-provision-location\") pod \"bcc95e51-eda4-4ce7-832b-655e13495988\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") "
Apr 23 14:33:03.783981 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.783749 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bcc95e51-eda4-4ce7-832b-655e13495988-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"bcc95e51-eda4-4ce7-832b-655e13495988\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") "
Apr 23 14:33:03.783981 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.783788 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzxxx\" (UniqueName: \"kubernetes.io/projected/bcc95e51-eda4-4ce7-832b-655e13495988-kube-api-access-nzxxx\") pod \"bcc95e51-eda4-4ce7-832b-655e13495988\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") "
Apr 23 14:33:03.783981 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.783817 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/bcc95e51-eda4-4ce7-832b-655e13495988-cabundle-cert\") pod \"bcc95e51-eda4-4ce7-832b-655e13495988\" (UID: \"bcc95e51-eda4-4ce7-832b-655e13495988\") "
Apr 23 14:33:03.784238 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.784203 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc95e51-eda4-4ce7-832b-655e13495988-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bcc95e51-eda4-4ce7-832b-655e13495988" (UID: "bcc95e51-eda4-4ce7-832b-655e13495988"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 14:33:03.784369 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.784245 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc95e51-eda4-4ce7-832b-655e13495988-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "bcc95e51-eda4-4ce7-832b-655e13495988" (UID: "bcc95e51-eda4-4ce7-832b-655e13495988"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 14:33:03.784369 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.784264 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc95e51-eda4-4ce7-832b-655e13495988-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "bcc95e51-eda4-4ce7-832b-655e13495988" (UID: "bcc95e51-eda4-4ce7-832b-655e13495988"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 14:33:03.786053 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.786030 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc95e51-eda4-4ce7-832b-655e13495988-kube-api-access-nzxxx" (OuterVolumeSpecName: "kube-api-access-nzxxx") pod "bcc95e51-eda4-4ce7-832b-655e13495988" (UID: "bcc95e51-eda4-4ce7-832b-655e13495988"). InnerVolumeSpecName "kube-api-access-nzxxx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 14:33:03.786205 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.786187 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc95e51-eda4-4ce7-832b-655e13495988-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bcc95e51-eda4-4ce7-832b-655e13495988" (UID: "bcc95e51-eda4-4ce7-832b-655e13495988"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 14:33:03.884601 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.884559 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bcc95e51-eda4-4ce7-832b-655e13495988-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:33:03.884601 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.884591 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcc95e51-eda4-4ce7-832b-655e13495988-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:33:03.884601 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.884603 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bcc95e51-eda4-4ce7-832b-655e13495988-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:33:03.884872 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.884615 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nzxxx\" (UniqueName: \"kubernetes.io/projected/bcc95e51-eda4-4ce7-832b-655e13495988-kube-api-access-nzxxx\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:33:03.884872 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:03.884625 2565 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/bcc95e51-eda4-4ce7-832b-655e13495988-cabundle-cert\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:33:04.122623 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.122518 2565 generic.go:358] "Generic (PLEG): container finished" podID="bcc95e51-eda4-4ce7-832b-655e13495988" containerID="fae82ab122b558e325c0f1bcd9fe8647219d113b6e10b1485cfbe50177aeff55" exitCode=0
Apr 23 14:33:04.122623 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.122598 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" event={"ID":"bcc95e51-eda4-4ce7-832b-655e13495988","Type":"ContainerDied","Data":"fae82ab122b558e325c0f1bcd9fe8647219d113b6e10b1485cfbe50177aeff55"}
Apr 23 14:33:04.122915 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.122647 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn" event={"ID":"bcc95e51-eda4-4ce7-832b-655e13495988","Type":"ContainerDied","Data":"7184455e31d6496cf23f27ea05b6edbb453d7038d0c5b8d023ab347d8f42e354"}
Apr 23 14:33:04.122915 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.122664 2565 scope.go:117] "RemoveContainer" containerID="dff2562c7b0a78d1ad6ed8620369b7f513b1f7306fe9115a9696ba0ce27381fb"
Apr 23 14:33:04.122915 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.122604 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn"
Apr 23 14:33:04.130694 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.130674 2565 scope.go:117] "RemoveContainer" containerID="fae82ab122b558e325c0f1bcd9fe8647219d113b6e10b1485cfbe50177aeff55"
Apr 23 14:33:04.139855 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.139836 2565 scope.go:117] "RemoveContainer" containerID="3ff3c0a5a371b9f1f68f7c0a8f87f70851368a095c0f63640f1cb23749d0d29e"
Apr 23 14:33:04.141634 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.141613 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn"]
Apr 23 14:33:04.143155 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.143134 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-86f54c547-k8fcn"]
Apr 23 14:33:04.147424 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.147408 2565 scope.go:117] "RemoveContainer" containerID="dff2562c7b0a78d1ad6ed8620369b7f513b1f7306fe9115a9696ba0ce27381fb"
Apr 23 14:33:04.147686 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:33:04.147659 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dff2562c7b0a78d1ad6ed8620369b7f513b1f7306fe9115a9696ba0ce27381fb\": container with ID starting with dff2562c7b0a78d1ad6ed8620369b7f513b1f7306fe9115a9696ba0ce27381fb not found: ID does not exist" containerID="dff2562c7b0a78d1ad6ed8620369b7f513b1f7306fe9115a9696ba0ce27381fb"
Apr 23 14:33:04.147740 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.147685 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dff2562c7b0a78d1ad6ed8620369b7f513b1f7306fe9115a9696ba0ce27381fb"} err="failed to get container status \"dff2562c7b0a78d1ad6ed8620369b7f513b1f7306fe9115a9696ba0ce27381fb\": rpc error: code = NotFound desc = could not find container \"dff2562c7b0a78d1ad6ed8620369b7f513b1f7306fe9115a9696ba0ce27381fb\": container with ID starting with dff2562c7b0a78d1ad6ed8620369b7f513b1f7306fe9115a9696ba0ce27381fb not found: ID does not exist"
Apr 23 14:33:04.147740 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.147704 2565 scope.go:117] "RemoveContainer" containerID="fae82ab122b558e325c0f1bcd9fe8647219d113b6e10b1485cfbe50177aeff55"
Apr 23 14:33:04.147951 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:33:04.147926 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fae82ab122b558e325c0f1bcd9fe8647219d113b6e10b1485cfbe50177aeff55\": container with ID starting with fae82ab122b558e325c0f1bcd9fe8647219d113b6e10b1485cfbe50177aeff55 not found: ID does not exist" containerID="fae82ab122b558e325c0f1bcd9fe8647219d113b6e10b1485cfbe50177aeff55"
Apr 23 14:33:04.147991 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.147958 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fae82ab122b558e325c0f1bcd9fe8647219d113b6e10b1485cfbe50177aeff55"} err="failed to get container status \"fae82ab122b558e325c0f1bcd9fe8647219d113b6e10b1485cfbe50177aeff55\": rpc error: code = NotFound desc = could not find container \"fae82ab122b558e325c0f1bcd9fe8647219d113b6e10b1485cfbe50177aeff55\": container with ID starting with fae82ab122b558e325c0f1bcd9fe8647219d113b6e10b1485cfbe50177aeff55 not found: ID does not exist"
Apr 23 14:33:04.147991 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.147975 2565 scope.go:117] "RemoveContainer" containerID="3ff3c0a5a371b9f1f68f7c0a8f87f70851368a095c0f63640f1cb23749d0d29e"
Apr 23 14:33:04.148170 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:33:04.148149 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ff3c0a5a371b9f1f68f7c0a8f87f70851368a095c0f63640f1cb23749d0d29e\": container with ID starting with 3ff3c0a5a371b9f1f68f7c0a8f87f70851368a095c0f63640f1cb23749d0d29e not found: ID does not exist" containerID="3ff3c0a5a371b9f1f68f7c0a8f87f70851368a095c0f63640f1cb23749d0d29e"
Apr 23 14:33:04.148214 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:04.148175 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ff3c0a5a371b9f1f68f7c0a8f87f70851368a095c0f63640f1cb23749d0d29e"} err="failed to get container status \"3ff3c0a5a371b9f1f68f7c0a8f87f70851368a095c0f63640f1cb23749d0d29e\": rpc error: code = NotFound desc = could not find container \"3ff3c0a5a371b9f1f68f7c0a8f87f70851368a095c0f63640f1cb23749d0d29e\": container with ID starting with 3ff3c0a5a371b9f1f68f7c0a8f87f70851368a095c0f63640f1cb23749d0d29e not found: ID does not exist"
Apr 23 14:33:05.841659 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:05.841620 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" path="/var/lib/kubelet/pods/bcc95e51-eda4-4ce7-832b-655e13495988/volumes"
Apr 23 14:33:06.130932 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:06.130857 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk_7c6437b7-859c-4262-9bf0-4a43c225af01/storage-initializer/0.log"
Apr 23 14:33:06.130932 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:06.130895 2565 generic.go:358] "Generic (PLEG): container finished" podID="7c6437b7-859c-4262-9bf0-4a43c225af01" containerID="d5cb4003ccd17922cda3b8879f4a37f1baf8a8358f83a4b7cd537d5a15594d90" exitCode=1
Apr 23 14:33:06.131124 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:06.130976 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk" event={"ID":"7c6437b7-859c-4262-9bf0-4a43c225af01","Type":"ContainerDied","Data":"d5cb4003ccd17922cda3b8879f4a37f1baf8a8358f83a4b7cd537d5a15594d90"}
Apr 23 14:33:07.136290 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:07.136261 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk_7c6437b7-859c-4262-9bf0-4a43c225af01/storage-initializer/0.log"
Apr 23 14:33:07.136677 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:07.136322 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk" event={"ID":"7c6437b7-859c-4262-9bf0-4a43c225af01","Type":"ContainerStarted","Data":"71801a8fe271d1db500296f18ac55e745a84737e6abceba39b9fd1962bcb91c0"}
Apr 23 14:33:09.144359 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:09.144333 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk_7c6437b7-859c-4262-9bf0-4a43c225af01/storage-initializer/1.log"
Apr 23 14:33:09.144745 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:09.144680 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk_7c6437b7-859c-4262-9bf0-4a43c225af01/storage-initializer/0.log"
Apr 23 14:33:09.144745 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:09.144716 2565 generic.go:358] "Generic (PLEG): container finished" podID="7c6437b7-859c-4262-9bf0-4a43c225af01" containerID="71801a8fe271d1db500296f18ac55e745a84737e6abceba39b9fd1962bcb91c0" exitCode=1
Apr 23 14:33:09.144842 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:09.144797 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk" event={"ID":"7c6437b7-859c-4262-9bf0-4a43c225af01","Type":"ContainerDied","Data":"71801a8fe271d1db500296f18ac55e745a84737e6abceba39b9fd1962bcb91c0"}
Apr 23 14:33:09.144879 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:09.144840 2565 scope.go:117] "RemoveContainer" containerID="d5cb4003ccd17922cda3b8879f4a37f1baf8a8358f83a4b7cd537d5a15594d90"
Apr 23 14:33:09.145164 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:09.145142 2565 scope.go:117] "RemoveContainer" containerID="d5cb4003ccd17922cda3b8879f4a37f1baf8a8358f83a4b7cd537d5a15594d90"
Apr 23 14:33:09.155321 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:33:09.155291 2565 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk_kserve-ci-e2e-test_7c6437b7-859c-4262-9bf0-4a43c225af01_0 in pod sandbox 047dfdc38c5388ebd7723fff1ad43377da8f46dbc69c5c9de6a3fd87306e069f from index: no such id: 'd5cb4003ccd17922cda3b8879f4a37f1baf8a8358f83a4b7cd537d5a15594d90'" containerID="d5cb4003ccd17922cda3b8879f4a37f1baf8a8358f83a4b7cd537d5a15594d90"
Apr 23 14:33:09.155397 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:33:09.155337 2565 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk_kserve-ci-e2e-test_7c6437b7-859c-4262-9bf0-4a43c225af01_0 in pod sandbox 047dfdc38c5388ebd7723fff1ad43377da8f46dbc69c5c9de6a3fd87306e069f from index: no such id: 'd5cb4003ccd17922cda3b8879f4a37f1baf8a8358f83a4b7cd537d5a15594d90'; Skipping pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk_kserve-ci-e2e-test(7c6437b7-859c-4262-9bf0-4a43c225af01)\"" logger="UnhandledError"
Apr 23 14:33:09.156662 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:33:09.156637 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk_kserve-ci-e2e-test(7c6437b7-859c-4262-9bf0-4a43c225af01)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk" podUID="7c6437b7-859c-4262-9bf0-4a43c225af01"
Apr 23 14:33:10.047745 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.047715 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"]
Apr 23 14:33:10.149202 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.149173 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk_7c6437b7-859c-4262-9bf0-4a43c225af01/storage-initializer/1.log"
Apr 23 14:33:10.283366 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.283343 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk_7c6437b7-859c-4262-9bf0-4a43c225af01/storage-initializer/1.log"
Apr 23 14:33:10.283498 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.283406 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk" Apr 23 14:33:10.446702 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.446669 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c6437b7-859c-4262-9bf0-4a43c225af01-kserve-provision-location\") pod \"7c6437b7-859c-4262-9bf0-4a43c225af01\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " Apr 23 14:33:10.446924 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.446726 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt7sz\" (UniqueName: \"kubernetes.io/projected/7c6437b7-859c-4262-9bf0-4a43c225af01-kube-api-access-xt7sz\") pod \"7c6437b7-859c-4262-9bf0-4a43c225af01\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " Apr 23 14:33:10.446924 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.446786 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7c6437b7-859c-4262-9bf0-4a43c225af01-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"7c6437b7-859c-4262-9bf0-4a43c225af01\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " Apr 23 14:33:10.446924 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.446820 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c6437b7-859c-4262-9bf0-4a43c225af01-proxy-tls\") pod \"7c6437b7-859c-4262-9bf0-4a43c225af01\" (UID: \"7c6437b7-859c-4262-9bf0-4a43c225af01\") " Apr 23 14:33:10.447101 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.447057 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c6437b7-859c-4262-9bf0-4a43c225af01-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "7c6437b7-859c-4262-9bf0-4a43c225af01" (UID: "7c6437b7-859c-4262-9bf0-4a43c225af01"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:33:10.447148 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.447132 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c6437b7-859c-4262-9bf0-4a43c225af01-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "7c6437b7-859c-4262-9bf0-4a43c225af01" (UID: "7c6437b7-859c-4262-9bf0-4a43c225af01"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:33:10.449033 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.449012 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c6437b7-859c-4262-9bf0-4a43c225af01-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7c6437b7-859c-4262-9bf0-4a43c225af01" (UID: "7c6437b7-859c-4262-9bf0-4a43c225af01"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:33:10.449106 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.449025 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c6437b7-859c-4262-9bf0-4a43c225af01-kube-api-access-xt7sz" (OuterVolumeSpecName: "kube-api-access-xt7sz") pod "7c6437b7-859c-4262-9bf0-4a43c225af01" (UID: "7c6437b7-859c-4262-9bf0-4a43c225af01"). InnerVolumeSpecName "kube-api-access-xt7sz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:33:10.547529 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.547497 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c6437b7-859c-4262-9bf0-4a43c225af01-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:33:10.547529 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.547527 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xt7sz\" (UniqueName: \"kubernetes.io/projected/7c6437b7-859c-4262-9bf0-4a43c225af01-kube-api-access-xt7sz\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:33:10.547748 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.547539 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7c6437b7-859c-4262-9bf0-4a43c225af01-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:33:10.547748 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:10.547548 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c6437b7-859c-4262-9bf0-4a43c225af01-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:33:11.126302 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126267 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t"] Apr 23 14:33:11.126636 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126605 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="storage-initializer" Apr 23 14:33:11.126636 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126627 2565 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="storage-initializer" Apr 23 14:33:11.126636 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126637 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kserve-container" Apr 23 14:33:11.126805 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126644 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kserve-container" Apr 23 14:33:11.126805 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126656 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c6437b7-859c-4262-9bf0-4a43c225af01" containerName="storage-initializer" Apr 23 14:33:11.126805 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126662 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c6437b7-859c-4262-9bf0-4a43c225af01" containerName="storage-initializer" Apr 23 14:33:11.126805 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126675 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c6437b7-859c-4262-9bf0-4a43c225af01" containerName="storage-initializer" Apr 23 14:33:11.126805 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126680 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c6437b7-859c-4262-9bf0-4a43c225af01" containerName="storage-initializer" Apr 23 14:33:11.126805 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126690 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kube-rbac-proxy" Apr 23 14:33:11.126805 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126697 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kube-rbac-proxy" Apr 23 14:33:11.126805 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126751 2565 
memory_manager.go:356] "RemoveStaleState removing state" podUID="7c6437b7-859c-4262-9bf0-4a43c225af01" containerName="storage-initializer" Apr 23 14:33:11.126805 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126771 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kube-rbac-proxy" Apr 23 14:33:11.126805 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126778 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="bcc95e51-eda4-4ce7-832b-655e13495988" containerName="kserve-container" Apr 23 14:33:11.126805 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.126788 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c6437b7-859c-4262-9bf0-4a43c225af01" containerName="storage-initializer" Apr 23 14:33:11.131149 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.131132 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.134303 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.134283 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\"" Apr 23 14:33:11.134425 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.134302 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\"" Apr 23 14:33:11.134425 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.134321 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 14:33:11.140366 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.140343 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t"] Apr 23 
14:33:11.154271 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.154245 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk_7c6437b7-859c-4262-9bf0-4a43c225af01/storage-initializer/1.log" Apr 23 14:33:11.154734 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.154339 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk" event={"ID":"7c6437b7-859c-4262-9bf0-4a43c225af01","Type":"ContainerDied","Data":"047dfdc38c5388ebd7723fff1ad43377da8f46dbc69c5c9de6a3fd87306e069f"} Apr 23 14:33:11.154734 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.154374 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk" Apr 23 14:33:11.154734 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.154389 2565 scope.go:117] "RemoveContainer" containerID="71801a8fe271d1db500296f18ac55e745a84737e6abceba39b9fd1962bcb91c0" Apr 23 14:33:11.191498 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.191463 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"] Apr 23 14:33:11.196649 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.196622 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cc5fddf57-2wcnk"] Apr 23 14:33:11.253415 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.253378 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f959c725-7c03-44e1-9a9b-52f56f53fc17-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.253587 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.253483 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f959c725-7c03-44e1-9a9b-52f56f53fc17-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.253587 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.253521 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f959c725-7c03-44e1-9a9b-52f56f53fc17-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.253587 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.253546 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f959c725-7c03-44e1-9a9b-52f56f53fc17-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.253587 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.253569 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx2wv\" (UniqueName: \"kubernetes.io/projected/f959c725-7c03-44e1-9a9b-52f56f53fc17-kube-api-access-wx2wv\") pod 
\"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.354646 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.354604 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f959c725-7c03-44e1-9a9b-52f56f53fc17-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.354646 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.354651 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f959c725-7c03-44e1-9a9b-52f56f53fc17-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.354923 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:33:11.354747 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert" not found Apr 23 14:33:11.354923 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.354793 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f959c725-7c03-44e1-9a9b-52f56f53fc17-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.354923 
ip-10-0-137-187 kubenswrapper[2565]: E0423 14:33:11.354841 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f959c725-7c03-44e1-9a9b-52f56f53fc17-proxy-tls podName:f959c725-7c03-44e1-9a9b-52f56f53fc17 nodeName:}" failed. No retries permitted until 2026-04-23 14:33:11.854820679 +0000 UTC m=+3702.593364623 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f959c725-7c03-44e1-9a9b-52f56f53fc17-proxy-tls") pod "isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" (UID: "f959c725-7c03-44e1-9a9b-52f56f53fc17") : secret "isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert" not found Apr 23 14:33:11.354923 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.354889 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wx2wv\" (UniqueName: \"kubernetes.io/projected/f959c725-7c03-44e1-9a9b-52f56f53fc17-kube-api-access-wx2wv\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.355144 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.354926 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f959c725-7c03-44e1-9a9b-52f56f53fc17-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.355293 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.355269 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f959c725-7c03-44e1-9a9b-52f56f53fc17-kserve-provision-location\") pod 
\"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.355361 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.355296 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f959c725-7c03-44e1-9a9b-52f56f53fc17-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.355400 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.355357 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f959c725-7c03-44e1-9a9b-52f56f53fc17-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.364407 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.364385 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx2wv\" (UniqueName: \"kubernetes.io/projected/f959c725-7c03-44e1-9a9b-52f56f53fc17-kube-api-access-wx2wv\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.839983 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.839939 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c6437b7-859c-4262-9bf0-4a43c225af01" path="/var/lib/kubelet/pods/7c6437b7-859c-4262-9bf0-4a43c225af01/volumes" Apr 23 14:33:11.859562 
ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.859527 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f959c725-7c03-44e1-9a9b-52f56f53fc17-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:11.862133 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:11.862108 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f959c725-7c03-44e1-9a9b-52f56f53fc17-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:12.042642 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:12.042584 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:12.176842 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:12.176513 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t"] Apr 23 14:33:12.179428 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:33:12.179396 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf959c725_7c03_44e1_9a9b_52f56f53fc17.slice/crio-9ef9535e3908efb7a4da96d5db20bfd2ad5bcf9354e8f0a2a57f189837405e3d WatchSource:0}: Error finding container 9ef9535e3908efb7a4da96d5db20bfd2ad5bcf9354e8f0a2a57f189837405e3d: Status 404 returned error can't find the container with id 9ef9535e3908efb7a4da96d5db20bfd2ad5bcf9354e8f0a2a57f189837405e3d Apr 23 14:33:13.163084 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:13.163036 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" event={"ID":"f959c725-7c03-44e1-9a9b-52f56f53fc17","Type":"ContainerStarted","Data":"744c90a91090df95ba5db05f832e4c459996b912e5d176d3a27f1d9ca650341a"} Apr 23 14:33:13.163084 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:13.163089 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" event={"ID":"f959c725-7c03-44e1-9a9b-52f56f53fc17","Type":"ContainerStarted","Data":"9ef9535e3908efb7a4da96d5db20bfd2ad5bcf9354e8f0a2a57f189837405e3d"} Apr 23 14:33:14.168660 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:14.168628 2565 generic.go:358] "Generic (PLEG): container finished" podID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerID="744c90a91090df95ba5db05f832e4c459996b912e5d176d3a27f1d9ca650341a" exitCode=0 Apr 23 14:33:14.169085 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:14.168714 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" event={"ID":"f959c725-7c03-44e1-9a9b-52f56f53fc17","Type":"ContainerDied","Data":"744c90a91090df95ba5db05f832e4c459996b912e5d176d3a27f1d9ca650341a"} Apr 23 14:33:15.173834 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:15.173788 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" event={"ID":"f959c725-7c03-44e1-9a9b-52f56f53fc17","Type":"ContainerStarted","Data":"ff4e07e3f45d9d555361ed43e09799fbff7c91a06e2329ac6da97b72aa05d1ac"} Apr 23 14:33:15.173834 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:15.173837 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" event={"ID":"f959c725-7c03-44e1-9a9b-52f56f53fc17","Type":"ContainerStarted","Data":"e5c075faa965871a230df91be68983a1d254bbe4b52d9bbd57fc8c7bd3d07e2b"} Apr 23 14:33:15.174278 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:15.174013 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:15.196642 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:15.196592 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" podStartSLOduration=4.196577963 podStartE2EDuration="4.196577963s" podCreationTimestamp="2026-04-23 14:33:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:33:15.193998281 +0000 UTC m=+3705.932542248" watchObservedRunningTime="2026-04-23 14:33:15.196577963 +0000 UTC m=+3705.935121929" Apr 23 14:33:16.177575 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:16.177539 2565 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:16.178632 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:16.178596 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 23 14:33:17.180528 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:17.180479 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 23 14:33:22.184350 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:22.184321 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" Apr 23 14:33:22.184942 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:22.184914 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 23 14:33:32.185800 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:32.185729 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 23 14:33:42.185159 ip-10-0-137-187 
kubenswrapper[2565]: I0423 14:33:42.185117 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 23 14:33:52.185603 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:33:52.185559 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 23 14:34:02.184939 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:02.184898 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 23 14:34:12.185286 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:12.185241 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 23 14:34:22.185976 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:22.185900 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t"
Apr 23 14:34:31.150933 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:31.150897 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t"]
Apr 23 14:34:31.151364 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:31.151336 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kserve-container" containerID="cri-o://e5c075faa965871a230df91be68983a1d254bbe4b52d9bbd57fc8c7bd3d07e2b" gracePeriod=30
Apr 23 14:34:31.151423 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:31.151364 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kube-rbac-proxy" containerID="cri-o://ff4e07e3f45d9d555361ed43e09799fbff7c91a06e2329ac6da97b72aa05d1ac" gracePeriod=30
Apr 23 14:34:31.421847 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:31.421816 2565 generic.go:358] "Generic (PLEG): container finished" podID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerID="ff4e07e3f45d9d555361ed43e09799fbff7c91a06e2329ac6da97b72aa05d1ac" exitCode=2
Apr 23 14:34:31.422029 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:31.421887 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" event={"ID":"f959c725-7c03-44e1-9a9b-52f56f53fc17","Type":"ContainerDied","Data":"ff4e07e3f45d9d555361ed43e09799fbff7c91a06e2329ac6da97b72aa05d1ac"}
Apr 23 14:34:32.180755 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.180710 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.47:8643/healthz\": dial tcp 10.132.0.47:8643: connect: connection refused"
Apr 23 14:34:32.184866 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.184835 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 23 14:34:32.217459 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.217429 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"]
Apr 23 14:34:32.220863 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.220843 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:32.223580 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.223557 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\""
Apr 23 14:34:32.223685 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.223563 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\""
Apr 23 14:34:32.230326 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.230301 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"]
Apr 23 14:34:32.316559 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.316536 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc6bv\" (UniqueName: \"kubernetes.io/projected/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-kube-api-access-sc6bv\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:32.316694 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.316574 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:32.316694 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.316666 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:32.316800 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.316702 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:32.418010 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.417978 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:32.418010 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.418012 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:32.418226 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.418049 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sc6bv\" (UniqueName: \"kubernetes.io/projected/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-kube-api-access-sc6bv\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:32.418226 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.418132 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:32.418547 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.418525 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:32.418720 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.418696 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:32.420460 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.420439 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:32.426614 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.426577 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc6bv\" (UniqueName: \"kubernetes.io/projected/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-kube-api-access-sc6bv\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:32.531171 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.531096 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:32.859507 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:32.859437 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"]
Apr 23 14:34:32.861906 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:34:32.861869 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f85c2a9_5ec8_42b1_8d3b_bcf549539786.slice/crio-3719fab03487a0136c34de547674460fda6cb3920c053d0da10452267b147ca6 WatchSource:0}: Error finding container 3719fab03487a0136c34de547674460fda6cb3920c053d0da10452267b147ca6: Status 404 returned error can't find the container with id 3719fab03487a0136c34de547674460fda6cb3920c053d0da10452267b147ca6
Apr 23 14:34:33.429923 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:33.429892 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562" event={"ID":"9f85c2a9-5ec8-42b1-8d3b-bcf549539786","Type":"ContainerStarted","Data":"a4ba58fed5b4e750edd30551a3584c04a99c8ad0404c8b88048c63efa0b65d77"}
Apr 23 14:34:33.430286 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:33.429929 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562" event={"ID":"9f85c2a9-5ec8-42b1-8d3b-bcf549539786","Type":"ContainerStarted","Data":"3719fab03487a0136c34de547674460fda6cb3920c053d0da10452267b147ca6"}
Apr 23 14:34:35.196198 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.196172 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t"
Apr 23 14:34:35.241909 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.241881 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f959c725-7c03-44e1-9a9b-52f56f53fc17-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"f959c725-7c03-44e1-9a9b-52f56f53fc17\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") "
Apr 23 14:34:35.242084 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.241929 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f959c725-7c03-44e1-9a9b-52f56f53fc17-kserve-provision-location\") pod \"f959c725-7c03-44e1-9a9b-52f56f53fc17\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") "
Apr 23 14:34:35.242084 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.241969 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f959c725-7c03-44e1-9a9b-52f56f53fc17-proxy-tls\") pod \"f959c725-7c03-44e1-9a9b-52f56f53fc17\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") "
Apr 23 14:34:35.242084 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.241986 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f959c725-7c03-44e1-9a9b-52f56f53fc17-cabundle-cert\") pod \"f959c725-7c03-44e1-9a9b-52f56f53fc17\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") "
Apr 23 14:34:35.242084 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.242039 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx2wv\" (UniqueName: \"kubernetes.io/projected/f959c725-7c03-44e1-9a9b-52f56f53fc17-kube-api-access-wx2wv\") pod \"f959c725-7c03-44e1-9a9b-52f56f53fc17\" (UID: \"f959c725-7c03-44e1-9a9b-52f56f53fc17\") "
Apr 23 14:34:35.242359 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.242332 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f959c725-7c03-44e1-9a9b-52f56f53fc17-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "f959c725-7c03-44e1-9a9b-52f56f53fc17" (UID: "f959c725-7c03-44e1-9a9b-52f56f53fc17"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 14:34:35.242359 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.242343 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f959c725-7c03-44e1-9a9b-52f56f53fc17-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f959c725-7c03-44e1-9a9b-52f56f53fc17" (UID: "f959c725-7c03-44e1-9a9b-52f56f53fc17"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 14:34:35.242469 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.242385 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f959c725-7c03-44e1-9a9b-52f56f53fc17-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "f959c725-7c03-44e1-9a9b-52f56f53fc17" (UID: "f959c725-7c03-44e1-9a9b-52f56f53fc17"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 14:34:35.244176 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.244155 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f959c725-7c03-44e1-9a9b-52f56f53fc17-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f959c725-7c03-44e1-9a9b-52f56f53fc17" (UID: "f959c725-7c03-44e1-9a9b-52f56f53fc17"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 14:34:35.244274 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.244255 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f959c725-7c03-44e1-9a9b-52f56f53fc17-kube-api-access-wx2wv" (OuterVolumeSpecName: "kube-api-access-wx2wv") pod "f959c725-7c03-44e1-9a9b-52f56f53fc17" (UID: "f959c725-7c03-44e1-9a9b-52f56f53fc17"). InnerVolumeSpecName "kube-api-access-wx2wv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 14:34:35.342930 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.342845 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wx2wv\" (UniqueName: \"kubernetes.io/projected/f959c725-7c03-44e1-9a9b-52f56f53fc17-kube-api-access-wx2wv\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:34:35.342930 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.342878 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f959c725-7c03-44e1-9a9b-52f56f53fc17-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:34:35.342930 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.342889 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f959c725-7c03-44e1-9a9b-52f56f53fc17-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:34:35.342930 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.342899 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f959c725-7c03-44e1-9a9b-52f56f53fc17-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:34:35.342930 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.342908 2565 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f959c725-7c03-44e1-9a9b-52f56f53fc17-cabundle-cert\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:34:35.438176 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.438142 2565 generic.go:358] "Generic (PLEG): container finished" podID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerID="e5c075faa965871a230df91be68983a1d254bbe4b52d9bbd57fc8c7bd3d07e2b" exitCode=0
Apr 23 14:34:35.438353 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.438218 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t"
Apr 23 14:34:35.438353 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.438232 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" event={"ID":"f959c725-7c03-44e1-9a9b-52f56f53fc17","Type":"ContainerDied","Data":"e5c075faa965871a230df91be68983a1d254bbe4b52d9bbd57fc8c7bd3d07e2b"}
Apr 23 14:34:35.438353 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.438277 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t" event={"ID":"f959c725-7c03-44e1-9a9b-52f56f53fc17","Type":"ContainerDied","Data":"9ef9535e3908efb7a4da96d5db20bfd2ad5bcf9354e8f0a2a57f189837405e3d"}
Apr 23 14:34:35.438353 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.438299 2565 scope.go:117] "RemoveContainer" containerID="ff4e07e3f45d9d555361ed43e09799fbff7c91a06e2329ac6da97b72aa05d1ac"
Apr 23 14:34:35.446685 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.446667 2565 scope.go:117] "RemoveContainer" containerID="e5c075faa965871a230df91be68983a1d254bbe4b52d9bbd57fc8c7bd3d07e2b"
Apr 23 14:34:35.453710 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.453691 2565 scope.go:117] "RemoveContainer" containerID="744c90a91090df95ba5db05f832e4c459996b912e5d176d3a27f1d9ca650341a"
Apr 23 14:34:35.460970 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.460950 2565 scope.go:117] "RemoveContainer" containerID="ff4e07e3f45d9d555361ed43e09799fbff7c91a06e2329ac6da97b72aa05d1ac"
Apr 23 14:34:35.461220 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:34:35.461201 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4e07e3f45d9d555361ed43e09799fbff7c91a06e2329ac6da97b72aa05d1ac\": container with ID starting with ff4e07e3f45d9d555361ed43e09799fbff7c91a06e2329ac6da97b72aa05d1ac not found: ID does not exist" containerID="ff4e07e3f45d9d555361ed43e09799fbff7c91a06e2329ac6da97b72aa05d1ac"
Apr 23 14:34:35.461290 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.461229 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4e07e3f45d9d555361ed43e09799fbff7c91a06e2329ac6da97b72aa05d1ac"} err="failed to get container status \"ff4e07e3f45d9d555361ed43e09799fbff7c91a06e2329ac6da97b72aa05d1ac\": rpc error: code = NotFound desc = could not find container \"ff4e07e3f45d9d555361ed43e09799fbff7c91a06e2329ac6da97b72aa05d1ac\": container with ID starting with ff4e07e3f45d9d555361ed43e09799fbff7c91a06e2329ac6da97b72aa05d1ac not found: ID does not exist"
Apr 23 14:34:35.461290 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.461247 2565 scope.go:117] "RemoveContainer" containerID="e5c075faa965871a230df91be68983a1d254bbe4b52d9bbd57fc8c7bd3d07e2b"
Apr 23 14:34:35.461396 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.461362 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t"]
Apr 23 14:34:35.461469 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:34:35.461453 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5c075faa965871a230df91be68983a1d254bbe4b52d9bbd57fc8c7bd3d07e2b\": container with ID starting with e5c075faa965871a230df91be68983a1d254bbe4b52d9bbd57fc8c7bd3d07e2b not found: ID does not exist" containerID="e5c075faa965871a230df91be68983a1d254bbe4b52d9bbd57fc8c7bd3d07e2b"
Apr 23 14:34:35.461508 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.461475 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c075faa965871a230df91be68983a1d254bbe4b52d9bbd57fc8c7bd3d07e2b"} err="failed to get container status \"e5c075faa965871a230df91be68983a1d254bbe4b52d9bbd57fc8c7bd3d07e2b\": rpc error: code = NotFound desc = could not find container \"e5c075faa965871a230df91be68983a1d254bbe4b52d9bbd57fc8c7bd3d07e2b\": container with ID starting with e5c075faa965871a230df91be68983a1d254bbe4b52d9bbd57fc8c7bd3d07e2b not found: ID does not exist"
Apr 23 14:34:35.461508 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.461489 2565 scope.go:117] "RemoveContainer" containerID="744c90a91090df95ba5db05f832e4c459996b912e5d176d3a27f1d9ca650341a"
Apr 23 14:34:35.461703 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:34:35.461688 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"744c90a91090df95ba5db05f832e4c459996b912e5d176d3a27f1d9ca650341a\": container with ID starting with 744c90a91090df95ba5db05f832e4c459996b912e5d176d3a27f1d9ca650341a not found: ID does not exist" containerID="744c90a91090df95ba5db05f832e4c459996b912e5d176d3a27f1d9ca650341a"
Apr 23 14:34:35.461753 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.461707 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"744c90a91090df95ba5db05f832e4c459996b912e5d176d3a27f1d9ca650341a"} err="failed to get container status \"744c90a91090df95ba5db05f832e4c459996b912e5d176d3a27f1d9ca650341a\": rpc error: code = NotFound desc = could not find container \"744c90a91090df95ba5db05f832e4c459996b912e5d176d3a27f1d9ca650341a\": container with ID starting with 744c90a91090df95ba5db05f832e4c459996b912e5d176d3a27f1d9ca650341a not found: ID does not exist"
Apr 23 14:34:35.466055 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.466035 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-8498c4f6bc-9x26t"]
Apr 23 14:34:35.840447 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:35.840416 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" path="/var/lib/kubelet/pods/f959c725-7c03-44e1-9a9b-52f56f53fc17/volumes"
Apr 23 14:34:36.443452 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:36.443418 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562_9f85c2a9-5ec8-42b1-8d3b-bcf549539786/storage-initializer/0.log"
Apr 23 14:34:36.443943 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:36.443470 2565 generic.go:358] "Generic (PLEG): container finished" podID="9f85c2a9-5ec8-42b1-8d3b-bcf549539786" containerID="a4ba58fed5b4e750edd30551a3584c04a99c8ad0404c8b88048c63efa0b65d77" exitCode=1
Apr 23 14:34:36.443943 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:36.443507 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562" event={"ID":"9f85c2a9-5ec8-42b1-8d3b-bcf549539786","Type":"ContainerDied","Data":"a4ba58fed5b4e750edd30551a3584c04a99c8ad0404c8b88048c63efa0b65d77"}
Apr 23 14:34:37.448165 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:37.448137 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562_9f85c2a9-5ec8-42b1-8d3b-bcf549539786/storage-initializer/0.log"
Apr 23 14:34:37.448583 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:37.448183 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562" event={"ID":"9f85c2a9-5ec8-42b1-8d3b-bcf549539786","Type":"ContainerStarted","Data":"ba92f22ecd8ae471a95d671b4310ab0ee73c3d6a584471c37fd169c2859247f2"}
Apr 23 14:34:41.462422 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:41.462396 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562_9f85c2a9-5ec8-42b1-8d3b-bcf549539786/storage-initializer/1.log"
Apr 23 14:34:41.462847 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:41.462732 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562_9f85c2a9-5ec8-42b1-8d3b-bcf549539786/storage-initializer/0.log"
Apr 23 14:34:41.462847 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:41.462787 2565 generic.go:358] "Generic (PLEG): container finished" podID="9f85c2a9-5ec8-42b1-8d3b-bcf549539786" containerID="ba92f22ecd8ae471a95d671b4310ab0ee73c3d6a584471c37fd169c2859247f2" exitCode=1
Apr 23 14:34:41.462943 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:41.462855 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562" event={"ID":"9f85c2a9-5ec8-42b1-8d3b-bcf549539786","Type":"ContainerDied","Data":"ba92f22ecd8ae471a95d671b4310ab0ee73c3d6a584471c37fd169c2859247f2"}
Apr 23 14:34:41.462943 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:41.462897 2565 scope.go:117] "RemoveContainer" containerID="a4ba58fed5b4e750edd30551a3584c04a99c8ad0404c8b88048c63efa0b65d77"
Apr 23 14:34:41.463255 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:41.463238 2565 scope.go:117] "RemoveContainer" containerID="a4ba58fed5b4e750edd30551a3584c04a99c8ad0404c8b88048c63efa0b65d77"
Apr 23 14:34:41.473359 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:34:41.473330 2565 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562_kserve-ci-e2e-test_9f85c2a9-5ec8-42b1-8d3b-bcf549539786_0 in pod sandbox 3719fab03487a0136c34de547674460fda6cb3920c053d0da10452267b147ca6 from index: no such id: 'a4ba58fed5b4e750edd30551a3584c04a99c8ad0404c8b88048c63efa0b65d77'" containerID="a4ba58fed5b4e750edd30551a3584c04a99c8ad0404c8b88048c63efa0b65d77"
Apr 23 14:34:41.473470 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:34:41.473377 2565 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562_kserve-ci-e2e-test_9f85c2a9-5ec8-42b1-8d3b-bcf549539786_0 in pod sandbox 3719fab03487a0136c34de547674460fda6cb3920c053d0da10452267b147ca6 from index: no such id: 'a4ba58fed5b4e750edd30551a3584c04a99c8ad0404c8b88048c63efa0b65d77'; Skipping pod \"isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562_kserve-ci-e2e-test(9f85c2a9-5ec8-42b1-8d3b-bcf549539786)\"" logger="UnhandledError"
Apr 23 14:34:41.474683 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:34:41.474662 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562_kserve-ci-e2e-test(9f85c2a9-5ec8-42b1-8d3b-bcf549539786)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562" podUID="9f85c2a9-5ec8-42b1-8d3b-bcf549539786"
Apr 23 14:34:42.218993 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.218963 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"]
Apr 23 14:34:42.467420 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.467392 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562_9f85c2a9-5ec8-42b1-8d3b-bcf549539786/storage-initializer/1.log"
Apr 23 14:34:42.584188 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.584165 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562_9f85c2a9-5ec8-42b1-8d3b-bcf549539786/storage-initializer/1.log"
Apr 23 14:34:42.584283 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.584225 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"
Apr 23 14:34:42.703297 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.703271 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") "
Apr 23 14:34:42.703431 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.703321 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-proxy-tls\") pod \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") "
Apr 23 14:34:42.703431 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.703349 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc6bv\" (UniqueName: \"kubernetes.io/projected/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-kube-api-access-sc6bv\") pod \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") "
Apr 23 14:34:42.703546 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.703517 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-kserve-provision-location\") pod \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\" (UID: \"9f85c2a9-5ec8-42b1-8d3b-bcf549539786\") "
Apr 23 14:34:42.703649 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.703626 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "9f85c2a9-5ec8-42b1-8d3b-bcf549539786" (UID: "9f85c2a9-5ec8-42b1-8d3b-bcf549539786"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 14:34:42.703796 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.703750 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9f85c2a9-5ec8-42b1-8d3b-bcf549539786" (UID: "9f85c2a9-5ec8-42b1-8d3b-bcf549539786"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 14:34:42.703923 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.703864 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:34:42.703923 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.703885 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:34:42.705354 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.705335 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9f85c2a9-5ec8-42b1-8d3b-bcf549539786" (UID: "9f85c2a9-5ec8-42b1-8d3b-bcf549539786"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 14:34:42.705423 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.705400 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-kube-api-access-sc6bv" (OuterVolumeSpecName: "kube-api-access-sc6bv") pod "9f85c2a9-5ec8-42b1-8d3b-bcf549539786" (UID: "9f85c2a9-5ec8-42b1-8d3b-bcf549539786"). InnerVolumeSpecName "kube-api-access-sc6bv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 14:34:42.804666 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.804598 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:34:42.804666 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:42.804630 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sc6bv\" (UniqueName: \"kubernetes.io/projected/9f85c2a9-5ec8-42b1-8d3b-bcf549539786-kube-api-access-sc6bv\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\""
Apr 23 14:34:43.285558 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.285524 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9"]
Apr 23 14:34:43.285902 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.285878 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="storage-initializer"
Apr 23 14:34:43.285902 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.285894 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="storage-initializer"
Apr 23 14:34:43.285902 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.285903 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kube-rbac-proxy"
Apr 23 14:34:43.285902 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.285910 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kube-rbac-proxy"
Apr 23 14:34:43.286112 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.285926 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f85c2a9-5ec8-42b1-8d3b-bcf549539786" containerName="storage-initializer"
Apr 23 14:34:43.286112 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.285932 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f85c2a9-5ec8-42b1-8d3b-bcf549539786" containerName="storage-initializer"
Apr 23 14:34:43.286112 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.285940 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f85c2a9-5ec8-42b1-8d3b-bcf549539786" containerName="storage-initializer"
Apr 23 14:34:43.286112 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.285945 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f85c2a9-5ec8-42b1-8d3b-bcf549539786" containerName="storage-initializer"
Apr 23 14:34:43.286112 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.285957 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kserve-container"
Apr 23 14:34:43.286112 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.285962 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kserve-container"
Apr 23 14:34:43.286112 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.286005 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f85c2a9-5ec8-42b1-8d3b-bcf549539786" containerName="storage-initializer"
Apr 23 14:34:43.286112 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.286015 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kserve-container"
Apr 23 14:34:43.286112 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.286022 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="f959c725-7c03-44e1-9a9b-52f56f53fc17" containerName="kube-rbac-proxy"
Apr 23 14:34:43.286112 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.286117 2565 memory_manager.go:356]
"RemoveStaleState removing state" podUID="9f85c2a9-5ec8-42b1-8d3b-bcf549539786" containerName="storage-initializer" Apr 23 14:34:43.288981 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.288965 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.291882 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.291857 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\"" Apr 23 14:34:43.291972 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.291916 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\"" Apr 23 14:34:43.291972 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.291922 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 14:34:43.298399 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.298373 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9"] Apr 23 14:34:43.410098 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.410071 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6e094469-580c-40ae-a3cc-ebef26f79197-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.410241 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.410103 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6n2g\" 
(UniqueName: \"kubernetes.io/projected/6e094469-580c-40ae-a3cc-ebef26f79197-kube-api-access-d6n2g\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.410241 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.410138 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e094469-580c-40ae-a3cc-ebef26f79197-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.410321 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.410251 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e094469-580c-40ae-a3cc-ebef26f79197-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.410321 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.410282 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e094469-580c-40ae-a3cc-ebef26f79197-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.472188 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.472162 2565 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562_9f85c2a9-5ec8-42b1-8d3b-bcf549539786/storage-initializer/1.log" Apr 23 14:34:43.472517 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.472296 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562" event={"ID":"9f85c2a9-5ec8-42b1-8d3b-bcf549539786","Type":"ContainerDied","Data":"3719fab03487a0136c34de547674460fda6cb3920c053d0da10452267b147ca6"} Apr 23 14:34:43.472517 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.472310 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562" Apr 23 14:34:43.472517 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.472339 2565 scope.go:117] "RemoveContainer" containerID="ba92f22ecd8ae471a95d671b4310ab0ee73c3d6a584471c37fd169c2859247f2" Apr 23 14:34:43.510179 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.510153 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"] Apr 23 14:34:43.510919 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.510902 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e094469-580c-40ae-a3cc-ebef26f79197-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.511043 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.510930 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/6e094469-580c-40ae-a3cc-ebef26f79197-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.511043 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.510960 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6e094469-580c-40ae-a3cc-ebef26f79197-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.511043 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.510980 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6n2g\" (UniqueName: \"kubernetes.io/projected/6e094469-580c-40ae-a3cc-ebef26f79197-kube-api-access-d6n2g\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.511043 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.511015 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e094469-580c-40ae-a3cc-ebef26f79197-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.511629 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.511607 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e094469-580c-40ae-a3cc-ebef26f79197-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.511676 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.511662 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e094469-580c-40ae-a3cc-ebef26f79197-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.511718 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.511688 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6e094469-580c-40ae-a3cc-ebef26f79197-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.513524 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.513507 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e094469-580c-40ae-a3cc-ebef26f79197-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.518087 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.518064 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-599d4bdb65-jg562"] Apr 23 14:34:43.520609 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.520593 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6n2g\" (UniqueName: \"kubernetes.io/projected/6e094469-580c-40ae-a3cc-ebef26f79197-kube-api-access-d6n2g\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.599447 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.599394 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:43.717013 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.716966 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9"] Apr 23 14:34:43.719754 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:34:43.719730 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e094469_580c_40ae_a3cc_ebef26f79197.slice/crio-625c9416133db46d4977ff5bd444da48ca2ac1c1d47ecb48313c13a7007dec89 WatchSource:0}: Error finding container 625c9416133db46d4977ff5bd444da48ca2ac1c1d47ecb48313c13a7007dec89: Status 404 returned error can't find the container with id 625c9416133db46d4977ff5bd444da48ca2ac1c1d47ecb48313c13a7007dec89 Apr 23 14:34:43.840468 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:43.840434 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f85c2a9-5ec8-42b1-8d3b-bcf549539786" path="/var/lib/kubelet/pods/9f85c2a9-5ec8-42b1-8d3b-bcf549539786/volumes" Apr 23 14:34:44.476906 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:44.476869 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" event={"ID":"6e094469-580c-40ae-a3cc-ebef26f79197","Type":"ContainerStarted","Data":"f25dcc86399d13f3e4c37016415241d89f77143ccbe59c74f3a92f2dd8993fcf"} Apr 23 14:34:44.476906 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:44.476911 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" event={"ID":"6e094469-580c-40ae-a3cc-ebef26f79197","Type":"ContainerStarted","Data":"625c9416133db46d4977ff5bd444da48ca2ac1c1d47ecb48313c13a7007dec89"} Apr 23 14:34:45.482204 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:45.482167 2565 generic.go:358] "Generic (PLEG): container finished" podID="6e094469-580c-40ae-a3cc-ebef26f79197" containerID="f25dcc86399d13f3e4c37016415241d89f77143ccbe59c74f3a92f2dd8993fcf" exitCode=0 Apr 23 14:34:45.482580 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:45.482247 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" event={"ID":"6e094469-580c-40ae-a3cc-ebef26f79197","Type":"ContainerDied","Data":"f25dcc86399d13f3e4c37016415241d89f77143ccbe59c74f3a92f2dd8993fcf"} Apr 23 14:34:46.487458 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:46.487425 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" event={"ID":"6e094469-580c-40ae-a3cc-ebef26f79197","Type":"ContainerStarted","Data":"f6b2325b8bf3b5de1a4e07e155a9eb4307750d0a9f1d852996fdbe8fe9024969"} Apr 23 14:34:46.487458 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:46.487460 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" event={"ID":"6e094469-580c-40ae-a3cc-ebef26f79197","Type":"ContainerStarted","Data":"d88365a4b4d9d550157497f30142efa2c0cb13eeb58a8dc659c2f159e8c0f77b"} 
Apr 23 14:34:46.487916 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:46.487534 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:46.512567 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:46.512527 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" podStartSLOduration=3.5125156840000002 podStartE2EDuration="3.512515684s" podCreationTimestamp="2026-04-23 14:34:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:34:46.51067704 +0000 UTC m=+3797.249221006" watchObservedRunningTime="2026-04-23 14:34:46.512515684 +0000 UTC m=+3797.251059650" Apr 23 14:34:47.490475 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:47.490446 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:47.491918 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:47.491892 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 23 14:34:48.493426 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:48.493392 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 23 14:34:53.497444 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:53.497414 2565 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:34:53.497974 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:34:53.497947 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 23 14:35:03.498241 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:35:03.498193 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 23 14:35:13.498805 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:35:13.498739 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 23 14:35:23.498504 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:35:23.498462 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 23 14:35:33.498648 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:35:33.498610 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 23 14:35:43.498417 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:35:43.498375 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 23 14:35:53.498955 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:35:53.498879 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:36:03.370364 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:03.370329 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9"] Apr 23 14:36:03.370856 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:03.370825 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kserve-container" containerID="cri-o://d88365a4b4d9d550157497f30142efa2c0cb13eeb58a8dc659c2f159e8c0f77b" gracePeriod=30 Apr 23 14:36:03.370922 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:03.370838 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kube-rbac-proxy" containerID="cri-o://f6b2325b8bf3b5de1a4e07e155a9eb4307750d0a9f1d852996fdbe8fe9024969" gracePeriod=30 Apr 23 14:36:03.493924 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:03.493877 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.49:8643/healthz\": dial tcp 10.132.0.49:8643: connect: connection refused" Apr 23 14:36:03.498160 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:03.498137 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 23 14:36:03.738780 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:03.738729 2565 generic.go:358] "Generic (PLEG): container finished" podID="6e094469-580c-40ae-a3cc-ebef26f79197" containerID="f6b2325b8bf3b5de1a4e07e155a9eb4307750d0a9f1d852996fdbe8fe9024969" exitCode=2 Apr 23 14:36:03.738959 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:03.738801 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" event={"ID":"6e094469-580c-40ae-a3cc-ebef26f79197","Type":"ContainerDied","Data":"f6b2325b8bf3b5de1a4e07e155a9eb4307750d0a9f1d852996fdbe8fe9024969"} Apr 23 14:36:04.401936 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.401902 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5"] Apr 23 14:36:04.405288 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.405272 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:04.407805 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.407783 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\"" Apr 23 14:36:04.407915 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.407837 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\"" Apr 23 14:36:04.416533 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.416510 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5"] Apr 23 14:36:04.473417 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.473393 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/003b6261-32e1-4f12-b85d-22a5343f66d7-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:04.473526 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.473482 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx8bn\" (UniqueName: \"kubernetes.io/projected/003b6261-32e1-4f12-b85d-22a5343f66d7-kube-api-access-tx8bn\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:04.473526 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.473507 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/003b6261-32e1-4f12-b85d-22a5343f66d7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:04.473615 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.473540 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/003b6261-32e1-4f12-b85d-22a5343f66d7-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:04.574311 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.574278 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/003b6261-32e1-4f12-b85d-22a5343f66d7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:04.574469 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.574330 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/003b6261-32e1-4f12-b85d-22a5343f66d7-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:04.574469 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.574359 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/003b6261-32e1-4f12-b85d-22a5343f66d7-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:04.574469 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.574433 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx8bn\" (UniqueName: \"kubernetes.io/projected/003b6261-32e1-4f12-b85d-22a5343f66d7-kube-api-access-tx8bn\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:04.574652 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:36:04.574591 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found Apr 23 14:36:04.574702 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:36:04.574658 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/003b6261-32e1-4f12-b85d-22a5343f66d7-proxy-tls podName:003b6261-32e1-4f12-b85d-22a5343f66d7 nodeName:}" failed. No retries permitted until 2026-04-23 14:36:05.074639026 +0000 UTC m=+3875.813182974 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/003b6261-32e1-4f12-b85d-22a5343f66d7-proxy-tls") pod "isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" (UID: "003b6261-32e1-4f12-b85d-22a5343f66d7") : secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found Apr 23 14:36:04.574702 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.574690 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/003b6261-32e1-4f12-b85d-22a5343f66d7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:04.575031 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.575010 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/003b6261-32e1-4f12-b85d-22a5343f66d7-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:04.591188 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:04.591160 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx8bn\" (UniqueName: \"kubernetes.io/projected/003b6261-32e1-4f12-b85d-22a5343f66d7-kube-api-access-tx8bn\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:05.078305 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:05.078273 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/003b6261-32e1-4f12-b85d-22a5343f66d7-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:05.080661 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:05.080636 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/003b6261-32e1-4f12-b85d-22a5343f66d7-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:05.316208 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:05.316178 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:05.441235 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:05.441203 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5"] Apr 23 14:36:05.444371 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:36:05.444345 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod003b6261_32e1_4f12_b85d_22a5343f66d7.slice/crio-cdcd93a2d0ad819180a010c2e0769abf2cf0edb7b98dfc9cc18c0d1dbb342b62 WatchSource:0}: Error finding container cdcd93a2d0ad819180a010c2e0769abf2cf0edb7b98dfc9cc18c0d1dbb342b62: Status 404 returned error can't find the container with id cdcd93a2d0ad819180a010c2e0769abf2cf0edb7b98dfc9cc18c0d1dbb342b62 Apr 23 14:36:05.446712 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:05.446690 2565 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 23 14:36:05.747167 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:05.747127 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" event={"ID":"003b6261-32e1-4f12-b85d-22a5343f66d7","Type":"ContainerStarted","Data":"b88e175c0d5f6f820644b71bce46ab6365fc1bc2acceca2b033242f370185137"} Apr 23 14:36:05.747167 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:05.747165 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" event={"ID":"003b6261-32e1-4f12-b85d-22a5343f66d7","Type":"ContainerStarted","Data":"cdcd93a2d0ad819180a010c2e0769abf2cf0edb7b98dfc9cc18c0d1dbb342b62"} Apr 23 14:36:07.501385 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.501361 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:36:07.599412 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.599334 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e094469-580c-40ae-a3cc-ebef26f79197-kserve-provision-location\") pod \"6e094469-580c-40ae-a3cc-ebef26f79197\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " Apr 23 14:36:07.599412 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.599369 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e094469-580c-40ae-a3cc-ebef26f79197-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"6e094469-580c-40ae-a3cc-ebef26f79197\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " Apr 23 14:36:07.599412 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.599412 2565 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6n2g\" (UniqueName: \"kubernetes.io/projected/6e094469-580c-40ae-a3cc-ebef26f79197-kube-api-access-d6n2g\") pod \"6e094469-580c-40ae-a3cc-ebef26f79197\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " Apr 23 14:36:07.599683 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.599441 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6e094469-580c-40ae-a3cc-ebef26f79197-cabundle-cert\") pod \"6e094469-580c-40ae-a3cc-ebef26f79197\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " Apr 23 14:36:07.599683 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.599456 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e094469-580c-40ae-a3cc-ebef26f79197-proxy-tls\") pod \"6e094469-580c-40ae-a3cc-ebef26f79197\" (UID: \"6e094469-580c-40ae-a3cc-ebef26f79197\") " Apr 23 14:36:07.599817 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.599728 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e094469-580c-40ae-a3cc-ebef26f79197-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6e094469-580c-40ae-a3cc-ebef26f79197" (UID: "6e094469-580c-40ae-a3cc-ebef26f79197"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:36:07.599857 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.599828 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e094469-580c-40ae-a3cc-ebef26f79197-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "6e094469-580c-40ae-a3cc-ebef26f79197" (UID: "6e094469-580c-40ae-a3cc-ebef26f79197"). 
InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:36:07.599932 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.599866 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e094469-580c-40ae-a3cc-ebef26f79197-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "6e094469-580c-40ae-a3cc-ebef26f79197" (UID: "6e094469-580c-40ae-a3cc-ebef26f79197"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:36:07.601599 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.601577 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e094469-580c-40ae-a3cc-ebef26f79197-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6e094469-580c-40ae-a3cc-ebef26f79197" (UID: "6e094469-580c-40ae-a3cc-ebef26f79197"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:36:07.601722 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.601596 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e094469-580c-40ae-a3cc-ebef26f79197-kube-api-access-d6n2g" (OuterVolumeSpecName: "kube-api-access-d6n2g") pod "6e094469-580c-40ae-a3cc-ebef26f79197" (UID: "6e094469-580c-40ae-a3cc-ebef26f79197"). InnerVolumeSpecName "kube-api-access-d6n2g". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:36:07.700648 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.700618 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e094469-580c-40ae-a3cc-ebef26f79197-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:36:07.700648 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.700643 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e094469-580c-40ae-a3cc-ebef26f79197-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:36:07.700648 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.700654 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d6n2g\" (UniqueName: \"kubernetes.io/projected/6e094469-580c-40ae-a3cc-ebef26f79197-kube-api-access-d6n2g\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:36:07.700888 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.700664 2565 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6e094469-580c-40ae-a3cc-ebef26f79197-cabundle-cert\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:36:07.700888 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.700674 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e094469-580c-40ae-a3cc-ebef26f79197-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:36:07.755656 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.755627 2565 generic.go:358] "Generic (PLEG): container finished" podID="6e094469-580c-40ae-a3cc-ebef26f79197" containerID="d88365a4b4d9d550157497f30142efa2c0cb13eeb58a8dc659c2f159e8c0f77b" 
exitCode=0 Apr 23 14:36:07.755796 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.755661 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" event={"ID":"6e094469-580c-40ae-a3cc-ebef26f79197","Type":"ContainerDied","Data":"d88365a4b4d9d550157497f30142efa2c0cb13eeb58a8dc659c2f159e8c0f77b"} Apr 23 14:36:07.755796 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.755686 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" event={"ID":"6e094469-580c-40ae-a3cc-ebef26f79197","Type":"ContainerDied","Data":"625c9416133db46d4977ff5bd444da48ca2ac1c1d47ecb48313c13a7007dec89"} Apr 23 14:36:07.755796 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.755702 2565 scope.go:117] "RemoveContainer" containerID="f6b2325b8bf3b5de1a4e07e155a9eb4307750d0a9f1d852996fdbe8fe9024969" Apr 23 14:36:07.755796 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.755709 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9" Apr 23 14:36:07.763839 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.763817 2565 scope.go:117] "RemoveContainer" containerID="d88365a4b4d9d550157497f30142efa2c0cb13eeb58a8dc659c2f159e8c0f77b" Apr 23 14:36:07.770716 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.770697 2565 scope.go:117] "RemoveContainer" containerID="f25dcc86399d13f3e4c37016415241d89f77143ccbe59c74f3a92f2dd8993fcf" Apr 23 14:36:07.778587 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.778033 2565 scope.go:117] "RemoveContainer" containerID="f6b2325b8bf3b5de1a4e07e155a9eb4307750d0a9f1d852996fdbe8fe9024969" Apr 23 14:36:07.779412 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.779391 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9"] Apr 23 14:36:07.781029 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:36:07.780899 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b2325b8bf3b5de1a4e07e155a9eb4307750d0a9f1d852996fdbe8fe9024969\": container with ID starting with f6b2325b8bf3b5de1a4e07e155a9eb4307750d0a9f1d852996fdbe8fe9024969 not found: ID does not exist" containerID="f6b2325b8bf3b5de1a4e07e155a9eb4307750d0a9f1d852996fdbe8fe9024969" Apr 23 14:36:07.781029 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.780935 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b2325b8bf3b5de1a4e07e155a9eb4307750d0a9f1d852996fdbe8fe9024969"} err="failed to get container status \"f6b2325b8bf3b5de1a4e07e155a9eb4307750d0a9f1d852996fdbe8fe9024969\": rpc error: code = NotFound desc = could not find container \"f6b2325b8bf3b5de1a4e07e155a9eb4307750d0a9f1d852996fdbe8fe9024969\": container with ID starting with f6b2325b8bf3b5de1a4e07e155a9eb4307750d0a9f1d852996fdbe8fe9024969 not 
found: ID does not exist" Apr 23 14:36:07.781029 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.780959 2565 scope.go:117] "RemoveContainer" containerID="d88365a4b4d9d550157497f30142efa2c0cb13eeb58a8dc659c2f159e8c0f77b" Apr 23 14:36:07.781441 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:36:07.781371 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88365a4b4d9d550157497f30142efa2c0cb13eeb58a8dc659c2f159e8c0f77b\": container with ID starting with d88365a4b4d9d550157497f30142efa2c0cb13eeb58a8dc659c2f159e8c0f77b not found: ID does not exist" containerID="d88365a4b4d9d550157497f30142efa2c0cb13eeb58a8dc659c2f159e8c0f77b" Apr 23 14:36:07.781441 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.781403 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88365a4b4d9d550157497f30142efa2c0cb13eeb58a8dc659c2f159e8c0f77b"} err="failed to get container status \"d88365a4b4d9d550157497f30142efa2c0cb13eeb58a8dc659c2f159e8c0f77b\": rpc error: code = NotFound desc = could not find container \"d88365a4b4d9d550157497f30142efa2c0cb13eeb58a8dc659c2f159e8c0f77b\": container with ID starting with d88365a4b4d9d550157497f30142efa2c0cb13eeb58a8dc659c2f159e8c0f77b not found: ID does not exist" Apr 23 14:36:07.781441 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.781426 2565 scope.go:117] "RemoveContainer" containerID="f25dcc86399d13f3e4c37016415241d89f77143ccbe59c74f3a92f2dd8993fcf" Apr 23 14:36:07.781680 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:36:07.781656 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f25dcc86399d13f3e4c37016415241d89f77143ccbe59c74f3a92f2dd8993fcf\": container with ID starting with f25dcc86399d13f3e4c37016415241d89f77143ccbe59c74f3a92f2dd8993fcf not found: ID does not exist" 
containerID="f25dcc86399d13f3e4c37016415241d89f77143ccbe59c74f3a92f2dd8993fcf" Apr 23 14:36:07.781747 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.781686 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f25dcc86399d13f3e4c37016415241d89f77143ccbe59c74f3a92f2dd8993fcf"} err="failed to get container status \"f25dcc86399d13f3e4c37016415241d89f77143ccbe59c74f3a92f2dd8993fcf\": rpc error: code = NotFound desc = could not find container \"f25dcc86399d13f3e4c37016415241d89f77143ccbe59c74f3a92f2dd8993fcf\": container with ID starting with f25dcc86399d13f3e4c37016415241d89f77143ccbe59c74f3a92f2dd8993fcf not found: ID does not exist" Apr 23 14:36:07.783938 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.783750 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d9cdb6474-675g9"] Apr 23 14:36:07.840241 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:07.840211 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" path="/var/lib/kubelet/pods/6e094469-580c-40ae-a3cc-ebef26f79197/volumes" Apr 23 14:36:10.769410 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:10.769381 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5_003b6261-32e1-4f12-b85d-22a5343f66d7/storage-initializer/0.log" Apr 23 14:36:10.769853 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:10.769420 2565 generic.go:358] "Generic (PLEG): container finished" podID="003b6261-32e1-4f12-b85d-22a5343f66d7" containerID="b88e175c0d5f6f820644b71bce46ab6365fc1bc2acceca2b033242f370185137" exitCode=1 Apr 23 14:36:10.769853 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:10.769451 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" 
event={"ID":"003b6261-32e1-4f12-b85d-22a5343f66d7","Type":"ContainerDied","Data":"b88e175c0d5f6f820644b71bce46ab6365fc1bc2acceca2b033242f370185137"} Apr 23 14:36:11.774270 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:11.774242 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5_003b6261-32e1-4f12-b85d-22a5343f66d7/storage-initializer/0.log" Apr 23 14:36:11.774683 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:11.774322 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" event={"ID":"003b6261-32e1-4f12-b85d-22a5343f66d7","Type":"ContainerStarted","Data":"1aea140c7dfcb0439ec0d4b484e16ed5a096ceb62f2d6d21af90206a42f0592b"} Apr 23 14:36:14.415655 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:14.415601 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5"] Apr 23 14:36:14.416212 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:14.415905 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" podUID="003b6261-32e1-4f12-b85d-22a5343f66d7" containerName="storage-initializer" containerID="cri-o://1aea140c7dfcb0439ec0d4b484e16ed5a096ceb62f2d6d21af90206a42f0592b" gracePeriod=30 Apr 23 14:36:16.463947 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.463924 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5_003b6261-32e1-4f12-b85d-22a5343f66d7/storage-initializer/1.log" Apr 23 14:36:16.464302 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.464287 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5_003b6261-32e1-4f12-b85d-22a5343f66d7/storage-initializer/0.log" Apr 23 14:36:16.464360 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.464350 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:16.570309 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.570228 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx8bn\" (UniqueName: \"kubernetes.io/projected/003b6261-32e1-4f12-b85d-22a5343f66d7-kube-api-access-tx8bn\") pod \"003b6261-32e1-4f12-b85d-22a5343f66d7\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " Apr 23 14:36:16.570309 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.570279 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/003b6261-32e1-4f12-b85d-22a5343f66d7-proxy-tls\") pod \"003b6261-32e1-4f12-b85d-22a5343f66d7\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " Apr 23 14:36:16.570547 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.570314 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/003b6261-32e1-4f12-b85d-22a5343f66d7-kserve-provision-location\") pod \"003b6261-32e1-4f12-b85d-22a5343f66d7\" (UID: \"003b6261-32e1-4f12-b85d-22a5343f66d7\") " Apr 23 14:36:16.570547 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.570345 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/003b6261-32e1-4f12-b85d-22a5343f66d7-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"003b6261-32e1-4f12-b85d-22a5343f66d7\" (UID: 
\"003b6261-32e1-4f12-b85d-22a5343f66d7\") " Apr 23 14:36:16.570665 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.570603 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/003b6261-32e1-4f12-b85d-22a5343f66d7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "003b6261-32e1-4f12-b85d-22a5343f66d7" (UID: "003b6261-32e1-4f12-b85d-22a5343f66d7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:36:16.570751 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.570731 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/003b6261-32e1-4f12-b85d-22a5343f66d7-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "003b6261-32e1-4f12-b85d-22a5343f66d7" (UID: "003b6261-32e1-4f12-b85d-22a5343f66d7"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:36:16.572348 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.572327 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003b6261-32e1-4f12-b85d-22a5343f66d7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "003b6261-32e1-4f12-b85d-22a5343f66d7" (UID: "003b6261-32e1-4f12-b85d-22a5343f66d7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:36:16.572414 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.572392 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/003b6261-32e1-4f12-b85d-22a5343f66d7-kube-api-access-tx8bn" (OuterVolumeSpecName: "kube-api-access-tx8bn") pod "003b6261-32e1-4f12-b85d-22a5343f66d7" (UID: "003b6261-32e1-4f12-b85d-22a5343f66d7"). InnerVolumeSpecName "kube-api-access-tx8bn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:36:16.671823 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.671793 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tx8bn\" (UniqueName: \"kubernetes.io/projected/003b6261-32e1-4f12-b85d-22a5343f66d7-kube-api-access-tx8bn\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:36:16.671823 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.671819 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/003b6261-32e1-4f12-b85d-22a5343f66d7-proxy-tls\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:36:16.671823 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.671829 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/003b6261-32e1-4f12-b85d-22a5343f66d7-kserve-provision-location\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:36:16.672010 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.671839 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/003b6261-32e1-4f12-b85d-22a5343f66d7-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-187.ec2.internal\" DevicePath \"\"" Apr 23 14:36:16.795581 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.795551 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5_003b6261-32e1-4f12-b85d-22a5343f66d7/storage-initializer/1.log" Apr 23 14:36:16.795915 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.795900 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5_003b6261-32e1-4f12-b85d-22a5343f66d7/storage-initializer/0.log" Apr 
23 14:36:16.796016 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.795934 2565 generic.go:358] "Generic (PLEG): container finished" podID="003b6261-32e1-4f12-b85d-22a5343f66d7" containerID="1aea140c7dfcb0439ec0d4b484e16ed5a096ceb62f2d6d21af90206a42f0592b" exitCode=1 Apr 23 14:36:16.796016 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.795968 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" event={"ID":"003b6261-32e1-4f12-b85d-22a5343f66d7","Type":"ContainerDied","Data":"1aea140c7dfcb0439ec0d4b484e16ed5a096ceb62f2d6d21af90206a42f0592b"} Apr 23 14:36:16.796129 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.796023 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" event={"ID":"003b6261-32e1-4f12-b85d-22a5343f66d7","Type":"ContainerDied","Data":"cdcd93a2d0ad819180a010c2e0769abf2cf0edb7b98dfc9cc18c0d1dbb342b62"} Apr 23 14:36:16.796129 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.796031 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5" Apr 23 14:36:16.796129 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.796044 2565 scope.go:117] "RemoveContainer" containerID="1aea140c7dfcb0439ec0d4b484e16ed5a096ceb62f2d6d21af90206a42f0592b" Apr 23 14:36:16.804083 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.804062 2565 scope.go:117] "RemoveContainer" containerID="b88e175c0d5f6f820644b71bce46ab6365fc1bc2acceca2b033242f370185137" Apr 23 14:36:16.810958 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.810941 2565 scope.go:117] "RemoveContainer" containerID="1aea140c7dfcb0439ec0d4b484e16ed5a096ceb62f2d6d21af90206a42f0592b" Apr 23 14:36:16.811195 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:36:16.811174 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aea140c7dfcb0439ec0d4b484e16ed5a096ceb62f2d6d21af90206a42f0592b\": container with ID starting with 1aea140c7dfcb0439ec0d4b484e16ed5a096ceb62f2d6d21af90206a42f0592b not found: ID does not exist" containerID="1aea140c7dfcb0439ec0d4b484e16ed5a096ceb62f2d6d21af90206a42f0592b" Apr 23 14:36:16.811263 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.811202 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aea140c7dfcb0439ec0d4b484e16ed5a096ceb62f2d6d21af90206a42f0592b"} err="failed to get container status \"1aea140c7dfcb0439ec0d4b484e16ed5a096ceb62f2d6d21af90206a42f0592b\": rpc error: code = NotFound desc = could not find container \"1aea140c7dfcb0439ec0d4b484e16ed5a096ceb62f2d6d21af90206a42f0592b\": container with ID starting with 1aea140c7dfcb0439ec0d4b484e16ed5a096ceb62f2d6d21af90206a42f0592b not found: ID does not exist" Apr 23 14:36:16.811263 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.811218 2565 scope.go:117] "RemoveContainer" containerID="b88e175c0d5f6f820644b71bce46ab6365fc1bc2acceca2b033242f370185137" 
Apr 23 14:36:16.811451 ip-10-0-137-187 kubenswrapper[2565]: E0423 14:36:16.811435 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b88e175c0d5f6f820644b71bce46ab6365fc1bc2acceca2b033242f370185137\": container with ID starting with b88e175c0d5f6f820644b71bce46ab6365fc1bc2acceca2b033242f370185137 not found: ID does not exist" containerID="b88e175c0d5f6f820644b71bce46ab6365fc1bc2acceca2b033242f370185137" Apr 23 14:36:16.811498 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.811454 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b88e175c0d5f6f820644b71bce46ab6365fc1bc2acceca2b033242f370185137"} err="failed to get container status \"b88e175c0d5f6f820644b71bce46ab6365fc1bc2acceca2b033242f370185137\": rpc error: code = NotFound desc = could not find container \"b88e175c0d5f6f820644b71bce46ab6365fc1bc2acceca2b033242f370185137\": container with ID starting with b88e175c0d5f6f820644b71bce46ab6365fc1bc2acceca2b033242f370185137 not found: ID does not exist" Apr 23 14:36:16.837648 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.837588 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5"] Apr 23 14:36:16.841661 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:16.841640 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-779d8c7cfb-4qdl5"] Apr 23 14:36:17.839501 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:17.839466 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="003b6261-32e1-4f12-b85d-22a5343f66d7" path="/var/lib/kubelet/pods/003b6261-32e1-4f12-b85d-22a5343f66d7/volumes" Apr 23 14:36:30.013678 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:30.013649 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:36:30.021629 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:30.021605 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:36:47.959804 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:47.959754 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-ptldl_1344ba49-27f1-41a6-94d2-2e85595b528d/global-pull-secret-syncer/0.log" Apr 23 14:36:48.011710 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:48.011680 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6fv8j_bf4eb16e-4919-47aa-9bb2-0f615778f26d/konnectivity-agent/0.log" Apr 23 14:36:48.120143 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:48.120114 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-187.ec2.internal_43d3b8ea7119a14ffb4ca124c24a14eb/haproxy/0.log" Apr 23 14:36:51.799281 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:51.799255 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-txxls_b3e8f6c3-e685-4e07-abe9-e57a6f11b37a/cluster-monitoring-operator/0.log" Apr 23 14:36:52.141536 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:52.141460 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xvwsk_8a7e8903-b596-4d37-8bf0-b654a520433b/node-exporter/0.log" Apr 23 14:36:52.161145 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:52.161120 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xvwsk_8a7e8903-b596-4d37-8bf0-b654a520433b/kube-rbac-proxy/0.log" Apr 23 14:36:52.182396 ip-10-0-137-187 
kubenswrapper[2565]: I0423 14:36:52.182373 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xvwsk_8a7e8903-b596-4d37-8bf0-b654a520433b/init-textfile/0.log" Apr 23 14:36:53.930946 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:53.930919 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-knl89_cd536203-7ab7-44ff-86aa-4b70ff820188/networking-console-plugin/0.log" Apr 23 14:36:54.340646 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.340574 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/1.log" Apr 23 14:36:54.346545 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.346524 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtfn8_34a5e8b5-8ca7-40e3-978f-439d854e09b0/console-operator/2.log" Apr 23 14:36:54.931430 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.931398 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb"] Apr 23 14:36:54.931828 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.931706 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="storage-initializer" Apr 23 14:36:54.931828 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.931716 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="storage-initializer" Apr 23 14:36:54.931828 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.931726 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kube-rbac-proxy" Apr 23 14:36:54.931828 ip-10-0-137-187 kubenswrapper[2565]: I0423 
14:36:54.931731 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kube-rbac-proxy" Apr 23 14:36:54.931828 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.931746 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kserve-container" Apr 23 14:36:54.931828 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.931751 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kserve-container" Apr 23 14:36:54.931828 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.931777 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="003b6261-32e1-4f12-b85d-22a5343f66d7" containerName="storage-initializer" Apr 23 14:36:54.931828 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.931782 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="003b6261-32e1-4f12-b85d-22a5343f66d7" containerName="storage-initializer" Apr 23 14:36:54.931828 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.931788 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="003b6261-32e1-4f12-b85d-22a5343f66d7" containerName="storage-initializer" Apr 23 14:36:54.931828 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.931793 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="003b6261-32e1-4f12-b85d-22a5343f66d7" containerName="storage-initializer" Apr 23 14:36:54.931828 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.931835 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kserve-container" Apr 23 14:36:54.932172 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.931845 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="003b6261-32e1-4f12-b85d-22a5343f66d7" containerName="storage-initializer" Apr 23 14:36:54.932172 ip-10-0-137-187 
kubenswrapper[2565]: I0423 14:36:54.931851 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="003b6261-32e1-4f12-b85d-22a5343f66d7" containerName="storage-initializer" Apr 23 14:36:54.932172 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.931860 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e094469-580c-40ae-a3cc-ebef26f79197" containerName="kube-rbac-proxy" Apr 23 14:36:54.934676 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.934660 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:54.938106 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.938080 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-66tjc\"/\"default-dockercfg-6hdlv\"" Apr 23 14:36:54.938237 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.938172 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-66tjc\"/\"kube-root-ca.crt\"" Apr 23 14:36:54.939446 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.939427 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-66tjc\"/\"openshift-service-ca.crt\"" Apr 23 14:36:54.944753 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:54.944731 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb"] Apr 23 14:36:55.078303 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.078274 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l4sv\" (UniqueName: \"kubernetes.io/projected/ce895168-bf35-42d2-8527-0c02732671d6-kube-api-access-5l4sv\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.078478 
ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.078330 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce895168-bf35-42d2-8527-0c02732671d6-lib-modules\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.078478 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.078376 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ce895168-bf35-42d2-8527-0c02732671d6-proc\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.078478 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.078422 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce895168-bf35-42d2-8527-0c02732671d6-sys\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.078478 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.078446 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ce895168-bf35-42d2-8527-0c02732671d6-podres\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.179659 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.179630 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5l4sv\" (UniqueName: 
\"kubernetes.io/projected/ce895168-bf35-42d2-8527-0c02732671d6-kube-api-access-5l4sv\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.179803 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.179680 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce895168-bf35-42d2-8527-0c02732671d6-lib-modules\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.179803 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.179711 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ce895168-bf35-42d2-8527-0c02732671d6-proc\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.179803 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.179778 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce895168-bf35-42d2-8527-0c02732671d6-sys\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.179916 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.179812 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ce895168-bf35-42d2-8527-0c02732671d6-podres\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.179916 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.179860 
2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ce895168-bf35-42d2-8527-0c02732671d6-proc\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.179916 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.179887 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce895168-bf35-42d2-8527-0c02732671d6-lib-modules\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.180027 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.179914 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce895168-bf35-42d2-8527-0c02732671d6-sys\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.180027 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.179938 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ce895168-bf35-42d2-8527-0c02732671d6-podres\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.188592 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.188540 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l4sv\" (UniqueName: \"kubernetes.io/projected/ce895168-bf35-42d2-8527-0c02732671d6-kube-api-access-5l4sv\") pod \"perf-node-gather-daemonset-fwncb\" (UID: \"ce895168-bf35-42d2-8527-0c02732671d6\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" 
Apr 23 14:36:55.205045 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.205025 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-4sh6s_aebae10b-0ca7-46fd-860e-f45c0c031024/volume-data-source-validator/0.log" Apr 23 14:36:55.244549 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.244529 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.370108 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.370081 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb"] Apr 23 14:36:55.371827 ip-10-0-137-187 kubenswrapper[2565]: W0423 14:36:55.371795 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podce895168_bf35_42d2_8527_0c02732671d6.slice/crio-bde83b731e0a179ce86c4e5543dbfb735c31fdc835155bb066ef67f37cc9842e WatchSource:0}: Error finding container bde83b731e0a179ce86c4e5543dbfb735c31fdc835155bb066ef67f37cc9842e: Status 404 returned error can't find the container with id bde83b731e0a179ce86c4e5543dbfb735c31fdc835155bb066ef67f37cc9842e Apr 23 14:36:55.920586 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.920552 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" event={"ID":"ce895168-bf35-42d2-8527-0c02732671d6","Type":"ContainerStarted","Data":"58e62784fa8387ef2888d29fd71159c11886878b658f85b65838f0c18f1c5269"} Apr 23 14:36:55.920586 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.920583 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" event={"ID":"ce895168-bf35-42d2-8527-0c02732671d6","Type":"ContainerStarted","Data":"bde83b731e0a179ce86c4e5543dbfb735c31fdc835155bb066ef67f37cc9842e"} Apr 23 14:36:55.920886 
ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.920659 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:36:55.936178 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.936124 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" podStartSLOduration=1.936110851 podStartE2EDuration="1.936110851s" podCreationTimestamp="2026-04-23 14:36:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:36:55.935890239 +0000 UTC m=+3926.674434206" watchObservedRunningTime="2026-04-23 14:36:55.936110851 +0000 UTC m=+3926.674654818" Apr 23 14:36:55.990659 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:55.990632 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vf74v_5880bb15-7341-40f4-a23b-983d2d71912f/dns/0.log" Apr 23 14:36:56.012819 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:56.012788 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vf74v_5880bb15-7341-40f4-a23b-983d2d71912f/kube-rbac-proxy/0.log" Apr 23 14:36:56.085604 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:56.085582 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w5s22_4976cf12-11ff-427a-a58d-9f126da4f625/dns-node-resolver/0.log" Apr 23 14:36:56.497201 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:56.497171 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-78d5f7b556-pkxnd_3d29bc7d-2252-485d-b149-18b806d21365/registry/0.log" Apr 23 14:36:56.566604 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:56.566572 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-rjv7k_3a6f5afc-ae97-4be4-ad1c-c3af1a35a586/node-ca/0.log" Apr 23 14:36:57.266372 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:57.266309 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7b994fd948-vpkgz_20030382-369a-4a7a-bdb3-477e6d873b00/router/0.log" Apr 23 14:36:57.580198 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:57.580116 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fzqps_9a154f5a-c08f-4f54-b3d7-fea632c012c6/serve-healthcheck-canary/0.log" Apr 23 14:36:57.960987 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:57.960964 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-c6hg5_334930fe-79d2-4d7d-9fd2-1c2db1eaf771/insights-operator/0.log" Apr 23 14:36:57.962117 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:57.962097 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-c6hg5_334930fe-79d2-4d7d-9fd2-1c2db1eaf771/insights-operator/1.log" Apr 23 14:36:57.979077 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:57.979055 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9s6t5_3223e65f-a60d-40dc-895a-90af469a9129/kube-rbac-proxy/0.log" Apr 23 14:36:57.999457 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:57.999436 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9s6t5_3223e65f-a60d-40dc-895a-90af469a9129/exporter/0.log" Apr 23 14:36:58.020314 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:36:58.020295 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9s6t5_3223e65f-a60d-40dc-895a-90af469a9129/extractor/0.log" Apr 23 14:37:00.231995 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:00.231938 2565 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-6ks5v_59388ce3-3fdf-4929-aa70-04dc029a00e1/manager/0.log" Apr 23 14:37:00.799885 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:00.799857 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-5jtvj_f49064d1-2187-4ca9-9d04-0004a1321d9a/seaweedfs-tls-custom/0.log" Apr 23 14:37:01.933570 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:01.933541 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-fwncb" Apr 23 14:37:05.014578 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:05.014551 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-fmvmb_8ca1b051-dd5d-4c97-9809-95d139a9d692/migrator/0.log" Apr 23 14:37:05.034339 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:05.034312 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-fmvmb_8ca1b051-dd5d-4c97-9809-95d139a9d692/graceful-termination/0.log" Apr 23 14:37:05.351784 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:05.351687 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-7xk4g_512b3fcf-e8c1-4eb7-b755-9d8efa3083a5/kube-storage-version-migrator-operator/1.log" Apr 23 14:37:05.352516 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:05.352486 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-7xk4g_512b3fcf-e8c1-4eb7-b755-9d8efa3083a5/kube-storage-version-migrator-operator/0.log" Apr 23 14:37:06.430344 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:06.430319 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvwgw_967ed5b3-0337-40d9-872d-aa7a02b7c552/kube-multus-additional-cni-plugins/0.log" Apr 23 14:37:06.455041 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:06.455017 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvwgw_967ed5b3-0337-40d9-872d-aa7a02b7c552/egress-router-binary-copy/0.log" Apr 23 14:37:06.479691 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:06.479643 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvwgw_967ed5b3-0337-40d9-872d-aa7a02b7c552/cni-plugins/0.log" Apr 23 14:37:06.502012 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:06.501992 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvwgw_967ed5b3-0337-40d9-872d-aa7a02b7c552/bond-cni-plugin/0.log" Apr 23 14:37:06.526177 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:06.526159 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvwgw_967ed5b3-0337-40d9-872d-aa7a02b7c552/routeoverride-cni/0.log" Apr 23 14:37:06.545574 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:06.545557 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvwgw_967ed5b3-0337-40d9-872d-aa7a02b7c552/whereabouts-cni-bincopy/0.log" Apr 23 14:37:06.564416 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:06.564393 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvwgw_967ed5b3-0337-40d9-872d-aa7a02b7c552/whereabouts-cni/0.log" Apr 23 14:37:06.840976 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:06.840887 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b6gjz_1c553f28-0c89-4983-b30b-c0bdd06b63e6/kube-multus/0.log" Apr 23 14:37:06.963432 ip-10-0-137-187 
kubenswrapper[2565]: I0423 14:37:06.963405 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gdstf_6d6b50d4-32de-4031-b4e3-a88d3ce08d4d/network-metrics-daemon/0.log" Apr 23 14:37:06.985721 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:06.985682 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gdstf_6d6b50d4-32de-4031-b4e3-a88d3ce08d4d/kube-rbac-proxy/0.log" Apr 23 14:37:08.400414 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:08.400369 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vxhp2_a47ff253-1704-447a-b1cd-4a1b12019c92/ovn-controller/0.log" Apr 23 14:37:08.450174 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:08.450149 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vxhp2_a47ff253-1704-447a-b1cd-4a1b12019c92/ovn-acl-logging/0.log" Apr 23 14:37:08.469541 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:08.469493 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vxhp2_a47ff253-1704-447a-b1cd-4a1b12019c92/kube-rbac-proxy-node/0.log" Apr 23 14:37:08.489447 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:08.489426 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vxhp2_a47ff253-1704-447a-b1cd-4a1b12019c92/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 14:37:08.508108 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:08.508088 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vxhp2_a47ff253-1704-447a-b1cd-4a1b12019c92/northd/0.log" Apr 23 14:37:08.529659 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:08.529643 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vxhp2_a47ff253-1704-447a-b1cd-4a1b12019c92/nbdb/0.log" Apr 23 14:37:08.554171 
ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:08.554153 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vxhp2_a47ff253-1704-447a-b1cd-4a1b12019c92/sbdb/0.log" Apr 23 14:37:08.651288 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:08.651195 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vxhp2_a47ff253-1704-447a-b1cd-4a1b12019c92/ovnkube-controller/0.log" Apr 23 14:37:09.636105 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:09.636069 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-d6r6p_1dffdca5-c142-4766-a823-9d817e2c5ef5/check-endpoints/0.log" Apr 23 14:37:09.677687 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:09.677666 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-hclwj_d6413ec2-e315-417e-9b7d-ce057e4f10a3/network-check-target-container/0.log" Apr 23 14:37:10.580785 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:10.580737 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-97wbq_676c8632-4468-4e42-b6fb-2a866baddda7/iptables-alerter/0.log" Apr 23 14:37:11.271993 ip-10-0-137-187 kubenswrapper[2565]: I0423 14:37:11.271968 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-b789s_d604becf-afb4-4b3f-aaec-3618178f4dfe/tuned/0.log"