Apr 17 16:28:35.185662 ip-10-0-138-137 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 16:28:35.185674 ip-10-0-138-137 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 16:28:35.185681 ip-10-0-138-137 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 16:28:35.185913 ip-10-0-138-137 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 16:28:45.214935 ip-10-0-138-137 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 16:28:45.214954 ip-10-0-138-137 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 05de3b23c4d5482fba4463a083d1a49c --
Apr 17 16:31:06.524982 ip-10-0-138-137 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:31:06.995940 ip-10-0-138-137 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:06.995940 ip-10-0-138-137 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:31:06.995940 ip-10-0-138-137 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:06.995940 ip-10-0-138-137 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:31:06.995940 ip-10-0-138-137 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:06.997756 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:06.997666    2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:31:07.000003 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:06.999985    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:07.000003 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000003    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000007    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000011    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000014    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000017    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000020    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000023    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000026    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000029    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000032    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000035    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000041    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000045    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000047    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000050    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000053    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000055    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000058    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000061    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000063    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:07.000071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000066    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000068    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000071    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000074    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000076    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000079    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000082    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000085    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000088    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000091    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000093    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000096    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000098    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000101    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000104    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000106    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000109    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000111    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000114    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000116    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:07.000556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000119    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000122    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000124    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000127    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000130    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000133    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000135    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000139    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000141    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000143    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000146    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000148    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000151    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000153    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000156    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000159    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000162    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000164    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000167    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000169    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:07.001069 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000172    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000176    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000179    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000182    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000185    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000188    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000190    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000192    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000195    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000198    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000200    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000203    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000205    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000209    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000212    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000215    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000218    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000220    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000223    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000226    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:07.001713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000228    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000231    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000233    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000236    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000238    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000640    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000660    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000664    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000667    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000671    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000675    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000695    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000699    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000702    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000705    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000708    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000711    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000714    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000717    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:07.002200 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000719    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000722    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000724    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000727    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000730    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000732    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000735    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000737    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000740    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000742    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000746    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000749    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000751    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000754    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000757    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000759    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000762    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000765    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000767    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000770    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:07.002679 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000773    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000775    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000778    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000780    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000785    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000788    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000791    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000794    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000797    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000801    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000804    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000806    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000809    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000812    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000814    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000817    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000819    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000822    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000825    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000827    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:07.003184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000830    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000833    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000836    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000838    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000841    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000844    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000846    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000848    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000851    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000854    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000857    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000860    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000862    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000865    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000867    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000870    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000872    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000874    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000877    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000879    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:07.003707 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000881    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000884    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000886    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000889    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000892    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000894    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000897    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000900    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000902    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000905    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000907    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.000910    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.000985    2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.000993    2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001000    2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001004    2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001010    2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001014    2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001018    2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001023    2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001026    2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 16:31:07.004198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001029    2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001032    2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001036    2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001039    2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001042    2572 flags.go:64] FLAG: --cgroup-root=""
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001045    2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001048    2572 flags.go:64] FLAG: --client-ca-file=""
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001051    2572 flags.go:64] FLAG: --cloud-config=""
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001054    2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001057    2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001061    2572 flags.go:64] FLAG: --cluster-domain=""
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001064    2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001067    2572 flags.go:64] FLAG: --config-dir=""
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001070    2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001074    2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001078    2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001081    2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001085    2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001088    2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001092    2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001095    2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001098    2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001101    2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001104    2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001109    2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 16:31:07.004725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001112    2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001115    2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001118    2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001122    2572 flags.go:64] FLAG: --enable-server="true"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001125    2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001130    2572 flags.go:64] FLAG: --event-burst="100"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001134    2572 flags.go:64] FLAG: --event-qps="50"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001137    2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001140    2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001143    2572 flags.go:64] FLAG: --eviction-hard=""
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001147    2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001150    2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001153    2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001156    2572 flags.go:64] FLAG: --eviction-soft=""
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001160    2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001162    2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001166    2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001172    2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001175    2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001178    2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001184    2572 flags.go:64] FLAG: --feature-gates=""
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001188    2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001192    2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001195    2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001198    2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001201    2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 16:31:07.005340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001204    2572 flags.go:64] FLAG: --help="false"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001207    2572 flags.go:64] FLAG: --hostname-override="ip-10-0-138-137.ec2.internal"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001210    2572 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001213    2572 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001216    2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001219    2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001223    2572 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001226    2572 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001229    2572 flags.go:64] FLAG: --image-service-endpoint=""
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001231    2572 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001235    2572 flags.go:64] FLAG: --kube-api-burst="100"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001238    2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001241    2572 flags.go:64] FLAG: --kube-api-qps="50"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001244    2572 flags.go:64] FLAG: --kube-reserved=""
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001247    2572 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001250    2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001253    2572 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001255    2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001258    2572 flags.go:64] FLAG: --lock-file=""
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001261    2572 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001264    2572 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001267    2572 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001272    2572 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 17 16:31:07.005995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001276    2572 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001279    2572 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001282    2572 flags.go:64] FLAG: --logging-format="text"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001287    2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001290    2572 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001293    2572 flags.go:64] FLAG: --manifest-url=""
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001296    2572 flags.go:64] FLAG: --manifest-url-header=""
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001301    2572 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001304    2572 flags.go:64] FLAG: --max-open-files="1000000"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001309    2572 flags.go:64] FLAG: --max-pods="110"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001312    2572 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001315    2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001318    2572 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001321    2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001324    2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001327    2572 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001330    2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001339    2572 flags.go:64] FLAG: --node-status-max-images="50"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001342    2572 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001345    2572 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001348    2572 flags.go:64] FLAG: --pod-cidr=""
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001351    2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001358    2572 flags.go:64] FLAG: --pod-manifest-path=""
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001361    2572 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 17 16:31:07.006591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001364    2572 flags.go:64] FLAG: --pods-per-core="0"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001368    2572 flags.go:64] FLAG: --port="10250"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001371    2572 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001373    2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0dd5ddcee5549b4f8"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001376    2572 flags.go:64] FLAG: --qos-reserved=""
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001380    2572 flags.go:64] FLAG: --read-only-port="10255"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001383    2572 flags.go:64] FLAG: --register-node="true"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001385    2572 flags.go:64] FLAG: --register-schedulable="true"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001389    2572 flags.go:64] FLAG: --register-with-taints=""
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001400    2572 flags.go:64] FLAG: --registry-burst="10"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001403    2572 flags.go:64] FLAG: --registry-qps="5"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001406    2572 flags.go:64] FLAG: --reserved-cpus=""
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001410    2572 flags.go:64] FLAG: --reserved-memory=""
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001414    2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001417    2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001420    2572 flags.go:64] FLAG: --rotate-certificates="false"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001423    2572 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001426    2572 flags.go:64] FLAG: --runonce="false"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001429    2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001432    2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001435    2572 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001438    2572 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001441    2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001444    2572 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001447    2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001450    2572 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 16:31:07.007198 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001453    2572 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001456    2572 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001458    2572 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001462    2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001465    2572 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001467    2572 flags.go:64] FLAG: --system-cgroups=""
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001470    2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001476    2572 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001478    2572 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001481    2572 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001485    2572 flags.go:64] FLAG: --tls-min-version=""
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001488    2572 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001491    2572 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001494    2572 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001499    2572 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001502    2572 flags.go:64] FLAG: --v="2"
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001507    2572 flags.go:64] FLAG: --version="false"
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001511    2572 flags.go:64] FLAG: --vmodule=""
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001516    2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.001519    2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001610    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001614    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001617    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001620    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:07.007849 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001624    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001627    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001629    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001632    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001635    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001637    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001640    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001642    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001660    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001663    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001666    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001668    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001672    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001674    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001677    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001680    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001683    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001685    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001688    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:07.008681 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001691    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001695    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001698    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001703    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001706    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001708    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001713    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001718    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001721    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001724    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001727    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001730    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001733    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001735    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001738    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001741    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001743    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001746    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001748    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001751    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:07.009320 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001754    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001757    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001759    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001762    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001764    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001767    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001770    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001772    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001775    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001777    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001780    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001784    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001788 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001791 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001793 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001798 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001800 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001803 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001805 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:07.009854 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001810 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001812 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001815 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001818 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001820 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001823 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001825 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001829 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001832 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001834 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001837 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001839 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001842 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001845 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001847 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:07.010365 
ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001850 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001852 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001855 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001857 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:07.010365 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001860 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001863 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001866 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001868 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.001871 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.002434 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.009973 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.010093 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010163 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010169 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010172 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010175 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010178 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010182 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010184 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010187 2572 feature_gate.go:328] unrecognized feature gate: NewOLM 
Apr 17 16:31:07.010864 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010190 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010192 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010195 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010197 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010200 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010202 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010205 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010208 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010210 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010213 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010216 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010218 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010221 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010224 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010227 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010229 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010232 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010235 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010237 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010240 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:07.011264 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010242 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010245 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: 
W0417 16:31:07.010247 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010250 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010253 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010256 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010258 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010261 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010264 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010266 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010269 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010271 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010274 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010276 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010279 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010281 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010284 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010286 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010289 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010292 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:07.011781 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010294 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010297 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010299 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010302 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010304 2572 feature_gate.go:328] unrecognized 
feature gate: SigstoreImageVerification Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010307 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010310 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010312 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010315 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010317 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010321 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010326 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010329 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010332 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010334 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010339 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010343 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010346 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010349 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:07.012282 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010352 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010354 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010357 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010360 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010363 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010365 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010368 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010371 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010373 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010377 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010380 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010383 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010386 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010389 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010391 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010394 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010396 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010399 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:07.012779 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010402 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.010407 2572 
feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010525 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010530 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010533 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010536 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010539 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010541 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010544 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010546 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010549 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010552 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010555 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010558 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010560 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:07.013236 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010563 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010566 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010568 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010571 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010573 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010576 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:07.013616 ip-10-0-138-137 
kubenswrapper[2572]: W0417 16:31:07.010578 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010581 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010583 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010586 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010589 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010591 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010594 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010597 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010600 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010602 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010604 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010608 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010611 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010613 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:07.013616 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010616 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010618 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010621 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010623 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010626 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010628 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010630 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010635 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010639 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010642 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010661 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010666 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010669 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010672 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010675 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010678 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010682 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010685 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010688 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:07.014210 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010691 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010694 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010696 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010699 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010702 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010705 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010707 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010710 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010712 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010715 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010718 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:07.014698 ip-10-0-138-137 
kubenswrapper[2572]: W0417 16:31:07.010720 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010723 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010726 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010728 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010731 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010734 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010736 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010739 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010741 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:07.014698 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010744 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010747 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010749 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010751 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010754 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010756 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010759 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010761 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010764 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010766 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010769 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010772 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010774 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:07.010776 2572 feature_gate.go:328] 
unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.010781 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:31:07.015181 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.011865 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 16:31:07.015562 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.014227 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 16:31:07.015562 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.015335 2572 server.go:1019] "Starting client certificate rotation"
Apr 17 16:31:07.015562 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.015429 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:31:07.015562 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.015471 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:31:07.039479 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.039450 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:31:07.044290 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.044267 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:31:07.071484 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.071459 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 17 16:31:07.079882 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.079836 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:31:07.080522 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.080503 2572 log.go:25] "Validated CRI v1 image API"
Apr 17 16:31:07.081798 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.081781 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 16:31:07.085736 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.085713 2572 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 cdbc6604-d6ed-4e44-a1a3-fdf1d8941340:/dev/nvme0n1p4 ffd8df1e-7735-4a19-a28a-c6a47750d72d:/dev/nvme0n1p3]
Apr 17 16:31:07.085816 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.085736 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 16:31:07.090913 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.090800 2572 manager.go:217] Machine: {Timestamp:2026-04-17 16:31:07.089446074 +0000 UTC m=+0.438075615 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098463 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec256d74a78acc748935a12c381c43b2 SystemUUID:ec256d74-a78a-cc74-8935-a12c381c43b2 BootID:05de3b23-c4d5-482f-ba44-63a083d1a49c Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c7:be:e0:ff:7f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c7:be:e0:ff:7f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7e:d8:a3:4d:eb:8c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 16:31:07.090913 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.090906 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 16:31:07.091021 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.090993 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 16:31:07.092756 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.092725 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 16:31:07.092904 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.092759 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-137.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 16:31:07.092950 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.092915 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 16:31:07.092950 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.092924 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 16:31:07.092950 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.092936 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:31:07.093712 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.093701 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:31:07.094930 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.094919 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 16:31:07.095037 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.095028 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 16:31:07.097479 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.097469 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 16:31:07.097512 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.097482 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 16:31:07.097512 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.097494 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 16:31:07.097512 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.097503 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 17 16:31:07.097512 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.097511 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 16:31:07.098677 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.098664 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:31:07.098730 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.098684 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:31:07.101813 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.101793 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 16:31:07.103126 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.103113 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 16:31:07.105526 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.105514 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 16:31:07.105582 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.105533 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 16:31:07.105582 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.105539 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 16:31:07.105582 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.105549 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 16:31:07.105582 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.105558 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 16:31:07.105582 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.105568 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 16:31:07.105582 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.105575 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 16:31:07.105582 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.105581 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 16:31:07.105798 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.105588 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 16:31:07.105798 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.105594 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 16:31:07.105798 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.105611 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 16:31:07.105798 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.105624 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 16:31:07.106526 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.106516 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 16:31:07.106526 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.106526 2572 plugins.go:616] "Loaded volume plugin"
pluginName="kubernetes.io/image"
Apr 17 16:31:07.110143 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.110127 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 16:31:07.110236 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.110166 2572 server.go:1295] "Started kubelet"
Apr 17 16:31:07.110353 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.110238 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 16:31:07.110757 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.110709 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 16:31:07.110814 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.110765 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 16:31:07.110814 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.110785 2572 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 16:31:07.110931 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.110829 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-137.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 16:31:07.110979 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.110937 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-137.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 16:31:07.111133 ip-10-0-138-137 systemd[1]: Started Kubernetes Kubelet.
Apr 17 16:31:07.112623 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.112605 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 16:31:07.113642 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.113628 2572 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 16:31:07.117618 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.117595 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 16:31:07.118121 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.118104 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 16:31:07.123273 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.122430 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 16:31:07.123273 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.122455 2572 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 16:31:07.123273 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.122628 2572 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 16:31:07.123273 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.122753 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-137.ec2.internal\" not found"
Apr 17 16:31:07.123273 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.122825 2572 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 16:31:07.123273 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.122833 2572 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 16:31:07.123676 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.123639 2572 factory.go:55] Registering systemd factory
Apr 17 16:31:07.123733 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.123690 2572 factory.go:223] Registration of the systemd container factory successfully
Apr 17 16:31:07.124105 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.124061 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lbkhd"
Apr 17 16:31:07.124189 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.124145 2572 factory.go:153] Registering CRI-O factory
Apr 17 16:31:07.124189 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.124158 2572 factory.go:223] Registration of the crio container factory successfully
Apr 17 16:31:07.124291 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.124214 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 16:31:07.124291 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.124239 2572 factory.go:103] Registering Raw factory
Apr 17 16:31:07.124291 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.124255 2572 manager.go:1196] Started watching for new ooms in manager
Apr 17 16:31:07.124429 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.124318 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 16:31:07.124677 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.124644 2572 manager.go:319] Starting recovery of all containers
Apr 17 16:31:07.125131 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.118192 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-137.ec2.internal.18a731ebdca9c568 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-137.ec2.internal,UID:ip-10-0-138-137.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-137.ec2.internal,},FirstTimestamp:2026-04-17 16:31:07.11013924 +0000 UTC m=+0.458768779,LastTimestamp:2026-04-17 16:31:07.11013924 +0000 UTC m=+0.458768779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-137.ec2.internal,}"
Apr 17 16:31:07.128747 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.128716 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 17 16:31:07.128876 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.128850 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-137.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 17 16:31:07.132349 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.132327 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lbkhd"
Apr 17 16:31:07.135377 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.135362 2572 manager.go:324] Recovery completed
Apr 17 16:31:07.139493 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.139480 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:31:07.142061 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.142046 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:31:07.142124 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.142079 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:31:07.142124 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.142096 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:31:07.142700 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.142684 2572 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 16:31:07.142700 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.142699 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 16:31:07.142806 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.142717 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 16:31:07.144173 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.144111 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-137.ec2.internal.18a731ebde90da41 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-137.ec2.internal,UID:ip-10-0-138-137.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-137.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-137.ec2.internal,},FirstTimestamp:2026-04-17 16:31:07.142060609 +0000 UTC m=+0.490690148,LastTimestamp:2026-04-17 16:31:07.142060609 +0000 UTC m=+0.490690148,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-137.ec2.internal,}"
Apr 17 16:31:07.145850 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.145837 2572 policy_none.go:49] "None policy: Start"
Apr 17 16:31:07.145912 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.145854 2572 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 16:31:07.145912 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.145864 2572 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 16:31:07.202636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.183209 2572 manager.go:341] "Starting Device Plugin manager"
Apr 17 16:31:07.202636 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.183246 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 16:31:07.202636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.183255 2572 server.go:85] "Starting device plugin registration server"
Apr 17 16:31:07.202636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.183526 2572 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 16:31:07.202636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.183537 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 16:31:07.202636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.183608 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 16:31:07.202636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.183761 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 16:31:07.202636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.183770 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 16:31:07.202636 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.184641 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 16:31:07.202636 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.184695 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-137.ec2.internal\" not found"
Apr 17 16:31:07.220292 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.220254 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 16:31:07.221522 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.221502 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 16:31:07.221625 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.221536 2572 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 16:31:07.221625 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.221580 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 16:31:07.221625 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.221589 2572 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 16:31:07.221760 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.221633 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 16:31:07.225762 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.225745 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:07.283991 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.283907 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:31:07.285207 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.285188 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:31:07.285356 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.285218 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:31:07.285356 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.285228 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:31:07.285356 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.285249 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-137.ec2.internal"
Apr 17 16:31:07.294309 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.294294 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-137.ec2.internal"
Apr 17 16:31:07.294362 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.294317 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-137.ec2.internal\": node \"ip-10-0-138-137.ec2.internal\" not found"
Apr 17 16:31:07.307939 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.307906 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-137.ec2.internal\" not found"
Apr 17 16:31:07.322071 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.322035 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-137.ec2.internal"]
Apr 17 16:31:07.322176 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.322165 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:31:07.323348 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.323332 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6824474d16553c4df67090e9feda5837-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal\" (UID: \"6824474d16553c4df67090e9feda5837\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.323408 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.323357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6824474d16553c4df67090e9feda5837-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal\" (UID: \"6824474d16553c4df67090e9feda5837\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.323882 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.323865 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:07.323979 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.323891 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:07.323979 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.323903 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:07.326182 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.326171 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:07.326372 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.326358 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.326408 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.326391 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:07.326953 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.326934 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:07.326953 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.326944 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:07.327104 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.326961 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:07.327104 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.326965 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:07.327104 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.326980 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:07.327104 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.326993 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:07.329192 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.329175 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.329255 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.329207 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:07.331094 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.331078 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:07.331175 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.331105 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:07.331175 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.331114 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:07.356662 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.356617 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-137.ec2.internal\" not found" node="ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.361038 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.361017 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-137.ec2.internal\" not found" node="ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.408476 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.408444 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-137.ec2.internal\" not found" Apr 17 16:31:07.423511 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.423479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6824474d16553c4df67090e9feda5837-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal\" (UID: \"6824474d16553c4df67090e9feda5837\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.423675 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.423516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6824474d16553c4df67090e9feda5837-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal\" (UID: \"6824474d16553c4df67090e9feda5837\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.423675 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.423576 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6824474d16553c4df67090e9feda5837-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal\" (UID: \"6824474d16553c4df67090e9feda5837\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.423675 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.423586 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6824474d16553c4df67090e9feda5837-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal\" (UID: \"6824474d16553c4df67090e9feda5837\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.508626 
ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.508595 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-137.ec2.internal\" not found" Apr 17 16:31:07.524082 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.524044 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/692377e0184489441f39fe1105bfb2c4-config\") pod \"kube-apiserver-proxy-ip-10-0-138-137.ec2.internal\" (UID: \"692377e0184489441f39fe1105bfb2c4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.609570 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.609505 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-137.ec2.internal\" not found" Apr 17 16:31:07.625048 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.625015 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/692377e0184489441f39fe1105bfb2c4-config\") pod \"kube-apiserver-proxy-ip-10-0-138-137.ec2.internal\" (UID: \"692377e0184489441f39fe1105bfb2c4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.625116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.625079 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/692377e0184489441f39fe1105bfb2c4-config\") pod \"kube-apiserver-proxy-ip-10-0-138-137.ec2.internal\" (UID: \"692377e0184489441f39fe1105bfb2c4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.658135 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.658105 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.663729 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:07.663711 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-137.ec2.internal" Apr 17 16:31:07.710062 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.710023 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-137.ec2.internal\" not found" Apr 17 16:31:07.810724 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.810689 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-137.ec2.internal\" not found" Apr 17 16:31:07.911405 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:07.911311 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-137.ec2.internal\" not found" Apr 17 16:31:08.012470 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:08.012436 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-137.ec2.internal\" not found" Apr 17 16:31:08.015605 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.015586 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 16:31:08.015776 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.015760 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 16:31:08.113356 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:08.113319 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-137.ec2.internal\" not found" Apr 17 16:31:08.118282 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.118257 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 16:31:08.134926 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.134889 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:26:07 +0000 UTC" deadline="2027-12-18 13:36:32.406320673 +0000 UTC" Apr 17 16:31:08.134926 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.134924 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14637h5m24.271401094s" Apr 17 16:31:08.135114 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.135076 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:31:08.159469 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.159440 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-987d8" Apr 17 16:31:08.170622 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.170550 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-987d8" Apr 17 16:31:08.194134 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.194105 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:08.217940 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:08.217773 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod692377e0184489441f39fe1105bfb2c4.slice/crio-699b2099ac92eb0a75b9983b40ce4a0fb88e96ac1ed83e3bb7288e0894109cf2 WatchSource:0}: Error finding container 699b2099ac92eb0a75b9983b40ce4a0fb88e96ac1ed83e3bb7288e0894109cf2: Status 404 returned error can't find the container with id 699b2099ac92eb0a75b9983b40ce4a0fb88e96ac1ed83e3bb7288e0894109cf2 Apr 17 16:31:08.218184 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:08.218165 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6824474d16553c4df67090e9feda5837.slice/crio-4a3b5df26c78cc0c2e989cd9f32da540f380dcb73ed5e4b8900c22b4370ae7e6 WatchSource:0}: Error finding container 4a3b5df26c78cc0c2e989cd9f32da540f380dcb73ed5e4b8900c22b4370ae7e6: Status 404 returned error can't find the container with id 4a3b5df26c78cc0c2e989cd9f32da540f380dcb73ed5e4b8900c22b4370ae7e6 Apr 17 16:31:08.223008 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.222987 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal" Apr 17 16:31:08.223119 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.223034 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:31:08.224906 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.224862 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-137.ec2.internal" event={"ID":"692377e0184489441f39fe1105bfb2c4","Type":"ContainerStarted","Data":"699b2099ac92eb0a75b9983b40ce4a0fb88e96ac1ed83e3bb7288e0894109cf2"} Apr 17 16:31:08.225982 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.225965 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal" event={"ID":"6824474d16553c4df67090e9feda5837","Type":"ContainerStarted","Data":"4a3b5df26c78cc0c2e989cd9f32da540f380dcb73ed5e4b8900c22b4370ae7e6"} Apr 17 16:31:08.238644 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.238623 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:31:08.240333 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.240319 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-137.ec2.internal" Apr 17 16:31:08.250222 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.250204 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:31:08.514820 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.514738 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:08.612874 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:08.612843 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:09.098846 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.098816 2572 apiserver.go:52] "Watching apiserver" Apr 17 16:31:09.107466 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.107444 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 
16:31:09.110801 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.110771 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-bbt78","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb","openshift-dns/node-resolver-cbmjr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal","openshift-multus/multus-fkm8h","openshift-network-operator/iptables-alerter-pxs2x","openshift-ovn-kubernetes/ovnkube-node-gjfdq","kube-system/konnectivity-agent-8hctm","kube-system/kube-apiserver-proxy-ip-10-0-138-137.ec2.internal","openshift-cluster-node-tuning-operator/tuned-6hdkr","openshift-image-registry/node-ca-2h7tr","openshift-multus/multus-additional-cni-plugins-bdk96","openshift-multus/network-metrics-daemon-w6ttr"] Apr 17 16:31:09.113868 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.113846 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.116472 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.116450 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 16:31:09.116573 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.116472 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:31:09.116573 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.116451 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-v7ztv\"" Apr 17 16:31:09.117915 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.117890 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.120849 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.120580 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 16:31:09.120948 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.120886 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fcg7b\"" Apr 17 16:31:09.121063 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.120587 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.121272 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.121248 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 16:31:09.121572 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.121554 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 16:31:09.123271 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.122961 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-cbmjr" Apr 17 16:31:09.123497 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.123466 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 16:31:09.123590 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.123511 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:09.123661 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.123619 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 16:31:09.124404 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.124169 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 16:31:09.124404 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.124185 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xhmkt\"" Apr 17 16:31:09.124404 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.124265 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 16:31:09.124779 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.124761 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-xwt9g\"" Apr 17 16:31:09.124865 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.124766 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 16:31:09.124937 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.124920 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 16:31:09.125509 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.125485 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pxs2x" Apr 17 16:31:09.125664 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.125628 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.127828 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.127799 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 16:31:09.127921 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.127885 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:31:09.128092 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.128072 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-8hctm" Apr 17 16:31:09.128181 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.128116 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 16:31:09.129295 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.129194 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 16:31:09.129295 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.129191 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 16:31:09.129295 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.129200 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 16:31:09.129295 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.129262 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 16:31:09.129295 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.129284 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-d6lcz\"" Apr 17 16:31:09.129573 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.129219 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 16:31:09.129573 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.129430 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-z8mmd\"" Apr 17 16:31:09.129698 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.129671 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 16:31:09.130307 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.130291 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:09.130396 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:09.130341 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:09.130550 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.130512 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 16:31:09.130693 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.130670 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-b82wc\"" Apr 17 16:31:09.131134 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.131110 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 16:31:09.132568 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.132549 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2h7tr" Apr 17 16:31:09.132931 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.132690 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-hostroot\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.132931 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.132723 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c0d53844-6ca3-4f97-9404-6cec628fe368-multus-daemon-config\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.132931 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.132750 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d458d\" (UniqueName: \"kubernetes.io/projected/3fa6420e-46d0-4ade-82cb-f5e03e235d26-kube-api-access-d458d\") pod \"iptables-alerter-pxs2x\" (UID: \"3fa6420e-46d0-4ade-82cb-f5e03e235d26\") " pod="openshift-network-operator/iptables-alerter-pxs2x" Apr 17 16:31:09.132931 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.132776 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkjvc\" (UniqueName: \"kubernetes.io/projected/9748097b-af4c-40c0-b6c6-261863bca7b4-kube-api-access-bkjvc\") pod \"node-resolver-cbmjr\" (UID: \"9748097b-af4c-40c0-b6c6-261863bca7b4\") " pod="openshift-dns/node-resolver-cbmjr" Apr 17 16:31:09.132931 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.132801 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-sysconfig\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.132931 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.132822 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-sysctl-d\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.132931 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.132844 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-sysctl-conf\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.132931 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.132867 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-sys\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.132931 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.132893 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b39db4d3-2acf-485f-ae1c-871507b98494-tmp\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.132931 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.132925 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-var-lib-cni-multus\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.132950 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b39db4d3-2acf-485f-ae1c-871507b98494-etc-tuned\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133018 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-device-dir\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133050 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-system-cni-dir\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133092 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-etc-kubernetes\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133115 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-multus-cni-dir\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133142 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-run-k8s-cni-cncf-io\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133172 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-run-netns\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " 
pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133187 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-var-lib-cni-bin\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133202 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-var-lib-kubelet\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133224 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-kubernetes\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133243 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-host\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133258 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-sys-fs\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133271 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9xz7\" (UniqueName: \"kubernetes.io/projected/c0d53844-6ca3-4f97-9404-6cec628fe368-kube-api-access-p9xz7\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133286 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3fa6420e-46d0-4ade-82cb-f5e03e235d26-host-slash\") pod \"iptables-alerter-pxs2x\" (UID: \"3fa6420e-46d0-4ade-82cb-f5e03e235d26\") " pod="openshift-network-operator/iptables-alerter-pxs2x" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133324 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9748097b-af4c-40c0-b6c6-261863bca7b4-tmp-dir\") pod \"node-resolver-cbmjr\" (UID: \"9748097b-af4c-40c0-b6c6-261863bca7b4\") " pod="openshift-dns/node-resolver-cbmjr" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133356 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-var-lib-kubelet\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.133455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133380 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-systemd\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133442 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-registration-dir\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133476 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-etc-selinux\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133516 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-os-release\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133559 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-multus-conf-dir\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133588 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-run\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133610 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-lib-modules\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133671 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzrzz\" (UniqueName: \"kubernetes.io/projected/b39db4d3-2acf-485f-ae1c-871507b98494-kube-api-access-lzrzz\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " 
pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133719 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9748097b-af4c-40c0-b6c6-261863bca7b4-hosts-file\") pod \"node-resolver-cbmjr\" (UID: \"9748097b-af4c-40c0-b6c6-261863bca7b4\") " pod="openshift-dns/node-resolver-cbmjr" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133757 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c0d53844-6ca3-4f97-9404-6cec628fe368-cni-binary-copy\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133784 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-multus-socket-dir-parent\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-run-multus-certs\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133851 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3fa6420e-46d0-4ade-82cb-f5e03e235d26-iptables-alerter-script\") pod \"iptables-alerter-pxs2x\" (UID: \"3fa6420e-46d0-4ade-82cb-f5e03e235d26\") " pod="openshift-network-operator/iptables-alerter-pxs2x" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133888 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-modprobe-d\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133918 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79k77\" (UniqueName: \"kubernetes.io/projected/01d44556-4af6-4271-b348-0e8b2c60961d-kube-api-access-79k77\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133939 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.134248 ip-10-0-138-137 kubenswrapper[2572]: I0417 
16:31:09.133965 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-socket-dir\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.135036 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.133987 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-cnibin\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.135036 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.134581 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 16:31:09.135036 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.134767 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 16:31:09.135036 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.134828 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 16:31:09.135036 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.134957 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fldzd\"" Apr 17 16:31:09.137230 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.137209 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:09.137308 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:09.137278 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:09.137308 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.137213 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.139457 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.139395 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xxvfp\"" Apr 17 16:31:09.139677 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.139658 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 16:31:09.139765 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.139722 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 16:31:09.171327 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.171303 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:08 +0000 UTC" deadline="2027-10-05 22:48:58.529628627 +0000 UTC" Apr 17 16:31:09.171327 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.171326 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12870h17m49.358305343s" Apr 17 16:31:09.224060 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.224036 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 16:31:09.235168 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235141 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.235313 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235176 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.235313 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235194 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-var-lib-openvswitch\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.235313 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235229 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-run-openvswitch\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.235313 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45-host\") pod \"node-ca-2h7tr\" (UID: 
\"1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45\") " pod="openshift-image-registry/node-ca-2h7tr" Apr 17 16:31:09.235313 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235274 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b39db4d3-2acf-485f-ae1c-871507b98494-etc-tuned\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.235556 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-device-dir\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.235556 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235338 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-system-cni-dir\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.235556 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235362 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-etc-kubernetes\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.235556 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235389 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-kubelet\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.235556 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235413 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1faab54-1fd0-4e7f-8959-5e580bbd833d-env-overrides\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.235556 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235410 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-device-dir\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.235556 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235443 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-system-cni-dir\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.235556 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-etc-kubernetes\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.235556 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235495 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq2r8\" (UniqueName: \"kubernetes.io/projected/e1faab54-1fd0-4e7f-8959-5e580bbd833d-kube-api-access-dq2r8\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.235556 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235527 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-run-netns\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.235556 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-var-lib-cni-bin\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.236001 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235577 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-var-lib-kubelet\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.236001 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-var-lib-kubelet\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.236001 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235627 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-var-lib-cni-bin\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.236001 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235679 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 16:31:09.236001 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235713 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-run-netns\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.236001 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235748 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-sys-fs\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.236001 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235806 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-os-release\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.236001 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235831 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-run-systemd\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.236001 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235845 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-sys-fs\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.236001 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235855 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-etc-openvswitch\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.236001 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-cni-bin\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.236001 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235922 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v989\" (UniqueName: \"kubernetes.io/projected/1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45-kube-api-access-4v989\") pod \"node-ca-2h7tr\" (UID: \"1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45\") " pod="openshift-image-registry/node-ca-2h7tr" Apr 17 16:31:09.236001 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235956 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-systemd\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.236001 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.235981 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-registration-dir\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236019 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-multus-conf-dir\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236058 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3fa6420e-46d0-4ade-82cb-f5e03e235d26-iptables-alerter-script\") pod \"iptables-alerter-pxs2x\" (UID: \"3fa6420e-46d0-4ade-82cb-f5e03e235d26\") " pod="openshift-network-operator/iptables-alerter-pxs2x" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236076 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-registration-dir\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236086 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d458d\" (UniqueName: \"kubernetes.io/projected/3fa6420e-46d0-4ade-82cb-f5e03e235d26-kube-api-access-d458d\") pod \"iptables-alerter-pxs2x\" (UID: \"3fa6420e-46d0-4ade-82cb-f5e03e235d26\") " pod="openshift-network-operator/iptables-alerter-pxs2x" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236068 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-systemd\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-run\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236114 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-multus-conf-dir\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " 
pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236141 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-lib-modules\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9748097b-af4c-40c0-b6c6-261863bca7b4-hosts-file\") pod \"node-resolver-cbmjr\" (UID: \"9748097b-af4c-40c0-b6c6-261863bca7b4\") " pod="openshift-dns/node-resolver-cbmjr" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236178 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c0d53844-6ca3-4f97-9404-6cec628fe368-cni-binary-copy\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236203 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1faab54-1fd0-4e7f-8959-5e580bbd833d-ovnkube-script-lib\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236223 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-modprobe-d\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236240 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236264 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-cnibin\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236279 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c0d53844-6ca3-4f97-9404-6cec628fe368-multus-daemon-config\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236296 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/95c0a6c1-6228-4af3-acf9-47e2e061f7bf-agent-certs\") pod \"konnectivity-agent-8hctm\" (UID: \"95c0a6c1-6228-4af3-acf9-47e2e061f7bf\") " pod="kube-system/konnectivity-agent-8hctm" Apr 17 16:31:09.236636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236310 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-log-socket\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236325 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-cni-netd\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236346 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-run\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236348 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46pbd\" (UniqueName: \"kubernetes.io/projected/000f5549-91dd-4651-b5a0-21769e3982f4-kube-api-access-46pbd\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236408 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9748097b-af4c-40c0-b6c6-261863bca7b4-hosts-file\") pod \"node-resolver-cbmjr\" (UID: \"9748097b-af4c-40c0-b6c6-261863bca7b4\") " pod="openshift-dns/node-resolver-cbmjr" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236457 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-lib-modules\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236549 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkjvc\" (UniqueName: \"kubernetes.io/projected/9748097b-af4c-40c0-b6c6-261863bca7b4-kube-api-access-bkjvc\") pod \"node-resolver-cbmjr\" (UID: \"9748097b-af4c-40c0-b6c6-261863bca7b4\") " pod="openshift-dns/node-resolver-cbmjr" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236554 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-cnibin\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236590 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-sysconfig\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236617 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-sysctl-d\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236556 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-modprobe-d\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236642 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-sys\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236670 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3fa6420e-46d0-4ade-82cb-f5e03e235d26-iptables-alerter-script\") pod \"iptables-alerter-pxs2x\" (UID: \"3fa6420e-46d0-4ade-82cb-f5e03e235d26\") " pod="openshift-network-operator/iptables-alerter-pxs2x" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b39db4d3-2acf-485f-ae1c-871507b98494-tmp\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236699 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-sysconfig\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236719 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-cni-binary-copy\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.237453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236749 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.237453 
ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236751 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-sys\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236800 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-sysctl-d\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkthr\" (UniqueName: \"kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr\") pod \"network-check-target-bbt78\" (UID: \"4ab33527-9aec-4272-9cb2-4f84af38a336\") " pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-systemd-units\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-slash\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236922 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-node-log\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236947 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-run-ovn-kubernetes\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236958 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c0d53844-6ca3-4f97-9404-6cec628fe368-multus-daemon-config\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236948 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c0d53844-6ca3-4f97-9404-6cec628fe368-cni-binary-copy\") pod 
\"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.236985 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-multus-cni-dir\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237038 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-multus-cni-dir\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237044 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-run-k8s-cni-cncf-io\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237064 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-run-ovn\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-kubernetes\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237093 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-run-k8s-cni-cncf-io\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237112 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-host\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237136 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9xz7\" (UniqueName: \"kubernetes.io/projected/c0d53844-6ca3-4f97-9404-6cec628fe368-kube-api-access-p9xz7\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.238304 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237149 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-kubernetes\") pod \"tuned-6hdkr\" (UID: 
\"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237159 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3fa6420e-46d0-4ade-82cb-f5e03e235d26-host-slash\") pod \"iptables-alerter-pxs2x\" (UID: \"3fa6420e-46d0-4ade-82cb-f5e03e235d26\") " pod="openshift-network-operator/iptables-alerter-pxs2x" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237187 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/95c0a6c1-6228-4af3-acf9-47e2e061f7bf-konnectivity-ca\") pod \"konnectivity-agent-8hctm\" (UID: \"95c0a6c1-6228-4af3-acf9-47e2e061f7bf\") " pod="kube-system/konnectivity-agent-8hctm" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237192 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-host\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237215 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4rhl\" (UniqueName: \"kubernetes.io/projected/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-kube-api-access-l4rhl\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237235 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3fa6420e-46d0-4ade-82cb-f5e03e235d26-host-slash\") pod \"iptables-alerter-pxs2x\" (UID: \"3fa6420e-46d0-4ade-82cb-f5e03e235d26\") " pod="openshift-network-operator/iptables-alerter-pxs2x" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237243 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9748097b-af4c-40c0-b6c6-261863bca7b4-tmp-dir\") pod \"node-resolver-cbmjr\" (UID: \"9748097b-af4c-40c0-b6c6-261863bca7b4\") " pod="openshift-dns/node-resolver-cbmjr" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-var-lib-kubelet\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237294 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-etc-selinux\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-os-release\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237343 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237356 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-var-lib-kubelet\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237369 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzrzz\" (UniqueName: \"kubernetes.io/projected/b39db4d3-2acf-485f-ae1c-871507b98494-kube-api-access-lzrzz\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237397 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-etc-selinux\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-multus-socket-dir-parent\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237404 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-os-release\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-run-multus-certs\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.239116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237464 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-multus-socket-dir-parent\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237517 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-system-cni-dir\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237522 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-run-multus-certs\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237519 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9748097b-af4c-40c0-b6c6-261863bca7b4-tmp-dir\") pod \"node-resolver-cbmjr\" (UID: \"9748097b-af4c-40c0-b6c6-261863bca7b4\") " pod="openshift-dns/node-resolver-cbmjr" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237545 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-cnibin\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237574 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1faab54-1fd0-4e7f-8959-5e580bbd833d-ovnkube-config\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237600 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79k77\" (UniqueName: \"kubernetes.io/projected/01d44556-4af6-4271-b348-0e8b2c60961d-kube-api-access-79k77\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237619 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-run-netns\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237639 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1faab54-1fd0-4e7f-8959-5e580bbd833d-ovn-node-metrics-cert\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237743 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237781 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-socket-dir\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237900 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-hostroot\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237922 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45-serviceca\") pod \"node-ca-2h7tr\" (UID: \"1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45\") " pod="openshift-image-registry/node-ca-2h7tr" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237929 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/01d44556-4af6-4271-b348-0e8b2c60961d-socket-dir\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237959 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-sysctl-conf\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237985 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-var-lib-cni-multus\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.239670 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.237989 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-hostroot\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.240167 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.238046 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/c0d53844-6ca3-4f97-9404-6cec628fe368-host-var-lib-cni-multus\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.240167 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.238055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b39db4d3-2acf-485f-ae1c-871507b98494-etc-sysctl-conf\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.240167 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.238940 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b39db4d3-2acf-485f-ae1c-871507b98494-etc-tuned\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.240167 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.239197 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b39db4d3-2acf-485f-ae1c-871507b98494-tmp\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.245242 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.245217 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9xz7\" (UniqueName: \"kubernetes.io/projected/c0d53844-6ca3-4f97-9404-6cec628fe368-kube-api-access-p9xz7\") pod \"multus-fkm8h\" (UID: \"c0d53844-6ca3-4f97-9404-6cec628fe368\") " pod="openshift-multus/multus-fkm8h" Apr 17 16:31:09.245340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.245240 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkjvc\" (UniqueName: \"kubernetes.io/projected/9748097b-af4c-40c0-b6c6-261863bca7b4-kube-api-access-bkjvc\") pod \"node-resolver-cbmjr\" (UID: \"9748097b-af4c-40c0-b6c6-261863bca7b4\") " pod="openshift-dns/node-resolver-cbmjr" Apr 17 16:31:09.245755 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.245733 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d458d\" (UniqueName: \"kubernetes.io/projected/3fa6420e-46d0-4ade-82cb-f5e03e235d26-kube-api-access-d458d\") pod \"iptables-alerter-pxs2x\" (UID: \"3fa6420e-46d0-4ade-82cb-f5e03e235d26\") " pod="openshift-network-operator/iptables-alerter-pxs2x" Apr 17 16:31:09.246452 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.246432 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79k77\" (UniqueName: \"kubernetes.io/projected/01d44556-4af6-4271-b348-0e8b2c60961d-kube-api-access-79k77\") pod \"aws-ebs-csi-driver-node-r5hrb\" (UID: \"01d44556-4af6-4271-b348-0e8b2c60961d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" Apr 17 16:31:09.246733 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.246710 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzrzz\" (UniqueName: \"kubernetes.io/projected/b39db4d3-2acf-485f-ae1c-871507b98494-kube-api-access-lzrzz\") pod \"tuned-6hdkr\" (UID: \"b39db4d3-2acf-485f-ae1c-871507b98494\") " pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" Apr 17 16:31:09.338855 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.338822 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45-serviceca\") pod \"node-ca-2h7tr\" (UID: \"1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45\") " pod="openshift-image-registry/node-ca-2h7tr" Apr 17 16:31:09.338855 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.338859 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.339127 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.338878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.339127 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.338904 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-var-lib-openvswitch\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339127 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.338920 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-run-openvswitch\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339127 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.338958 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-run-openvswitch\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339127 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.338990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45-host\") pod \"node-ca-2h7tr\" (UID: \"1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45\") " pod="openshift-image-registry/node-ca-2h7tr" Apr 17 16:31:09.339127 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339012 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-kubelet\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339127 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339027 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1faab54-1fd0-4e7f-8959-5e580bbd833d-env-overrides\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339127 
ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339032 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.339127 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339044 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-var-lib-openvswitch\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339127 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339044 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq2r8\" (UniqueName: \"kubernetes.io/projected/e1faab54-1fd0-4e7f-8959-5e580bbd833d-kube-api-access-dq2r8\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339127 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-os-release\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.339127 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-run-systemd\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339152 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-etc-openvswitch\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339165 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45-host\") pod \"node-ca-2h7tr\" (UID: \"1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45\") " pod="openshift-image-registry/node-ca-2h7tr" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-cni-bin\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339124 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-kubelet\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339185 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-os-release\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339202 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4v989\" (UniqueName: \"kubernetes.io/projected/1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45-kube-api-access-4v989\") pod \"node-ca-2h7tr\" (UID: \"1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45\") " pod="openshift-image-registry/node-ca-2h7tr" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339236 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-etc-openvswitch\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339244 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-run-systemd\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1faab54-1fd0-4e7f-8959-5e580bbd833d-ovnkube-script-lib\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339277 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-cni-bin\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339283 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339311 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/95c0a6c1-6228-4af3-acf9-47e2e061f7bf-agent-certs\") pod \"konnectivity-agent-8hctm\" (UID: \"95c0a6c1-6228-4af3-acf9-47e2e061f7bf\") " pod="kube-system/konnectivity-agent-8hctm" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339334 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-log-socket\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339336 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45-serviceca\") pod \"node-ca-2h7tr\" (UID: \"1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45\") " pod="openshift-image-registry/node-ca-2h7tr" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339390 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-log-socket\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339390 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.339634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-cni-netd\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339441 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46pbd\" (UniqueName: \"kubernetes.io/projected/000f5549-91dd-4651-b5a0-21769e3982f4-kube-api-access-46pbd\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339449 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-cni-netd\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339490 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-cni-binary-copy\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339510 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.340414 ip-10-0-138-137 
kubenswrapper[2572]: I0417 16:31:09.339520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339533 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkthr\" (UniqueName: \"kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr\") pod \"network-check-target-bbt78\" (UID: \"4ab33527-9aec-4272-9cb2-4f84af38a336\") " pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339552 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1faab54-1fd0-4e7f-8959-5e580bbd833d-env-overrides\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-systemd-units\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339595 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-slash\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339620 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-node-log\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-run-ovn-kubernetes\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339688 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-slash\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339700 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-run-ovn\") pod \"ovnkube-node-gjfdq\" (UID: 
\"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339741 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/95c0a6c1-6228-4af3-acf9-47e2e061f7bf-konnectivity-ca\") pod \"konnectivity-agent-8hctm\" (UID: \"95c0a6c1-6228-4af3-acf9-47e2e061f7bf\") " pod="kube-system/konnectivity-agent-8hctm" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339745 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-node-log\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339772 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4rhl\" (UniqueName: \"kubernetes.io/projected/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-kube-api-access-l4rhl\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.340414 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339780 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-run-ovn-kubernetes\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339792 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-run-ovn\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339806 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339828 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1faab54-1fd0-4e7f-8959-5e580bbd833d-ovnkube-script-lib\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339834 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-system-cni-dir\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339888 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-cnibin\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339900 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-system-cni-dir\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339914 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1faab54-1fd0-4e7f-8959-5e580bbd833d-ovnkube-config\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339940 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-run-netns\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339965 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1faab54-1fd0-4e7f-8959-5e580bbd833d-ovn-node-metrics-cert\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:09.339980 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.339962 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.340006 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-host-run-netns\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.340036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-cni-binary-copy\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:09.340051 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs podName:000f5549-91dd-4651-b5a0-21769e3982f4 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:09.840022254 +0000 UTC m=+3.188651794 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs") pod "network-metrics-daemon-w6ttr" (UID: "000f5549-91dd-4651-b5a0-21769e3982f4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.340052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-cnibin\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.340084 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1faab54-1fd0-4e7f-8959-5e580bbd833d-systemd-units\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.340993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.340296 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/95c0a6c1-6228-4af3-acf9-47e2e061f7bf-konnectivity-ca\") pod \"konnectivity-agent-8hctm\" (UID: \"95c0a6c1-6228-4af3-acf9-47e2e061f7bf\") " pod="kube-system/konnectivity-agent-8hctm" Apr 17 16:31:09.341576 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.340381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1faab54-1fd0-4e7f-8959-5e580bbd833d-ovnkube-config\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.341953 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.341934 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/95c0a6c1-6228-4af3-acf9-47e2e061f7bf-agent-certs\") pod \"konnectivity-agent-8hctm\" (UID: \"95c0a6c1-6228-4af3-acf9-47e2e061f7bf\") " pod="kube-system/konnectivity-agent-8hctm" Apr 17 16:31:09.342105 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.342089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1faab54-1fd0-4e7f-8959-5e580bbd833d-ovn-node-metrics-cert\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:09.348481 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:09.348459 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:09.348481 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:09.348482 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:09.348676 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:09.348496 2572 projected.go:194] 
Apr 17 16:31:09.348676 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:09.348561 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr podName:4ab33527-9aec-4272-9cb2-4f84af38a336 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:09.848546811 +0000 UTC m=+3.197176357 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wkthr" (UniqueName: "kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr") pod "network-check-target-bbt78" (UID: "4ab33527-9aec-4272-9cb2-4f84af38a336") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:09.350764 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.350674 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46pbd\" (UniqueName: \"kubernetes.io/projected/000f5549-91dd-4651-b5a0-21769e3982f4-kube-api-access-46pbd\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr"
Apr 17 16:31:09.350855 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.350827 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v989\" (UniqueName: \"kubernetes.io/projected/1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45-kube-api-access-4v989\") pod \"node-ca-2h7tr\" (UID: \"1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45\") " pod="openshift-image-registry/node-ca-2h7tr"
Apr 17 16:31:09.351023 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.351004 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq2r8\" (UniqueName: \"kubernetes.io/projected/e1faab54-1fd0-4e7f-8959-5e580bbd833d-kube-api-access-dq2r8\") pod \"ovnkube-node-gjfdq\" (UID: \"e1faab54-1fd0-4e7f-8959-5e580bbd833d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq"
Apr 17 16:31:09.351134 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.351054 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4rhl\" (UniqueName: \"kubernetes.io/projected/5cd8687d-ad01-456f-b5f8-9c49b1c2488b-kube-api-access-l4rhl\") pod \"multus-additional-cni-plugins-bdk96\" (UID: \"5cd8687d-ad01-456f-b5f8-9c49b1c2488b\") " pod="openshift-multus/multus-additional-cni-plugins-bdk96"
Apr 17 16:31:09.427912 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.427874 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6hdkr"
Apr 17 16:31:09.435711 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.435690 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb"
Apr 17 16:31:09.444393 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.444371 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fkm8h"
Apr 17 16:31:09.448960 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.448936 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cbmjr"
Apr 17 16:31:09.455440 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.455424 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pxs2x"
Apr 17 16:31:09.462000 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.461984 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq"
Apr 17 16:31:09.468550 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.468526 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8hctm"
Apr 17 16:31:09.477049 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.477029 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2h7tr"
Apr 17 16:31:09.482528 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.482510 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bdk96"
Apr 17 16:31:09.842598 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.842528 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr"
Apr 17 16:31:09.842750 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:09.842680 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:09.842750 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:09.842740 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs podName:000f5549-91dd-4651-b5a0-21769e3982f4 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:10.842725989 +0000 UTC m=+4.191355520 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs") pod "network-metrics-daemon-w6ttr" (UID: "000f5549-91dd-4651-b5a0-21769e3982f4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:09.870570 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:09.870523 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fa6420e_46d0_4ade_82cb_f5e03e235d26.slice/crio-8b90312d807bd548af531bcb3693e4ebb436526fa3a4db8d2a560b54cbb6799b WatchSource:0}: Error finding container 8b90312d807bd548af531bcb3693e4ebb436526fa3a4db8d2a560b54cbb6799b: Status 404 returned error can't find the container with id 8b90312d807bd548af531bcb3693e4ebb436526fa3a4db8d2a560b54cbb6799b
Apr 17 16:31:09.872619 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:09.872586 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0d53844_6ca3_4f97_9404_6cec628fe368.slice/crio-da9cc9157a035a26622a423a3bab9f7378eba8d32e78ff578bac23a9d03b565d WatchSource:0}: Error finding container da9cc9157a035a26622a423a3bab9f7378eba8d32e78ff578bac23a9d03b565d: Status 404 returned error can't find the container with id da9cc9157a035a26622a423a3bab9f7378eba8d32e78ff578bac23a9d03b565d
Apr 17 16:31:09.875461 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:09.875438 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95c0a6c1_6228_4af3_acf9_47e2e061f7bf.slice/crio-92042c5f20bb4a7482d06b26047e5aa8a38c5c008a0e3775268816e013699cd1 WatchSource:0}: Error finding container 92042c5f20bb4a7482d06b26047e5aa8a38c5c008a0e3775268816e013699cd1: Status 404 returned error can't find the container with id 92042c5f20bb4a7482d06b26047e5aa8a38c5c008a0e3775268816e013699cd1
Apr 17 16:31:09.876075 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:09.876049 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb39db4d3_2acf_485f_ae1c_871507b98494.slice/crio-89bc5477d7b6d7b472a35cc0360ddb9a561e8565998b0b0685ef69aa75b12791 WatchSource:0}: Error finding container 89bc5477d7b6d7b472a35cc0360ddb9a561e8565998b0b0685ef69aa75b12791: Status 404 returned error can't find the container with id 89bc5477d7b6d7b472a35cc0360ddb9a561e8565998b0b0685ef69aa75b12791
Apr 17 16:31:09.877002 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:09.876979 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b4367fb_4aa5_4cb9_a2b9_b50cbfeedf45.slice/crio-795530d9c465d3a3c5f235f1d987af01cdaac841669094cbc7d055a9940c7a6c WatchSource:0}: Error finding container 795530d9c465d3a3c5f235f1d987af01cdaac841669094cbc7d055a9940c7a6c: Status 404 returned error can't find the container with id 795530d9c465d3a3c5f235f1d987af01cdaac841669094cbc7d055a9940c7a6c
Apr 17 16:31:09.880356 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:09.880335 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1faab54_1fd0_4e7f_8959_5e580bbd833d.slice/crio-2e295cb161f6983ba1b42e9b031a6e7cdca2789032cc013e20078b23c2a379d7 WatchSource:0}: Error finding container 2e295cb161f6983ba1b42e9b031a6e7cdca2789032cc013e20078b23c2a379d7: Status 404 returned error can't find the container with id 2e295cb161f6983ba1b42e9b031a6e7cdca2789032cc013e20078b23c2a379d7
Apr 17 16:31:09.881022 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:09.881001 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cd8687d_ad01_456f_b5f8_9c49b1c2488b.slice/crio-71c709510b746cb0425c64d891c57f89bdcfc1daa7ee99139c344a11c72316d3 WatchSource:0}: Error finding container 71c709510b746cb0425c64d891c57f89bdcfc1daa7ee99139c344a11c72316d3: Status 404 returned error can't find the container with id 71c709510b746cb0425c64d891c57f89bdcfc1daa7ee99139c344a11c72316d3
Apr 17 16:31:09.881556 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:09.881524 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9748097b_af4c_40c0_b6c6_261863bca7b4.slice/crio-65fc790f9aed415dc2395b974a219ceef8938f6529aabe04bd607b3c5c428752 WatchSource:0}: Error finding container 65fc790f9aed415dc2395b974a219ceef8938f6529aabe04bd607b3c5c428752: Status 404 returned error can't find the container with id 65fc790f9aed415dc2395b974a219ceef8938f6529aabe04bd607b3c5c428752
Apr 17 16:31:09.882403 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:09.882375 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01d44556_4af6_4271_b348_0e8b2c60961d.slice/crio-042fe804cd6e0c35cfbff350fff92a1a37ed20634b66845b9caf62028654c4fa WatchSource:0}: Error finding container 042fe804cd6e0c35cfbff350fff92a1a37ed20634b66845b9caf62028654c4fa: Status 404 returned error can't find the container with id 042fe804cd6e0c35cfbff350fff92a1a37ed20634b66845b9caf62028654c4fa
Apr 17 16:31:09.943581 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:09.943556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkthr\" (UniqueName: \"kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr\") pod \"network-check-target-bbt78\" (UID: \"4ab33527-9aec-4272-9cb2-4f84af38a336\") " pod="openshift-network-diagnostics/network-check-target-bbt78"
Apr 17 16:31:09.943705 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:09.943689 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:09.943705 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:09.943702 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:09.943776 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:09.943711 2572 projected.go:194] Error preparing data for projected volume kube-api-access-wkthr for pod openshift-network-diagnostics/network-check-target-bbt78: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:09.943776 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:09.943756 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr podName:4ab33527-9aec-4272-9cb2-4f84af38a336 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:10.943741589 +0000 UTC m=+4.292371116 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-wkthr" (UniqueName: "kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr") pod "network-check-target-bbt78" (UID: "4ab33527-9aec-4272-9cb2-4f84af38a336") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:10.171805 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:10.171720 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:08 +0000 UTC" deadline="2027-10-08 20:48:42.818476423 +0000 UTC"
Apr 17 16:31:10.171805 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:10.171753 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12940h17m32.646725608s"
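-- Note: the two certificate_manager lines above are self-consistent: from 16:31:10 to the logged deadline (2027-10-08 20:48:42) is 539 days 4h17m32s = 12940h17m32s, which matches the logged sleep. The client-go certificate manager, as I understand it, picks the rotation deadline at a jittered fraction (roughly 70-90%) of the certificate's validity window; the logged deadline lands about 74% into the two-year window, consistent with that rule. A sketch of the computation (notBefore is an assumption here; only the expiration appears in the log): --

    package main

    import (
        "fmt"
        "math/rand"
        "time"
    )

    // rotationDeadline mirrors the jitter rule used by the client-go
    // certificate manager as I recall it: rotate at
    // notBefore + k*(notAfter-notBefore), with k uniform in [0.7, 0.9).
    func rotationDeadline(notBefore, notAfter time.Time) time.Time {
        validity := notAfter.Sub(notBefore)
        k := 0.7 + 0.2*rand.Float64()
        return notBefore.Add(time.Duration(k * float64(validity)))
    }

    func main() {
        // notAfter is the expiration printed in the log; notBefore is an
        // assumption (the cert was presumably issued around this boot).
        notBefore := time.Date(2026, time.April, 17, 16, 26, 8, 0, time.UTC)
        notAfter := time.Date(2028, time.April, 16, 16, 26, 8, 0, time.UTC)
        d := rotationDeadline(notBefore, notAfter)
        k := float64(d.Sub(notBefore)) / float64(notAfter.Sub(notBefore))
        fmt.Printf("deadline=%s (%.0f%% into validity)\n", d, 100*k)
        // The logged deadline, 2027-10-08 20:48:42, sits ~74% in.
    }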
event={"ID":"c0d53844-6ca3-4f97-9404-6cec628fe368","Type":"ContainerStarted","Data":"da9cc9157a035a26622a423a3bab9f7378eba8d32e78ff578bac23a9d03b565d"} Apr 17 16:31:10.238752 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:10.238733 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pxs2x" event={"ID":"3fa6420e-46d0-4ade-82cb-f5e03e235d26","Type":"ContainerStarted","Data":"8b90312d807bd548af531bcb3693e4ebb436526fa3a4db8d2a560b54cbb6799b"} Apr 17 16:31:10.239988 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:10.239969 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-137.ec2.internal" event={"ID":"692377e0184489441f39fe1105bfb2c4","Type":"ContainerStarted","Data":"c5a01a19915cdd42f10b018858630d6b3f7114bae23aaae88c34526c00e9225d"} Apr 17 16:31:10.254234 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:10.254191 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-137.ec2.internal" podStartSLOduration=2.254178106 podStartE2EDuration="2.254178106s" podCreationTimestamp="2026-04-17 16:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:10.254022634 +0000 UTC m=+3.602652165" watchObservedRunningTime="2026-04-17 16:31:10.254178106 +0000 UTC m=+3.602807654" Apr 17 16:31:10.852457 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:10.852365 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:10.852624 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:10.852540 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:10.852624 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:10.852612 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs podName:000f5549-91dd-4651-b5a0-21769e3982f4 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:12.852594488 +0000 UTC m=+6.201224020 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs") pod "network-metrics-daemon-w6ttr" (UID: "000f5549-91dd-4651-b5a0-21769e3982f4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:10.953047 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:10.953005 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkthr\" (UniqueName: \"kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr\") pod \"network-check-target-bbt78\" (UID: \"4ab33527-9aec-4272-9cb2-4f84af38a336\") " pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:10.953247 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:10.953228 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:10.953312 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:10.953255 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:10.953312 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:10.953268 2572 projected.go:194] Error preparing data for projected volume kube-api-access-wkthr for pod openshift-network-diagnostics/network-check-target-bbt78: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:10.953406 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:10.953329 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr podName:4ab33527-9aec-4272-9cb2-4f84af38a336 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:12.953311113 +0000 UTC m=+6.301940646 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wkthr" (UniqueName: "kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr") pod "network-check-target-bbt78" (UID: "4ab33527-9aec-4272-9cb2-4f84af38a336") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:11.224265 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:11.224184 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:11.224712 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:11.224311 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:11.224774 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:11.224727 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:11.224844 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:11.224824 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:11.256525 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:11.255726 2572 generic.go:358] "Generic (PLEG): container finished" podID="6824474d16553c4df67090e9feda5837" containerID="649deea6be9ea205ed958d458cd5c642d510f3fd7d97dcc497111d0b3820bbf8" exitCode=0 Apr 17 16:31:11.256525 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:11.255924 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal" event={"ID":"6824474d16553c4df67090e9feda5837","Type":"ContainerDied","Data":"649deea6be9ea205ed958d458cd5c642d510f3fd7d97dcc497111d0b3820bbf8"} Apr 17 16:31:12.266676 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:12.266261 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal" event={"ID":"6824474d16553c4df67090e9feda5837","Type":"ContainerStarted","Data":"18e26ead21e56c404bd412b427ad2afce39367807228e86cf03dd3f5dc76b5b9"} Apr 17 16:31:12.280490 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:12.280438 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-137.ec2.internal" podStartSLOduration=4.280420447 podStartE2EDuration="4.280420447s" podCreationTimestamp="2026-04-17 16:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:12.280234951 +0000 UTC m=+5.628864501" watchObservedRunningTime="2026-04-17 16:31:12.280420447 +0000 UTC m=+5.629049998" Apr 17 16:31:12.871566 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:12.871529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:12.871781 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:12.871706 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:12.871848 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:12.871789 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs podName:000f5549-91dd-4651-b5a0-21769e3982f4 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:16.871769369 +0000 UTC m=+10.220398909 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs") pod "network-metrics-daemon-w6ttr" (UID: "000f5549-91dd-4651-b5a0-21769e3982f4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:12.972515 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:12.972476 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkthr\" (UniqueName: \"kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr\") pod \"network-check-target-bbt78\" (UID: \"4ab33527-9aec-4272-9cb2-4f84af38a336\") " pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:12.972721 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:12.972704 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:12.972848 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:12.972728 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:12.972848 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:12.972742 2572 projected.go:194] Error preparing data for projected volume kube-api-access-wkthr for pod openshift-network-diagnostics/network-check-target-bbt78: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:12.972848 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:12.972804 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr podName:4ab33527-9aec-4272-9cb2-4f84af38a336 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:16.972786081 +0000 UTC m=+10.321415611 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wkthr" (UniqueName: "kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr") pod "network-check-target-bbt78" (UID: "4ab33527-9aec-4272-9cb2-4f84af38a336") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:13.223733 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:13.223539 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:13.223733 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:13.223685 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:13.225093 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:13.224078 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:13.225093 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:13.224183 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:15.222800 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:15.222758 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:15.223444 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:15.222763 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:15.223444 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:15.222894 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:15.223444 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:15.222987 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:16.902791 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:16.902757 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:16.903244 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:16.902911 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:16.903244 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:16.902963 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs podName:000f5549-91dd-4651-b5a0-21769e3982f4 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:24.902950044 +0000 UTC m=+18.251579570 (durationBeforeRetry 8s). 
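-- Note: the metrics-certs mount has now been retried with durationBeforeRetry 500ms, 1s, 2s, 4s, and 8s: the kubelet doubles the delay on every consecutive failure of the same volume operation. A minimal sketch of that schedule; the 2m2s cap matches the upstream kubelet exponential-backoff helper as I recall it, so treat that constant as an assumption: --

    package main

    import (
        "fmt"
        "time"
    )

    // nextDelay reproduces the doubling visible in the log above:
    // 500ms, 1s, 2s, 4s, 8s, ... The cap is an assumption (see note).
    func nextDelay(last time.Duration) time.Duration {
        const (
            initialDelay = 500 * time.Millisecond
            maxDelay     = 2*time.Minute + 2*time.Second
        )
        if last == 0 {
            return initialDelay
        }
        if next := 2 * last; next < maxDelay {
            return next
        }
        return maxDelay
    }

    func main() {
        var d time.Duration
        for i := 0; i < 6; i++ {
            d = nextDelay(d)
            fmt.Println(d) // 500ms 1s 2s 4s 8s 16s
        }
    }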
Apr 17 16:31:17.003238 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:17.003203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkthr\" (UniqueName: \"kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr\") pod \"network-check-target-bbt78\" (UID: \"4ab33527-9aec-4272-9cb2-4f84af38a336\") " pod="openshift-network-diagnostics/network-check-target-bbt78"
Apr 17 16:31:17.003393 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:17.003330 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:17.003393 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:17.003351 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:17.003393 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:17.003364 2572 projected.go:194] Error preparing data for projected volume kube-api-access-wkthr for pod openshift-network-diagnostics/network-check-target-bbt78: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:17.003493 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:17.003420 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr podName:4ab33527-9aec-4272-9cb2-4f84af38a336 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:25.00340506 +0000 UTC m=+18.352034591 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wkthr" (UniqueName: "kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr") pod "network-check-target-bbt78" (UID: "4ab33527-9aec-4272-9cb2-4f84af38a336") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:17.223209 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:17.223180 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78"
Apr 17 16:31:17.223378 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:17.223272 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336"
Apr 17 16:31:17.223452 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:17.223371 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr"
Apr 17 16:31:17.223522 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:17.223501 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4"
Apr 17 16:31:19.221939 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.221764 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78"
Apr 17 16:31:19.222666 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.221836 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr"
Apr 17 16:31:19.222666 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:19.222014 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336"
Apr 17 16:31:19.222666 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:19.222077 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4"
Apr 17 16:31:19.278877 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.278842 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cbmjr" event={"ID":"9748097b-af4c-40c0-b6c6-261863bca7b4","Type":"ContainerStarted","Data":"c77e273e4e8eebf5d23d24255e4f94cbe24d4b1c2f5011557a35249416361993"}
Apr 17 16:31:19.280069 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.280037 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" event={"ID":"b39db4d3-2acf-485f-ae1c-871507b98494","Type":"ContainerStarted","Data":"ba6739059e2525dd5bb78cbfb6b98067aa5f1deabd2e70d71710bbb26519dd5c"}
Apr 17 16:31:19.281308 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.281284 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2h7tr" event={"ID":"1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45","Type":"ContainerStarted","Data":"2d37d6115a2afdee3b12f620c7523ecdc9e9ba7b4148a31c50a1e3243c444522"}
Apr 17 16:31:19.282745 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.282724 2572 generic.go:358] "Generic (PLEG): container finished" podID="5cd8687d-ad01-456f-b5f8-9c49b1c2488b" containerID="6d2037d314ca9d73c9f4b42a157cecd3ba4c2998d1183eb61b3652ffcd03d0b2" exitCode=0
Apr 17 16:31:19.282831 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.282782 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bdk96" event={"ID":"5cd8687d-ad01-456f-b5f8-9c49b1c2488b","Type":"ContainerDied","Data":"6d2037d314ca9d73c9f4b42a157cecd3ba4c2998d1183eb61b3652ffcd03d0b2"}
Apr 17 16:31:19.284235 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.284211 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" event={"ID":"01d44556-4af6-4271-b348-0e8b2c60961d","Type":"ContainerStarted","Data":"f2ac7c195df9a0745bc3636bf70c0aecb9739b5d568fe75d7a88e920f4a1d7a5"}
Apr 17 16:31:19.286223 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.286044 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8hctm" event={"ID":"95c0a6c1-6228-4af3-acf9-47e2e061f7bf","Type":"ContainerStarted","Data":"fbb255070d08f32f310eee1d1a998c034c0844a720fd79e327d018739900b846"}
Apr 17 16:31:19.292350 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.292299 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cbmjr" podStartSLOduration=3.786715911 podStartE2EDuration="12.29228457s" podCreationTimestamp="2026-04-17 16:31:07 +0000 UTC" firstStartedPulling="2026-04-17 16:31:09.905911111 +0000 UTC m=+3.254540643" lastFinishedPulling="2026-04-17 16:31:18.411479767 +0000 UTC m=+11.760109302" observedRunningTime="2026-04-17 16:31:19.291711001 +0000 UTC m=+12.640340550" watchObservedRunningTime="2026-04-17 16:31:19.29228457 +0000 UTC m=+12.640914123"
Apr 17 16:31:19.316049 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.316006 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8hctm" podStartSLOduration=3.786677719 podStartE2EDuration="12.315965398s" podCreationTimestamp="2026-04-17 16:31:07 +0000 UTC" firstStartedPulling="2026-04-17 16:31:09.877092762 +0000 UTC m=+3.225722292" lastFinishedPulling="2026-04-17 16:31:18.40638043 +0000 UTC m=+11.755009971" observedRunningTime="2026-04-17 16:31:19.304526478 +0000 UTC m=+12.653156029" watchObservedRunningTime="2026-04-17 16:31:19.315965398 +0000 UTC m=+12.664594983"
Apr 17 16:31:19.316190 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.316155 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2h7tr" podStartSLOduration=3.810819624 podStartE2EDuration="12.316150776s" podCreationTimestamp="2026-04-17 16:31:07 +0000 UTC" firstStartedPulling="2026-04-17 16:31:09.906166617 +0000 UTC m=+3.254796149" lastFinishedPulling="2026-04-17 16:31:18.411497758 +0000 UTC m=+11.760127301" observedRunningTime="2026-04-17 16:31:19.315616047 +0000 UTC m=+12.664245598" watchObservedRunningTime="2026-04-17 16:31:19.316150776 +0000 UTC m=+12.664780357"
Apr 17 16:31:19.329942 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.329892 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6hdkr" podStartSLOduration=3.747208138 podStartE2EDuration="12.329877716s" podCreationTimestamp="2026-04-17 16:31:07 +0000 UTC" firstStartedPulling="2026-04-17 16:31:09.878117821 +0000 UTC m=+3.226747351" lastFinishedPulling="2026-04-17 16:31:18.460787387 +0000 UTC m=+11.809416929" observedRunningTime="2026-04-17 16:31:19.328768685 +0000 UTC m=+12.677398236" watchObservedRunningTime="2026-04-17 16:31:19.329877716 +0000 UTC m=+12.678507269"
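-- Note: the pod_startup_latency_tracker entries above appear to relate as follows: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling - firstStartedPulling), so pull time is excluded from the SLO number. Re-doing the arithmetic for node-resolver-cbmjr reproduces the logged values to within print rounding; a worked check: --

    package main

    import (
        "fmt"
        "time"
    )

    func mustParse(s string) time.Time {
        t, err := time.Parse(time.RFC3339Nano, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        // Values copied from the node-resolver-cbmjr entry above.
        created := mustParse("2026-04-17T16:31:07Z")
        firstPull := mustParse("2026-04-17T16:31:09.905911111Z")
        lastPull := mustParse("2026-04-17T16:31:18.411479767Z")
        running := mustParse("2026-04-17T16:31:19.29228457Z") // watchObservedRunningTime

        e2e := running.Sub(created)          // podStartE2EDuration: 12.29228457s
        slo := e2e - lastPull.Sub(firstPull) // pull window excluded: ~3.786715914s
        fmt.Println(e2e, slo)                // log shows 12.29228457s / 3.786715911
    }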
pod="kube-system/global-pull-secret-syncer-wszrw" podUID="385f5d13-97af-4215-9e30-c75e4ad792b1" Apr 17 16:31:19.625859 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.625559 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:19.625859 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.625626 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/385f5d13-97af-4215-9e30-c75e4ad792b1-kubelet-config\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:19.625859 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.625664 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/385f5d13-97af-4215-9e30-c75e4ad792b1-dbus\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:19.726916 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.726885 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/385f5d13-97af-4215-9e30-c75e4ad792b1-kubelet-config\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:19.727095 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.726925 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/385f5d13-97af-4215-9e30-c75e4ad792b1-dbus\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:19.727095 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.726982 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:19.727095 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.727001 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/385f5d13-97af-4215-9e30-c75e4ad792b1-kubelet-config\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:19.727095 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:19.727085 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:19.727254 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:19.727138 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret podName:385f5d13-97af-4215-9e30-c75e4ad792b1 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:20.227121639 +0000 UTC m=+13.575751170 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret") pod "global-pull-secret-syncer-wszrw" (UID: "385f5d13-97af-4215-9e30-c75e4ad792b1") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:19.727254 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:19.727081 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/385f5d13-97af-4215-9e30-c75e4ad792b1-dbus\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:20.229181 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:20.229145 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8hctm" Apr 17 16:31:20.229869 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:20.229850 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8hctm" Apr 17 16:31:20.230005 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:20.229950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:20.230106 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:20.230086 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:20.230151 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:20.230142 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret podName:385f5d13-97af-4215-9e30-c75e4ad792b1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:21.230123818 +0000 UTC m=+14.578753345 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret") pod "global-pull-secret-syncer-wszrw" (UID: "385f5d13-97af-4215-9e30-c75e4ad792b1") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:20.290576 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:20.290542 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pxs2x" event={"ID":"3fa6420e-46d0-4ade-82cb-f5e03e235d26","Type":"ContainerStarted","Data":"e80ff6ebaaebccd3c6f6148ee4ff7af892cdd83bfe7dbe6ebe2a9dcf71a934d0"} Apr 17 16:31:20.307035 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:20.306974 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-pxs2x" podStartSLOduration=4.743314161 podStartE2EDuration="13.306955925s" podCreationTimestamp="2026-04-17 16:31:07 +0000 UTC" firstStartedPulling="2026-04-17 16:31:09.873117411 +0000 UTC m=+3.221746941" lastFinishedPulling="2026-04-17 16:31:18.436759172 +0000 UTC m=+11.785388705" observedRunningTime="2026-04-17 16:31:20.30634348 +0000 UTC m=+13.654973030" watchObservedRunningTime="2026-04-17 16:31:20.306955925 +0000 UTC m=+13.655585476" Apr 17 16:31:21.222150 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:21.221930 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:21.222331 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:21.222257 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:21.222331 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:21.222119 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:21.222447 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:21.222106 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:21.222502 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:21.222457 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wszrw" podUID="385f5d13-97af-4215-9e30-c75e4ad792b1" Apr 17 16:31:21.222502 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:21.222363 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:21.237620 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:21.237592 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:21.238058 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:21.237735 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:21.238058 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:21.237789 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret podName:385f5d13-97af-4215-9e30-c75e4ad792b1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:23.237772617 +0000 UTC m=+16.586402146 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret") pod "global-pull-secret-syncer-wszrw" (UID: "385f5d13-97af-4215-9e30-c75e4ad792b1") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:23.222252 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:23.222206 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:23.222252 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:23.222249 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:23.222795 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:23.222286 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:23.222795 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:23.222408 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:23.222795 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:23.222442 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wszrw" podUID="385f5d13-97af-4215-9e30-c75e4ad792b1" Apr 17 16:31:23.222795 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:23.222579 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:23.250827 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:23.250800 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:23.250979 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:23.250918 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:23.250979 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:23.250976 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret podName:385f5d13-97af-4215-9e30-c75e4ad792b1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:27.250958758 +0000 UTC m=+20.599588285 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret") pod "global-pull-secret-syncer-wszrw" (UID: "385f5d13-97af-4215-9e30-c75e4ad792b1") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:24.960582 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:24.960545 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:24.961067 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:24.960730 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:24.961067 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:24.960794 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs podName:000f5549-91dd-4651-b5a0-21769e3982f4 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:40.96077834 +0000 UTC m=+34.309407871 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs") pod "network-metrics-daemon-w6ttr" (UID: "000f5549-91dd-4651-b5a0-21769e3982f4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:25.061785 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:25.061751 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkthr\" (UniqueName: \"kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr\") pod \"network-check-target-bbt78\" (UID: \"4ab33527-9aec-4272-9cb2-4f84af38a336\") " pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:25.061943 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:25.061902 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:25.061943 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:25.061918 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:25.061943 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:25.061927 2572 projected.go:194] Error preparing data for projected volume kube-api-access-wkthr for pod openshift-network-diagnostics/network-check-target-bbt78: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:25.062078 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:25.061973 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr podName:4ab33527-9aec-4272-9cb2-4f84af38a336 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:41.061959688 +0000 UTC m=+34.410589216 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-wkthr" (UniqueName: "kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr") pod "network-check-target-bbt78" (UID: "4ab33527-9aec-4272-9cb2-4f84af38a336") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:25.222212 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:25.222124 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:25.222361 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:25.222128 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:25.222361 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:25.222262 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:25.222445 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:25.222356 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:25.222445 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:25.222139 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:25.222505 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:25.222457 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wszrw" podUID="385f5d13-97af-4215-9e30-c75e4ad792b1" Apr 17 16:31:27.223600 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:27.223560 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:27.224241 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:27.223693 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wszrw" podUID="385f5d13-97af-4215-9e30-c75e4ad792b1" Apr 17 16:31:27.224241 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:27.223769 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:27.224241 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:27.223813 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:27.224241 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:27.223909 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:27.224241 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:27.223984 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:27.278899 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:27.278860 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:27.279092 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:27.279015 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:27.279092 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:27.279085 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret podName:385f5d13-97af-4215-9e30-c75e4ad792b1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:35.279067245 +0000 UTC m=+28.627696775 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret") pod "global-pull-secret-syncer-wszrw" (UID: "385f5d13-97af-4215-9e30-c75e4ad792b1") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:29.222093 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:29.222064 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:29.222643 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:29.222074 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:29.222643 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:29.222192 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:29.222643 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:29.222294 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:29.222643 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:29.222332 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:29.222643 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:29.222402 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wszrw" podUID="385f5d13-97af-4215-9e30-c75e4ad792b1" Apr 17 16:31:30.601336 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:30.601087 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 16:31:31.198610 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.198528 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:31:30.601277166Z","UUID":"2981d150-8bf4-415c-be19-908e5ada2eeb","Handler":null,"Name":"","Endpoint":""} Apr 17 16:31:31.201183 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.201161 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 16:31:31.201292 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.201192 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 16:31:31.221989 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.221965 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:31.221989 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.221979 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:31.222188 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.221965 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:31.222188 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:31.222063 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:31.222188 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:31.222128 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wszrw" podUID="385f5d13-97af-4215-9e30-c75e4ad792b1" Apr 17 16:31:31.222355 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:31.222205 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:31.310606 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.310542 2572 generic.go:358] "Generic (PLEG): container finished" podID="5cd8687d-ad01-456f-b5f8-9c49b1c2488b" containerID="3a57d80ccb8441989b7f10c469d9de96e4dec26ed9965e2a03c309094eeef6e6" exitCode=0 Apr 17 16:31:31.310606 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.310590 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bdk96" event={"ID":"5cd8687d-ad01-456f-b5f8-9c49b1c2488b","Type":"ContainerDied","Data":"3a57d80ccb8441989b7f10c469d9de96e4dec26ed9965e2a03c309094eeef6e6"} Apr 17 16:31:31.312545 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.312519 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" event={"ID":"01d44556-4af6-4271-b348-0e8b2c60961d","Type":"ContainerStarted","Data":"e5787e28cefaea8a30e71765594b2b03211ec24e2d6f166dec17b80cd0f01a07"} Apr 17 16:31:31.315539 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.315516 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" event={"ID":"e1faab54-1fd0-4e7f-8959-5e580bbd833d","Type":"ContainerStarted","Data":"52454424bb4666ea00ce5c0311c80097e2ee1f93915c24f6b0d97a59468be136"} Apr 17 16:31:31.315662 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.315549 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" event={"ID":"e1faab54-1fd0-4e7f-8959-5e580bbd833d","Type":"ContainerStarted","Data":"7edc6f0cde5cdb383a88eeb388cc17e624e9f1a9c5eff10ccc9eaeedcb7c8dab"} Apr 17 16:31:31.315662 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.315561 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" event={"ID":"e1faab54-1fd0-4e7f-8959-5e580bbd833d","Type":"ContainerStarted","Data":"2710cae20c627de874da78d72ab4227c6d88854837e507866ed7353b8cbb3bc7"} Apr 17 16:31:31.315662 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.315574 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" event={"ID":"e1faab54-1fd0-4e7f-8959-5e580bbd833d","Type":"ContainerStarted","Data":"9585e6f0edbd30d3e546cccf94107d66b1b8c74427c4d7a5437f2417f4c1924f"} Apr 17 16:31:31.315662 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.315586 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" event={"ID":"e1faab54-1fd0-4e7f-8959-5e580bbd833d","Type":"ContainerStarted","Data":"4bc14282298b41acf542d1c441968a174e2f29f50a7bb486ca5e2867cd4baab3"} Apr 17 16:31:31.315662 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.315598 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" event={"ID":"e1faab54-1fd0-4e7f-8959-5e580bbd833d","Type":"ContainerStarted","Data":"ebffe4b6151b4dd778aba5a54c173cb865ec23b7f2913443e6d3bcdd2f09f236"} Apr 17 16:31:31.317007 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.316981 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fkm8h" event={"ID":"c0d53844-6ca3-4f97-9404-6cec628fe368","Type":"ContainerStarted","Data":"d81c685339f3f279935cb6b389e295403807c566388a39409a95bd78838020d0"} Apr 17 16:31:31.356667 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:31.356598 2572 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-multus/multus-fkm8h" podStartSLOduration=3.798792905 podStartE2EDuration="24.356579348s" podCreationTimestamp="2026-04-17 16:31:07 +0000 UTC" firstStartedPulling="2026-04-17 16:31:09.875530404 +0000 UTC m=+3.224159936" lastFinishedPulling="2026-04-17 16:31:30.433316849 +0000 UTC m=+23.781946379" observedRunningTime="2026-04-17 16:31:31.356374068 +0000 UTC m=+24.705003618" watchObservedRunningTime="2026-04-17 16:31:31.356579348 +0000 UTC m=+24.705208898" Apr 17 16:31:32.322786 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:32.322746 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" event={"ID":"01d44556-4af6-4271-b348-0e8b2c60961d","Type":"ContainerStarted","Data":"5d3fa4d906e3142c817c3815ba1060cf5783e5dc28bb158da9e0fe09c2f9e228"} Apr 17 16:31:32.340994 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:32.340950 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5hrb" podStartSLOduration=3.500398963 podStartE2EDuration="25.340931173s" podCreationTimestamp="2026-04-17 16:31:07 +0000 UTC" firstStartedPulling="2026-04-17 16:31:09.905792117 +0000 UTC m=+3.254421650" lastFinishedPulling="2026-04-17 16:31:31.746324333 +0000 UTC m=+25.094953860" observedRunningTime="2026-04-17 16:31:32.340612866 +0000 UTC m=+25.689242406" watchObservedRunningTime="2026-04-17 16:31:32.340931173 +0000 UTC m=+25.689560721" Apr 17 16:31:33.222146 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:33.222117 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:33.222295 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:33.222115 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:33.222295 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:33.222225 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:33.222295 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:33.222116 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:33.222295 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:33.222284 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:33.222473 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:33.222371 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wszrw" podUID="385f5d13-97af-4215-9e30-c75e4ad792b1" Apr 17 16:31:33.325962 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:33.325893 2572 generic.go:358] "Generic (PLEG): container finished" podID="5cd8687d-ad01-456f-b5f8-9c49b1c2488b" containerID="555f0ef129455c8e16bed468e7859f783eb2e24705622613b2be5b0fc69a101c" exitCode=0 Apr 17 16:31:33.325962 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:33.325918 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bdk96" event={"ID":"5cd8687d-ad01-456f-b5f8-9c49b1c2488b","Type":"ContainerDied","Data":"555f0ef129455c8e16bed468e7859f783eb2e24705622613b2be5b0fc69a101c"} Apr 17 16:31:33.328809 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:33.328788 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" event={"ID":"e1faab54-1fd0-4e7f-8959-5e580bbd833d","Type":"ContainerStarted","Data":"a6a245d31b92b348c4f1ea8ce414fa0d8b60caf2963f513b8cf74e67ce9a4bb8"} Apr 17 16:31:35.222389 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:35.222358 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:35.222753 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:35.222490 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:35.222753 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:35.222513 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:35.222753 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:35.222501 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wszrw" podUID="385f5d13-97af-4215-9e30-c75e4ad792b1" Apr 17 16:31:35.222753 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:35.222599 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:35.222753 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:35.222719 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:35.334786 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:35.334751 2572 generic.go:358] "Generic (PLEG): container finished" podID="5cd8687d-ad01-456f-b5f8-9c49b1c2488b" containerID="74f04399c791bca883e679d75b928e82850e56df3e4d050bfc5324ee5f01ba1b" exitCode=0 Apr 17 16:31:35.334961 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:35.334840 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bdk96" event={"ID":"5cd8687d-ad01-456f-b5f8-9c49b1c2488b","Type":"ContainerDied","Data":"74f04399c791bca883e679d75b928e82850e56df3e4d050bfc5324ee5f01ba1b"} Apr 17 16:31:35.338180 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:35.338156 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" event={"ID":"e1faab54-1fd0-4e7f-8959-5e580bbd833d","Type":"ContainerStarted","Data":"450e97e009bd35deb80454e995ea91710b021bf89c11f6977a73370616bfe10e"} Apr 17 16:31:35.338578 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:35.338557 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:35.338695 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:35.338586 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:35.338695 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:35.338599 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:35.339748 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:35.339717 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:35.339931 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:35.339912 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:35.339994 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:35.339983 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret podName:385f5d13-97af-4215-9e30-c75e4ad792b1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:51.33996385 +0000 UTC m=+44.688593394 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret") pod "global-pull-secret-syncer-wszrw" (UID: "385f5d13-97af-4215-9e30-c75e4ad792b1") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:35.353027 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:35.353000 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:35.353178 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:35.353161 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:31:35.417711 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:35.417610 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" podStartSLOduration=7.864986205 podStartE2EDuration="28.417594908s" podCreationTimestamp="2026-04-17 16:31:07 +0000 UTC" firstStartedPulling="2026-04-17 16:31:09.905791556 +0000 UTC m=+3.254421096" lastFinishedPulling="2026-04-17 16:31:30.458400264 +0000 UTC m=+23.807029799" observedRunningTime="2026-04-17 16:31:35.417418366 +0000 UTC m=+28.766047933" watchObservedRunningTime="2026-04-17 16:31:35.417594908 +0000 UTC m=+28.766224456" Apr 17 16:31:37.124874 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:37.124841 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wszrw"] Apr 17 16:31:37.125412 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:37.124964 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:37.125412 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:37.125042 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wszrw" podUID="385f5d13-97af-4215-9e30-c75e4ad792b1" Apr 17 16:31:37.126492 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:37.126469 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-w6ttr"] Apr 17 16:31:37.126613 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:37.126558 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:37.126670 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:37.126628 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:37.128886 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:37.128677 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bbt78"] Apr 17 16:31:37.128886 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:37.128769 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:37.128886 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:37.128849 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:39.222581 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:39.222554 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:39.222942 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:39.222554 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:39.222942 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:39.222675 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:39.222942 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:39.222557 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:39.222942 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:39.222787 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:39.222942 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:39.222896 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wszrw" podUID="385f5d13-97af-4215-9e30-c75e4ad792b1" Apr 17 16:31:40.985876 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:40.985634 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:40.986339 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:40.985795 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:40.986339 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:40.985987 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs podName:000f5549-91dd-4651-b5a0-21769e3982f4 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:12.985964264 +0000 UTC m=+66.334593792 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs") pod "network-metrics-daemon-w6ttr" (UID: "000f5549-91dd-4651-b5a0-21769e3982f4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:41.086867 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:41.086830 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkthr\" (UniqueName: \"kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr\") pod \"network-check-target-bbt78\" (UID: \"4ab33527-9aec-4272-9cb2-4f84af38a336\") " pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:41.087028 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:41.087011 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:41.087075 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:41.087038 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:41.087075 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:41.087052 2572 projected.go:194] Error preparing data for projected volume kube-api-access-wkthr for pod openshift-network-diagnostics/network-check-target-bbt78: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:41.087160 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:41.087114 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr podName:4ab33527-9aec-4272-9cb2-4f84af38a336 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:13.087095738 +0000 UTC m=+66.435725281 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wkthr" (UniqueName: "kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr") pod "network-check-target-bbt78" (UID: "4ab33527-9aec-4272-9cb2-4f84af38a336") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:41.222668 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:41.222619 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:31:41.222816 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:41.222619 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:31:41.222816 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:41.222771 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4" Apr 17 16:31:41.222816 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:41.222619 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:41.222954 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:41.222824 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bbt78" podUID="4ab33527-9aec-4272-9cb2-4f84af38a336" Apr 17 16:31:41.222954 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:41.222922 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wszrw" podUID="385f5d13-97af-4215-9e30-c75e4ad792b1" Apr 17 16:31:42.965362 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:42.965340 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-137.ec2.internal" event="NodeReady" Apr 17 16:31:42.965895 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:42.965451 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 16:31:42.998700 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:42.998673 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5f7fcd7469-99n2d"] Apr 17 16:31:43.021373 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.021344 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f7fcd7469-99n2d"] Apr 17 16:31:43.021373 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.021374 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lg6l4"] Apr 17 16:31:43.021559 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.021494 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.023791 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.023772 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 16:31:43.023917 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.023789 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-zn7wr\"" Apr 17 16:31:43.023960 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.023920 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 16:31:43.024373 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.024357 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 16:31:43.033340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.033316 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 16:31:43.037529 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.037509 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p5rm4"] Apr 17 16:31:43.037683 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.037667 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lg6l4" Apr 17 16:31:43.039772 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.039755 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 16:31:43.039884 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.039805 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 16:31:43.039884 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.039824 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dbwrt\"" Apr 17 16:31:43.060250 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.060230 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p5rm4"] Apr 17 16:31:43.060330 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.060300 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lg6l4"] Apr 17 16:31:43.060372 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.060349 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p5rm4" Apr 17 16:31:43.062570 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.062543 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 16:31:43.062691 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.062641 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 16:31:43.062767 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.062690 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 16:31:43.062810 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.062783 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ttskh\"" Apr 17 16:31:43.102665 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.102623 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz556\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-kube-api-access-qz556\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.102775 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.102680 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-trusted-ca\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.102775 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.102699 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-bound-sa-token\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.102775 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.102767 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-ca-trust-extracted\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.102871 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.102782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-installation-pull-secrets\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.102871 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.102811 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-image-registry-private-configuration\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.102871 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.102830 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.102871 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.102845 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-certificates\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.204065 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.204038 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-trusted-ca\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.204189 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.204074 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert\") pod \"ingress-canary-p5rm4\" (UID: \"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9\") " pod="openshift-ingress-canary/ingress-canary-p5rm4" Apr 17 16:31:43.204189 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.204094 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-bound-sa-token\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.204189 ip-10-0-138-137 kubenswrapper[2572]: I0417 
16:31:43.204114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cspk2\" (UniqueName: \"kubernetes.io/projected/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-kube-api-access-cspk2\") pod \"ingress-canary-p5rm4\" (UID: \"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9\") " pod="openshift-ingress-canary/ingress-canary-p5rm4" Apr 17 16:31:43.204284 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.204233 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-ca-trust-extracted\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.204284 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.204269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-installation-pull-secrets\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.204346 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.204299 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-image-registry-private-configuration\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.204346 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.204331 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b139190-3fc0-4860-9067-f972c93db541-tmp-dir\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4" Apr 17 16:31:43.204419 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.204357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swpsn\" (UniqueName: \"kubernetes.io/projected/5b139190-3fc0-4860-9067-f972c93db541-kube-api-access-swpsn\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4" Apr 17 16:31:43.204419 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.204390 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.204538 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.204515 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-certificates\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.204599 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:43.204533 2572 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:43.204599 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:43.204552 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7fcd7469-99n2d: secret "image-registry-tls" not found Apr 17 16:31:43.204599 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.204579 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4" Apr 17 16:31:43.204772 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:43.204626 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls podName:1fc318ea-c2d5-4ad6-ad45-b06f054ca89d nodeName:}" failed. No retries permitted until 2026-04-17 16:31:43.704603385 +0000 UTC m=+37.053232917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls") pod "image-registry-5f7fcd7469-99n2d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d") : secret "image-registry-tls" not found Apr 17 16:31:43.204772 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.204706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qz556\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-kube-api-access-qz556\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.204772 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.204714 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-ca-trust-extracted\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.205006 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.204985 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b139190-3fc0-4860-9067-f972c93db541-config-volume\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4" Apr 17 16:31:43.205065 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.205035 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-certificates\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.205458 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.205439 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-trusted-ca\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 
Apr 17 16:31:43.208454 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.208432 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-image-registry-private-configuration\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d"
Apr 17 16:31:43.208567 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.208432 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-installation-pull-secrets\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d"
Apr 17 16:31:43.219696 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.219620 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-bound-sa-token\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d"
Apr 17 16:31:43.219810 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.219704 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz556\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-kube-api-access-qz556\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d"
Apr 17 16:31:43.222562 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.222542 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wszrw"
Apr 17 16:31:43.222695 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.222570 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr"
Apr 17 16:31:43.222695 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.222546 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78"
Apr 17 16:31:43.225326 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.225311 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 16:31:43.227465 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.227450 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-26v9q\""
Apr 17 16:31:43.227749 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.227737 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 16:31:43.228093 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.228071 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 16:31:43.228142 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.228113 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 16:31:43.228142 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.228137 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xlgkd\""
Apr 17 16:31:43.305895 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.305858 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cspk2\" (UniqueName: \"kubernetes.io/projected/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-kube-api-access-cspk2\") pod \"ingress-canary-p5rm4\" (UID: \"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9\") " pod="openshift-ingress-canary/ingress-canary-p5rm4"
Apr 17 16:31:43.306048 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.305918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b139190-3fc0-4860-9067-f972c93db541-tmp-dir\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4"
Apr 17 16:31:43.306048 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.305935 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swpsn\" (UniqueName: \"kubernetes.io/projected/5b139190-3fc0-4860-9067-f972c93db541-kube-api-access-swpsn\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4"
Apr 17 16:31:43.306048 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.305966 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4"
Apr 17 16:31:43.306048 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.305986 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b139190-3fc0-4860-9067-f972c93db541-config-volume\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4"
Apr 17 16:31:43.306048 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.306014 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert\") pod \"ingress-canary-p5rm4\" (UID: \"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9\") " pod="openshift-ingress-canary/ingress-canary-p5rm4"
Apr 17 16:31:43.306284 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:43.306089 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:31:43.306284 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:43.306100 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:31:43.306284 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:43.306154 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert podName:bf83b7da-8d7a-47cb-873b-aa2f7b647ff9 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:43.806139468 +0000 UTC m=+37.154768995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert") pod "ingress-canary-p5rm4" (UID: "bf83b7da-8d7a-47cb-873b-aa2f7b647ff9") : secret "canary-serving-cert" not found
Apr 17 16:31:43.306284 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:43.306168 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls podName:5b139190-3fc0-4860-9067-f972c93db541 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:43.806162171 +0000 UTC m=+37.154791698 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls") pod "dns-default-lg6l4" (UID: "5b139190-3fc0-4860-9067-f972c93db541") : secret "dns-default-metrics-tls" not found
Apr 17 16:31:43.306492 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.306302 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b139190-3fc0-4860-9067-f972c93db541-tmp-dir\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4"
Apr 17 16:31:43.306613 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.306594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b139190-3fc0-4860-9067-f972c93db541-config-volume\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4"
Apr 17 16:31:43.322006 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.321985 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swpsn\" (UniqueName: \"kubernetes.io/projected/5b139190-3fc0-4860-9067-f972c93db541-kube-api-access-swpsn\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4"
Apr 17 16:31:43.323621 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.323604 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cspk2\" (UniqueName: \"kubernetes.io/projected/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-kube-api-access-cspk2\") pod \"ingress-canary-p5rm4\" (UID: \"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9\") " pod="openshift-ingress-canary/ingress-canary-p5rm4"
"Generic (PLEG): container finished" podID="5cd8687d-ad01-456f-b5f8-9c49b1c2488b" containerID="4684a4213129f9a2e8b68688d7ba0d703431b77a77543bab6c065af5b61051bb" exitCode=0 Apr 17 16:31:43.356442 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.356352 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bdk96" event={"ID":"5cd8687d-ad01-456f-b5f8-9c49b1c2488b","Type":"ContainerDied","Data":"4684a4213129f9a2e8b68688d7ba0d703431b77a77543bab6c065af5b61051bb"} Apr 17 16:31:43.709133 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.709097 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:43.709303 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:43.709240 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:43.709303 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:43.709261 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7fcd7469-99n2d: secret "image-registry-tls" not found Apr 17 16:31:43.709386 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:43.709323 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls podName:1fc318ea-c2d5-4ad6-ad45-b06f054ca89d nodeName:}" failed. No retries permitted until 2026-04-17 16:31:44.70930949 +0000 UTC m=+38.057939020 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls") pod "image-registry-5f7fcd7469-99n2d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d") : secret "image-registry-tls" not found Apr 17 16:31:43.810416 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.810328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4" Apr 17 16:31:43.810416 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:43.810392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert\") pod \"ingress-canary-p5rm4\" (UID: \"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9\") " pod="openshift-ingress-canary/ingress-canary-p5rm4" Apr 17 16:31:43.810584 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:43.810469 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:43.810584 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:43.810529 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls podName:5b139190-3fc0-4860-9067-f972c93db541 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:44.810514934 +0000 UTC m=+38.159144462 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls") pod "dns-default-lg6l4" (UID: "5b139190-3fc0-4860-9067-f972c93db541") : secret "dns-default-metrics-tls" not found Apr 17 16:31:43.810584 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:43.810528 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:43.810584 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:43.810576 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert podName:bf83b7da-8d7a-47cb-873b-aa2f7b647ff9 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:44.81056419 +0000 UTC m=+38.159193716 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert") pod "ingress-canary-p5rm4" (UID: "bf83b7da-8d7a-47cb-873b-aa2f7b647ff9") : secret "canary-serving-cert" not found Apr 17 16:31:44.360804 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:44.360771 2572 generic.go:358] "Generic (PLEG): container finished" podID="5cd8687d-ad01-456f-b5f8-9c49b1c2488b" containerID="4c2dd584e6fe43ed3d3e21dbb58c445bfd9f552a70543467587f0000bb3fff2d" exitCode=0 Apr 17 16:31:44.361251 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:44.360829 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bdk96" event={"ID":"5cd8687d-ad01-456f-b5f8-9c49b1c2488b","Type":"ContainerDied","Data":"4c2dd584e6fe43ed3d3e21dbb58c445bfd9f552a70543467587f0000bb3fff2d"} Apr 17 16:31:44.718028 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:44.717997 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:44.718177 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:44.718151 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:44.718177 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:44.718172 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7fcd7469-99n2d: secret "image-registry-tls" not found Apr 17 16:31:44.718247 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:44.718229 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls podName:1fc318ea-c2d5-4ad6-ad45-b06f054ca89d nodeName:}" failed. No retries permitted until 2026-04-17 16:31:46.718212046 +0000 UTC m=+40.066841573 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls") pod "image-registry-5f7fcd7469-99n2d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d") : secret "image-registry-tls" not found Apr 17 16:31:44.818571 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:44.818545 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4" Apr 17 16:31:44.818734 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:44.818588 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert\") pod \"ingress-canary-p5rm4\" (UID: \"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9\") " pod="openshift-ingress-canary/ingress-canary-p5rm4" Apr 17 16:31:44.818734 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:44.818714 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:44.818802 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:44.818774 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls podName:5b139190-3fc0-4860-9067-f972c93db541 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:46.81876025 +0000 UTC m=+40.167389787 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls") pod "dns-default-lg6l4" (UID: "5b139190-3fc0-4860-9067-f972c93db541") : secret "dns-default-metrics-tls" not found Apr 17 16:31:44.818842 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:44.818723 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:44.818874 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:44.818855 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert podName:bf83b7da-8d7a-47cb-873b-aa2f7b647ff9 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:46.818840911 +0000 UTC m=+40.167470463 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert") pod "ingress-canary-p5rm4" (UID: "bf83b7da-8d7a-47cb-873b-aa2f7b647ff9") : secret "canary-serving-cert" not found Apr 17 16:31:45.366819 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:45.366782 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bdk96" event={"ID":"5cd8687d-ad01-456f-b5f8-9c49b1c2488b","Type":"ContainerStarted","Data":"76873e0de5d427700515b3c35817f13640f1e4daa103478d296e0dcf95bae06a"} Apr 17 16:31:45.397721 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:45.397673 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bdk96" podStartSLOduration=5.86194183 podStartE2EDuration="38.39764197s" podCreationTimestamp="2026-04-17 16:31:07 +0000 UTC" firstStartedPulling="2026-04-17 16:31:09.90582079 +0000 UTC m=+3.254450324" lastFinishedPulling="2026-04-17 16:31:42.441520938 +0000 UTC m=+35.790150464" observedRunningTime="2026-04-17 16:31:45.397374623 +0000 UTC m=+38.746004171" watchObservedRunningTime="2026-04-17 16:31:45.39764197 +0000 UTC m=+38.746271518" Apr 17 16:31:46.732528 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:46.732363 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:46.732876 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:46.732509 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:46.732876 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:46.732567 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7fcd7469-99n2d: secret "image-registry-tls" not found Apr 17 16:31:46.732876 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:46.732617 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls podName:1fc318ea-c2d5-4ad6-ad45-b06f054ca89d nodeName:}" failed. No retries permitted until 2026-04-17 16:31:50.732605086 +0000 UTC m=+44.081234626 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls") pod "image-registry-5f7fcd7469-99n2d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d") : secret "image-registry-tls" not found Apr 17 16:31:46.833718 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:46.833688 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4" Apr 17 16:31:46.833858 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:46.833738 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert\") pod \"ingress-canary-p5rm4\" (UID: \"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9\") " pod="openshift-ingress-canary/ingress-canary-p5rm4" Apr 17 16:31:46.833858 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:46.833816 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:46.833858 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:46.833819 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:46.833976 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:46.833862 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert podName:bf83b7da-8d7a-47cb-873b-aa2f7b647ff9 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:50.833849712 +0000 UTC m=+44.182479238 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert") pod "ingress-canary-p5rm4" (UID: "bf83b7da-8d7a-47cb-873b-aa2f7b647ff9") : secret "canary-serving-cert" not found Apr 17 16:31:46.833976 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:46.833873 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls podName:5b139190-3fc0-4860-9067-f972c93db541 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:50.833868233 +0000 UTC m=+44.182497760 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls") pod "dns-default-lg6l4" (UID: "5b139190-3fc0-4860-9067-f972c93db541") : secret "dns-default-metrics-tls" not found Apr 17 16:31:50.763524 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:50.763480 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:50.763991 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:50.763678 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:50.763991 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:50.763709 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7fcd7469-99n2d: secret "image-registry-tls" not found Apr 17 16:31:50.763991 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:50.763779 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls podName:1fc318ea-c2d5-4ad6-ad45-b06f054ca89d nodeName:}" failed. No retries permitted until 2026-04-17 16:31:58.76375834 +0000 UTC m=+52.112387891 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls") pod "image-registry-5f7fcd7469-99n2d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d") : secret "image-registry-tls" not found Apr 17 16:31:50.863970 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:50.863929 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4" Apr 17 16:31:50.864101 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:50.863987 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert\") pod \"ingress-canary-p5rm4\" (UID: \"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9\") " pod="openshift-ingress-canary/ingress-canary-p5rm4" Apr 17 16:31:50.864152 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:50.864091 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:50.864152 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:50.864111 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:50.864218 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:50.864158 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls podName:5b139190-3fc0-4860-9067-f972c93db541 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:58.864140183 +0000 UTC m=+52.212769715 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls") pod "dns-default-lg6l4" (UID: "5b139190-3fc0-4860-9067-f972c93db541") : secret "dns-default-metrics-tls" not found Apr 17 16:31:50.864218 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:50.864173 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert podName:bf83b7da-8d7a-47cb-873b-aa2f7b647ff9 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:58.864166618 +0000 UTC m=+52.212796144 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert") pod "ingress-canary-p5rm4" (UID: "bf83b7da-8d7a-47cb-873b-aa2f7b647ff9") : secret "canary-serving-cert" not found Apr 17 16:31:51.368503 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:51.368462 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:51.371741 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:51.371720 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/385f5d13-97af-4215-9e30-c75e4ad792b1-original-pull-secret\") pod \"global-pull-secret-syncer-wszrw\" (UID: \"385f5d13-97af-4215-9e30-c75e4ad792b1\") " pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:51.632526 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:51.632450 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wszrw" Apr 17 16:31:51.808243 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:51.808211 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wszrw"] Apr 17 16:31:51.817119 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:31:51.817095 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod385f5d13_97af_4215_9e30_c75e4ad792b1.slice/crio-7520bd2e6077c704bdcc06d43ddb7b96d541f5e48ea5a3bdb9d6e344536a43b3 WatchSource:0}: Error finding container 7520bd2e6077c704bdcc06d43ddb7b96d541f5e48ea5a3bdb9d6e344536a43b3: Status 404 returned error can't find the container with id 7520bd2e6077c704bdcc06d43ddb7b96d541f5e48ea5a3bdb9d6e344536a43b3 Apr 17 16:31:52.380574 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:52.380530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wszrw" event={"ID":"385f5d13-97af-4215-9e30-c75e4ad792b1","Type":"ContainerStarted","Data":"7520bd2e6077c704bdcc06d43ddb7b96d541f5e48ea5a3bdb9d6e344536a43b3"} Apr 17 16:31:57.391690 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:57.391627 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wszrw" event={"ID":"385f5d13-97af-4215-9e30-c75e4ad792b1","Type":"ContainerStarted","Data":"e833d139170a7bc6fc8cc33a37004de176ec74127db7bf3165a45e34e417aff6"} Apr 17 16:31:58.821752 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:58.821712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:31:58.822166 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:58.821854 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:58.822166 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:58.821878 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7fcd7469-99n2d: secret "image-registry-tls" not found Apr 17 16:31:58.822166 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:58.821943 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls podName:1fc318ea-c2d5-4ad6-ad45-b06f054ca89d nodeName:}" failed. No retries permitted until 2026-04-17 16:32:14.821927667 +0000 UTC m=+68.170557197 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls") pod "image-registry-5f7fcd7469-99n2d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d") : secret "image-registry-tls" not found Apr 17 16:31:58.922436 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:58.922402 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4" Apr 17 16:31:58.922595 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:31:58.922458 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert\") pod \"ingress-canary-p5rm4\" (UID: \"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9\") " pod="openshift-ingress-canary/ingress-canary-p5rm4" Apr 17 16:31:58.922595 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:58.922550 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:58.922688 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:58.922604 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert podName:bf83b7da-8d7a-47cb-873b-aa2f7b647ff9 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:14.922590626 +0000 UTC m=+68.271220152 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert") pod "ingress-canary-p5rm4" (UID: "bf83b7da-8d7a-47cb-873b-aa2f7b647ff9") : secret "canary-serving-cert" not found Apr 17 16:31:58.922688 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:58.922550 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:58.922817 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:31:58.922691 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls podName:5b139190-3fc0-4860-9067-f972c93db541 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:14.9226787 +0000 UTC m=+68.271308237 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls") pod "dns-default-lg6l4" (UID: "5b139190-3fc0-4860-9067-f972c93db541") : secret "dns-default-metrics-tls" not found Apr 17 16:32:07.353418 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:07.353388 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gjfdq" Apr 17 16:32:07.381687 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:07.381622 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wszrw" podStartSLOduration=43.838594537 podStartE2EDuration="48.381605719s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="2026-04-17 16:31:51.818677577 +0000 UTC m=+45.167307114" lastFinishedPulling="2026-04-17 16:31:56.361688757 +0000 UTC m=+49.710318296" observedRunningTime="2026-04-17 16:31:57.406945545 +0000 UTC m=+50.755575094" watchObservedRunningTime="2026-04-17 16:32:07.381605719 +0000 UTC m=+60.730235261" Apr 17 16:32:13.015163 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:13.015122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr" Apr 17 16:32:13.017606 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:13.017584 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:32:13.025412 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:13.025396 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:32:13.025465 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:13.025455 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs podName:000f5549-91dd-4651-b5a0-21769e3982f4 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:17.025436089 +0000 UTC m=+130.374065620 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs") pod "network-metrics-daemon-w6ttr" (UID: "000f5549-91dd-4651-b5a0-21769e3982f4") : secret "metrics-daemon-secret" not found Apr 17 16:32:13.115607 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:13.115578 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkthr\" (UniqueName: \"kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr\") pod \"network-check-target-bbt78\" (UID: \"4ab33527-9aec-4272-9cb2-4f84af38a336\") " pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:32:13.118750 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:13.118735 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:32:13.128718 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:13.128702 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:32:13.140677 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:13.140639 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkthr\" (UniqueName: \"kubernetes.io/projected/4ab33527-9aec-4272-9cb2-4f84af38a336-kube-api-access-wkthr\") pod \"network-check-target-bbt78\" (UID: \"4ab33527-9aec-4272-9cb2-4f84af38a336\") " pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:32:13.240601 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:13.240575 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-26v9q\"" Apr 17 16:32:13.248719 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:13.248701 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bbt78" Apr 17 16:32:13.381948 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:13.381916 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bbt78"] Apr 17 16:32:13.385580 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:32:13.385555 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ab33527_9aec_4272_9cb2_4f84af38a336.slice/crio-2bd7c5491015d836ee61905d17b374a57c58b7212c080ce8774289f70fdd43ce WatchSource:0}: Error finding container 2bd7c5491015d836ee61905d17b374a57c58b7212c080ce8774289f70fdd43ce: Status 404 returned error can't find the container with id 2bd7c5491015d836ee61905d17b374a57c58b7212c080ce8774289f70fdd43ce Apr 17 16:32:13.425927 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:13.425896 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bbt78" event={"ID":"4ab33527-9aec-4272-9cb2-4f84af38a336","Type":"ContainerStarted","Data":"2bd7c5491015d836ee61905d17b374a57c58b7212c080ce8774289f70fdd43ce"} Apr 17 16:32:14.826049 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:14.825990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" Apr 17 16:32:14.826439 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:14.826146 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:32:14.826439 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:14.826167 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7fcd7469-99n2d: secret "image-registry-tls" not found Apr 17 16:32:14.826439 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:14.826232 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls podName:1fc318ea-c2d5-4ad6-ad45-b06f054ca89d nodeName:}" failed. No retries permitted until 2026-04-17 16:32:46.826213596 +0000 UTC m=+100.174843127 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls") pod "image-registry-5f7fcd7469-99n2d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d") : secret "image-registry-tls" not found Apr 17 16:32:14.927245 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:14.927216 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert\") pod \"ingress-canary-p5rm4\" (UID: \"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9\") " pod="openshift-ingress-canary/ingress-canary-p5rm4" Apr 17 16:32:14.927437 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:14.927307 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4" Apr 17 16:32:14.927437 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:14.927378 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:14.927437 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:14.927434 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:14.927595 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:14.927447 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert podName:bf83b7da-8d7a-47cb-873b-aa2f7b647ff9 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:46.927432856 +0000 UTC m=+100.276062382 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert") pod "ingress-canary-p5rm4" (UID: "bf83b7da-8d7a-47cb-873b-aa2f7b647ff9") : secret "canary-serving-cert" not found Apr 17 16:32:14.927595 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:14.927488 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls podName:5b139190-3fc0-4860-9067-f972c93db541 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:46.92747004 +0000 UTC m=+100.276099567 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls") pod "dns-default-lg6l4" (UID: "5b139190-3fc0-4860-9067-f972c93db541") : secret "dns-default-metrics-tls" not found
Apr 17 16:32:16.432963 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:16.432872 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bbt78" event={"ID":"4ab33527-9aec-4272-9cb2-4f84af38a336","Type":"ContainerStarted","Data":"a804317843a57b25664ade60938eaaec730f66536497f166d9f9d586c9ae65de"}
Apr 17 16:32:16.433352 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:16.433002 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bbt78"
Apr 17 16:32:46.847210 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:46.847156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d"
Apr 17 16:32:46.847611 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:46.847317 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 16:32:46.847611 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:46.847338 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7fcd7469-99n2d: secret "image-registry-tls" not found
Apr 17 16:32:46.847611 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:46.847419 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls podName:1fc318ea-c2d5-4ad6-ad45-b06f054ca89d nodeName:}" failed. No retries permitted until 2026-04-17 16:33:50.847398263 +0000 UTC m=+164.196027791 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls") pod "image-registry-5f7fcd7469-99n2d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d") : secret "image-registry-tls" not found
Apr 17 16:32:46.948243 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:46.948216 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert\") pod \"ingress-canary-p5rm4\" (UID: \"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9\") " pod="openshift-ingress-canary/ingress-canary-p5rm4"
Apr 17 16:32:46.948367 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:46.948280 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4"
Apr 17 16:32:46.948422 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:46.948359 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:32:46.948422 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:46.948380 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:32:46.948497 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:46.948423 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert podName:bf83b7da-8d7a-47cb-873b-aa2f7b647ff9 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:50.948407339 +0000 UTC m=+164.297036866 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert") pod "ingress-canary-p5rm4" (UID: "bf83b7da-8d7a-47cb-873b-aa2f7b647ff9") : secret "canary-serving-cert" not found
Apr 17 16:32:46.948497 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:32:46.948436 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls podName:5b139190-3fc0-4860-9067-f972c93db541 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:50.948430645 +0000 UTC m=+164.297060172 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls") pod "dns-default-lg6l4" (UID: "5b139190-3fc0-4860-9067-f972c93db541") : secret "dns-default-metrics-tls" not found
Apr 17 16:32:47.436418 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:47.436344 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bbt78"
Apr 17 16:32:47.456008 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:32:47.455962 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bbt78" podStartSLOduration=97.685268821 podStartE2EDuration="1m40.455949885s" podCreationTimestamp="2026-04-17 16:31:07 +0000 UTC" firstStartedPulling="2026-04-17 16:32:13.387371398 +0000 UTC m=+66.736000929" lastFinishedPulling="2026-04-17 16:32:16.158052462 +0000 UTC m=+69.506681993" observedRunningTime="2026-04-17 16:32:16.448866532 +0000 UTC m=+69.797496081" watchObservedRunningTime="2026-04-17 16:32:47.455949885 +0000 UTC m=+100.804579433"
Apr 17 16:33:17.049169 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:17.049116 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr"
Apr 17 16:33:17.049557 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:17.049242 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 16:33:17.049557 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:17.049314 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs podName:000f5549-91dd-4651-b5a0-21769e3982f4 nodeName:}" failed. No retries permitted until 2026-04-17 16:35:19.049296573 +0000 UTC m=+252.397926103 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs") pod "network-metrics-daemon-w6ttr" (UID: "000f5549-91dd-4651-b5a0-21769e3982f4") : secret "metrics-daemon-secret" not found
Apr 17 16:33:21.872289 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.872256 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-66bdc45668-2hj6k"]
Apr 17 16:33:21.875063 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.875037 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:21.885435 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.885383 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 17 16:33:21.885435 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.885431 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 17 16:33:21.885633 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.885445 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-bf8lm\""
Apr 17 16:33:21.885633 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.885453 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 17 16:33:21.885633 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.885391 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 17 16:33:21.885633 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.885558 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 16:33:21.885854 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.885775 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 16:33:21.889484 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.889462 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-66bdc45668-2hj6k"]
Apr 17 16:33:21.983008 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.982974 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-default-certificate\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:21.983008 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.983006 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-stats-auth\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:21.983196 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.983046 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:21.983196 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.983071 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:21.983196 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:21.983144 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5gcj\" (UniqueName: \"kubernetes.io/projected/2af6331b-785c-4d64-a991-6f486304bebf-kube-api-access-j5gcj\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:22.084450 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:22.084416 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:22.084450 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:22.084450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:22.084638 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:22.084503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5gcj\" (UniqueName: \"kubernetes.io/projected/2af6331b-785c-4d64-a991-6f486304bebf-kube-api-access-j5gcj\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:22.084638 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:22.084532 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-default-certificate\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:22.084638 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:22.084548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-stats-auth\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:22.084638 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:22.084603 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle podName:2af6331b-785c-4d64-a991-6f486304bebf nodeName:}" failed. No retries permitted until 2026-04-17 16:33:22.584581421 +0000 UTC m=+135.933210966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle") pod "router-default-66bdc45668-2hj6k" (UID: "2af6331b-785c-4d64-a991-6f486304bebf") : configmap references non-existent config key: service-ca.crt
Apr 17 16:33:22.084799 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:22.084737 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:33:22.084834 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:22.084808 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs podName:2af6331b-785c-4d64-a991-6f486304bebf nodeName:}" failed. No retries permitted until 2026-04-17 16:33:22.584791732 +0000 UTC m=+135.933421275 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs") pod "router-default-66bdc45668-2hj6k" (UID: "2af6331b-785c-4d64-a991-6f486304bebf") : secret "router-metrics-certs-default" not found
Apr 17 16:33:22.086938 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:22.086906 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-default-certificate\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:22.086938 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:22.086928 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-stats-auth\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:22.094585 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:22.094564 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5gcj\" (UniqueName: \"kubernetes.io/projected/2af6331b-785c-4d64-a991-6f486304bebf-kube-api-access-j5gcj\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:22.588016 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:22.587981 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:22.588016 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:22.588013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:22.588253 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:22.588143 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle podName:2af6331b-785c-4d64-a991-6f486304bebf nodeName:}" failed. No retries permitted until 2026-04-17 16:33:23.588126451 +0000 UTC m=+136.936755979 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle") pod "router-default-66bdc45668-2hj6k" (UID: "2af6331b-785c-4d64-a991-6f486304bebf") : configmap references non-existent config key: service-ca.crt
Apr 17 16:33:22.588253 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:22.588171 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:33:22.588253 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:22.588221 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs podName:2af6331b-785c-4d64-a991-6f486304bebf nodeName:}" failed. No retries permitted until 2026-04-17 16:33:23.588209462 +0000 UTC m=+136.936838988 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs") pod "router-default-66bdc45668-2hj6k" (UID: "2af6331b-785c-4d64-a991-6f486304bebf") : secret "router-metrics-certs-default" not found
Apr 17 16:33:23.595692 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:23.595640 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:23.595692 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:23.595696 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:23.596119 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:23.595817 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle podName:2af6331b-785c-4d64-a991-6f486304bebf nodeName:}" failed. No retries permitted until 2026-04-17 16:33:25.595797365 +0000 UTC m=+138.944426906 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle") pod "router-default-66bdc45668-2hj6k" (UID: "2af6331b-785c-4d64-a991-6f486304bebf") : configmap references non-existent config key: service-ca.crt
Apr 17 16:33:23.596119 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:23.595837 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:33:23.596119 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:23.595915 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs podName:2af6331b-785c-4d64-a991-6f486304bebf nodeName:}" failed. No retries permitted until 2026-04-17 16:33:25.595901495 +0000 UTC m=+138.944531036 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs") pod "router-default-66bdc45668-2hj6k" (UID: "2af6331b-785c-4d64-a991-6f486304bebf") : secret "router-metrics-certs-default" not found
Apr 17 16:33:25.612017 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:25.611985 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:25.612393 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:25.612019 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:25.612393 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:25.612143 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:33:25.612393 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:25.612211 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs podName:2af6331b-785c-4d64-a991-6f486304bebf nodeName:}" failed. No retries permitted until 2026-04-17 16:33:29.612192501 +0000 UTC m=+142.960822050 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs") pod "router-default-66bdc45668-2hj6k" (UID: "2af6331b-785c-4d64-a991-6f486304bebf") : secret "router-metrics-certs-default" not found
Apr 17 16:33:25.612393 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:25.612230 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle podName:2af6331b-785c-4d64-a991-6f486304bebf nodeName:}" failed. No retries permitted until 2026-04-17 16:33:29.612222204 +0000 UTC m=+142.960851731 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle") pod "router-default-66bdc45668-2hj6k" (UID: "2af6331b-785c-4d64-a991-6f486304bebf") : configmap references non-existent config key: service-ca.crt
Apr 17 16:33:27.881271 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:27.881204 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cbmjr_9748097b-af4c-40c0-b6c6-261863bca7b4/dns-node-resolver/0.log"
Apr 17 16:33:28.682253 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:28.682227 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2h7tr_1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45/node-ca/0.log"
Apr 17 16:33:28.802909 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:28.802880 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-z6qlt"]
Apr 17 16:33:28.805624 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:28.805608 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-z6qlt"
Apr 17 16:33:28.808477 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:28.808459 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 17 16:33:28.808601 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:28.808484 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 17 16:33:28.808715 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:28.808703 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 17 16:33:28.809371 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:28.809352 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 17 16:33:28.809457 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:28.809411 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-lcvf9\""
Apr 17 16:33:28.816874 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:28.816853 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-z6qlt"]
Apr 17 16:33:28.937239 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:28.937152 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw2ps\" (UniqueName: \"kubernetes.io/projected/18cda9c2-af58-463d-b1c6-784994f7df2d-kube-api-access-nw2ps\") pod \"service-ca-865cb79987-z6qlt\" (UID: \"18cda9c2-af58-463d-b1c6-784994f7df2d\") " pod="openshift-service-ca/service-ca-865cb79987-z6qlt"
Apr 17 16:33:28.937239 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:28.937190 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/18cda9c2-af58-463d-b1c6-784994f7df2d-signing-cabundle\") pod \"service-ca-865cb79987-z6qlt\" (UID: \"18cda9c2-af58-463d-b1c6-784994f7df2d\") " pod="openshift-service-ca/service-ca-865cb79987-z6qlt"
Apr 17 16:33:28.937239 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:28.937221 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/18cda9c2-af58-463d-b1c6-784994f7df2d-signing-key\") pod \"service-ca-865cb79987-z6qlt\" (UID: \"18cda9c2-af58-463d-b1c6-784994f7df2d\") " pod="openshift-service-ca/service-ca-865cb79987-z6qlt"
Apr 17 16:33:29.037732 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:29.037694 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/18cda9c2-af58-463d-b1c6-784994f7df2d-signing-key\") pod \"service-ca-865cb79987-z6qlt\" (UID: \"18cda9c2-af58-463d-b1c6-784994f7df2d\") " pod="openshift-service-ca/service-ca-865cb79987-z6qlt"
Apr 17 16:33:29.037912 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:29.037833 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nw2ps\" (UniqueName: \"kubernetes.io/projected/18cda9c2-af58-463d-b1c6-784994f7df2d-kube-api-access-nw2ps\") pod \"service-ca-865cb79987-z6qlt\" (UID: \"18cda9c2-af58-463d-b1c6-784994f7df2d\") " pod="openshift-service-ca/service-ca-865cb79987-z6qlt"
Apr 17 16:33:29.037912 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:29.037861 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/18cda9c2-af58-463d-b1c6-784994f7df2d-signing-cabundle\") pod \"service-ca-865cb79987-z6qlt\" (UID: \"18cda9c2-af58-463d-b1c6-784994f7df2d\") " pod="openshift-service-ca/service-ca-865cb79987-z6qlt"
Apr 17 16:33:29.038429 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:29.038406 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/18cda9c2-af58-463d-b1c6-784994f7df2d-signing-cabundle\") pod \"service-ca-865cb79987-z6qlt\" (UID: \"18cda9c2-af58-463d-b1c6-784994f7df2d\") " pod="openshift-service-ca/service-ca-865cb79987-z6qlt"
Apr 17 16:33:29.040121 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:29.040102 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/18cda9c2-af58-463d-b1c6-784994f7df2d-signing-key\") pod \"service-ca-865cb79987-z6qlt\" (UID: \"18cda9c2-af58-463d-b1c6-784994f7df2d\") " pod="openshift-service-ca/service-ca-865cb79987-z6qlt"
Apr 17 16:33:29.046455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:29.046433 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw2ps\" (UniqueName: \"kubernetes.io/projected/18cda9c2-af58-463d-b1c6-784994f7df2d-kube-api-access-nw2ps\") pod \"service-ca-865cb79987-z6qlt\" (UID: \"18cda9c2-af58-463d-b1c6-784994f7df2d\") " pod="openshift-service-ca/service-ca-865cb79987-z6qlt"
Apr 17 16:33:29.114680 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:29.114639 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-z6qlt"
Apr 17 16:33:29.224600 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:29.224573 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-z6qlt"]
Apr 17 16:33:29.225139 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:33:29.225114 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18cda9c2_af58_463d_b1c6_784994f7df2d.slice/crio-6f26b4e5b2220960d09db13dc21d76e5c9c23300d380d98cad647bb25d0f5454 WatchSource:0}: Error finding container 6f26b4e5b2220960d09db13dc21d76e5c9c23300d380d98cad647bb25d0f5454: Status 404 returned error can't find the container with id 6f26b4e5b2220960d09db13dc21d76e5c9c23300d380d98cad647bb25d0f5454
Apr 17 16:33:29.571808 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:29.571718 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-z6qlt" event={"ID":"18cda9c2-af58-463d-b1c6-784994f7df2d","Type":"ContainerStarted","Data":"6f26b4e5b2220960d09db13dc21d76e5c9c23300d380d98cad647bb25d0f5454"}
Apr 17 16:33:29.641858 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:29.641822 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:29.641858 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:29.641858 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:29.642064 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:29.641954 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:33:29.642064 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:29.641995 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle podName:2af6331b-785c-4d64-a991-6f486304bebf nodeName:}" failed. No retries permitted until 2026-04-17 16:33:37.64197868 +0000 UTC m=+150.990608206 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle") pod "router-default-66bdc45668-2hj6k" (UID: "2af6331b-785c-4d64-a991-6f486304bebf") : configmap references non-existent config key: service-ca.crt
Apr 17 16:33:29.642143 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:29.642062 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs podName:2af6331b-785c-4d64-a991-6f486304bebf nodeName:}" failed. No retries permitted until 2026-04-17 16:33:37.642044234 +0000 UTC m=+150.990673765 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs") pod "router-default-66bdc45668-2hj6k" (UID: "2af6331b-785c-4d64-a991-6f486304bebf") : secret "router-metrics-certs-default" not found
Apr 17 16:33:32.578466 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:32.578433 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-z6qlt" event={"ID":"18cda9c2-af58-463d-b1c6-784994f7df2d","Type":"ContainerStarted","Data":"1ee7b76bc02a23e3693e51d9391ccb0f4b9741a70f335c179379d57c63999ca5"}
Apr 17 16:33:32.594111 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:32.594067 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-z6qlt" podStartSLOduration=1.897941865 podStartE2EDuration="4.594053249s" podCreationTimestamp="2026-04-17 16:33:28 +0000 UTC" firstStartedPulling="2026-04-17 16:33:29.226736929 +0000 UTC m=+142.575366456" lastFinishedPulling="2026-04-17 16:33:31.922848306 +0000 UTC m=+145.271477840" observedRunningTime="2026-04-17 16:33:32.593296123 +0000 UTC m=+145.941925675" watchObservedRunningTime="2026-04-17 16:33:32.594053249 +0000 UTC m=+145.942682808"
Apr 17 16:33:37.707269 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:37.707231 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:37.707269 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:37.707272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:37.707693 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:37.707362 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:33:37.707693 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:37.707391 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle podName:2af6331b-785c-4d64-a991-6f486304bebf nodeName:}" failed. No retries permitted until 2026-04-17 16:33:53.707373874 +0000 UTC m=+167.056003401 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle") pod "router-default-66bdc45668-2hj6k" (UID: "2af6331b-785c-4d64-a991-6f486304bebf") : configmap references non-existent config key: service-ca.crt
Apr 17 16:33:37.707693 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:37.707470 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs podName:2af6331b-785c-4d64-a991-6f486304bebf nodeName:}" failed. No retries permitted until 2026-04-17 16:33:53.707463025 +0000 UTC m=+167.056092551 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs") pod "router-default-66bdc45668-2hj6k" (UID: "2af6331b-785c-4d64-a991-6f486304bebf") : secret "router-metrics-certs-default" not found
Apr 17 16:33:46.034513 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:46.034451 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" podUID="1fc318ea-c2d5-4ad6-ad45-b06f054ca89d"
Apr 17 16:33:46.045758 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:46.045727 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lg6l4" podUID="5b139190-3fc0-4860-9067-f972c93db541"
Apr 17 16:33:46.067983 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:46.067961 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-p5rm4" podUID="bf83b7da-8d7a-47cb-873b-aa2f7b647ff9"
Apr 17 16:33:46.243535 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:33:46.243486 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-w6ttr" podUID="000f5549-91dd-4651-b5a0-21769e3982f4"
Apr 17 16:33:46.607684 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:46.607636 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lg6l4"
Apr 17 16:33:46.607832 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:46.607636 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p5rm4"
Apr 17 16:33:47.958961 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:47.958930 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8hx2q"]
Apr 17 16:33:47.962005 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:47.961990 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:47.969913 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:47.969892 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 16:33:47.970004 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:47.969892 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 16:33:47.971040 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:47.971022 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 16:33:47.971130 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:47.971039 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pr7rd\""
Apr 17 16:33:47.971173 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:47.971142 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 16:33:47.984534 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:47.984515 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/52f90196-09f6-4839-b183-e5c1731d4d40-data-volume\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:47.984625 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:47.984540 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/52f90196-09f6-4839-b183-e5c1731d4d40-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:47.984625 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:47.984570 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhps7\" (UniqueName: \"kubernetes.io/projected/52f90196-09f6-4839-b183-e5c1731d4d40-kube-api-access-mhps7\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:47.984719 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:47.984661 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/52f90196-09f6-4839-b183-e5c1731d4d40-crio-socket\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:47.984719 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:47.984702 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/52f90196-09f6-4839-b183-e5c1731d4d40-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:47.991890 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:47.991865 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8hx2q"]
Apr 17 16:33:48.085488 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:48.085453 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/52f90196-09f6-4839-b183-e5c1731d4d40-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:48.085703 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:48.085498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhps7\" (UniqueName: \"kubernetes.io/projected/52f90196-09f6-4839-b183-e5c1731d4d40-kube-api-access-mhps7\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:48.085703 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:48.085548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/52f90196-09f6-4839-b183-e5c1731d4d40-crio-socket\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:48.085703 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:48.085582 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/52f90196-09f6-4839-b183-e5c1731d4d40-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:48.085703 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:48.085643 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/52f90196-09f6-4839-b183-e5c1731d4d40-data-volume\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:48.085703 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:48.085694 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/52f90196-09f6-4839-b183-e5c1731d4d40-crio-socket\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:48.085990 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:48.085953 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/52f90196-09f6-4839-b183-e5c1731d4d40-data-volume\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:48.086228 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:48.086202 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/52f90196-09f6-4839-b183-e5c1731d4d40-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:48.087854 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:48.087837 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/52f90196-09f6-4839-b183-e5c1731d4d40-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:48.112557 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:48.112530 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhps7\" (UniqueName: \"kubernetes.io/projected/52f90196-09f6-4839-b183-e5c1731d4d40-kube-api-access-mhps7\") pod \"insights-runtime-extractor-8hx2q\" (UID: \"52f90196-09f6-4839-b183-e5c1731d4d40\") " pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:48.270520 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:48.270439 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8hx2q"
Apr 17 16:33:48.403702 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:48.403671 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8hx2q"]
Apr 17 16:33:48.406883 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:33:48.406849 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52f90196_09f6_4839_b183_e5c1731d4d40.slice/crio-5d00daf0cf0dc1ceebd2a44579a1541d6966360827fdf695278a0bc29978979e WatchSource:0}: Error finding container 5d00daf0cf0dc1ceebd2a44579a1541d6966360827fdf695278a0bc29978979e: Status 404 returned error can't find the container with id 5d00daf0cf0dc1ceebd2a44579a1541d6966360827fdf695278a0bc29978979e
Apr 17 16:33:48.613081 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:48.612991 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8hx2q" event={"ID":"52f90196-09f6-4839-b183-e5c1731d4d40","Type":"ContainerStarted","Data":"1ab12be023cc647dce39d4381d91577f85c4f6d64523b39cc277a037b988da83"}
Apr 17 16:33:48.613081 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:48.613034 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8hx2q" event={"ID":"52f90196-09f6-4839-b183-e5c1731d4d40","Type":"ContainerStarted","Data":"5d00daf0cf0dc1ceebd2a44579a1541d6966360827fdf695278a0bc29978979e"}
Apr 17 16:33:49.616539 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:49.616506 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8hx2q" event={"ID":"52f90196-09f6-4839-b183-e5c1731d4d40","Type":"ContainerStarted","Data":"31083a7125e4b68a540a0744ad178d2b88e986fd1f230ce5ad4e243640940f4a"}
Apr 17 16:33:50.912612 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:50.909581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d"
Apr 17 16:33:50.912612 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:50.912417 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls\") pod \"image-registry-5f7fcd7469-99n2d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") " pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d"
Apr 17 16:33:51.010545 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:51.010505 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert\") pod \"ingress-canary-p5rm4\" (UID: \"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9\") " pod="openshift-ingress-canary/ingress-canary-p5rm4"
Apr 17 16:33:51.010735 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:51.010595 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4"
Apr 17 16:33:51.012940 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:51.012917 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b139190-3fc0-4860-9067-f972c93db541-metrics-tls\") pod \"dns-default-lg6l4\" (UID: \"5b139190-3fc0-4860-9067-f972c93db541\") " pod="openshift-dns/dns-default-lg6l4"
Apr 17 16:33:51.013046 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:51.012992 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf83b7da-8d7a-47cb-873b-aa2f7b647ff9-cert\") pod \"ingress-canary-p5rm4\" (UID: \"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9\") " pod="openshift-ingress-canary/ingress-canary-p5rm4"
Apr 17 16:33:51.111722 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:51.111687 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ttskh\""
Apr 17 16:33:51.111880 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:51.111687 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dbwrt\""
Apr 17 16:33:51.118732 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:51.118698 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p5rm4"
Apr 17 16:33:51.118732 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:51.118720 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lg6l4"
Apr 17 16:33:51.242399 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:51.242340 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p5rm4"]
Apr 17 16:33:51.244900 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:33:51.244868 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf83b7da_8d7a_47cb_873b_aa2f7b647ff9.slice/crio-90d0c058dc21f89e40e1dea5cb12c741888ed5b8eeabc23a1b8ac5d643c9ec85 WatchSource:0}: Error finding container 90d0c058dc21f89e40e1dea5cb12c741888ed5b8eeabc23a1b8ac5d643c9ec85: Status 404 returned error can't find the container with id 90d0c058dc21f89e40e1dea5cb12c741888ed5b8eeabc23a1b8ac5d643c9ec85
Apr 17 16:33:51.258910 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:51.258885 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lg6l4"]
Apr 17 16:33:51.262215 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:33:51.262187 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b139190_3fc0_4860_9067_f972c93db541.slice/crio-be1e6b0c35647ac5dfd31d441733ed2d14ae13b451bf20754e716bbdbb5217e4 WatchSource:0}: Error finding container be1e6b0c35647ac5dfd31d441733ed2d14ae13b451bf20754e716bbdbb5217e4: Status 404 returned error can't find the container with id be1e6b0c35647ac5dfd31d441733ed2d14ae13b451bf20754e716bbdbb5217e4
Apr 17 16:33:51.623307 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:51.623266 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8hx2q" event={"ID":"52f90196-09f6-4839-b183-e5c1731d4d40","Type":"ContainerStarted","Data":"46073ec9b0611aebea8d5a30b0bb01c0edfe6e63bf129f730ad24f270576b3fa"}
Apr 17 16:33:51.624642 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:51.624613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p5rm4" event={"ID":"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9","Type":"ContainerStarted","Data":"90d0c058dc21f89e40e1dea5cb12c741888ed5b8eeabc23a1b8ac5d643c9ec85"}
Apr 17 16:33:51.625853 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:51.625827 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lg6l4" event={"ID":"5b139190-3fc0-4860-9067-f972c93db541","Type":"ContainerStarted","Data":"be1e6b0c35647ac5dfd31d441733ed2d14ae13b451bf20754e716bbdbb5217e4"}
Apr 17 16:33:51.645680 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:51.645287 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8hx2q" podStartSLOduration=2.480706783 podStartE2EDuration="4.645268309s" podCreationTimestamp="2026-04-17 16:33:47 +0000 UTC" firstStartedPulling="2026-04-17 16:33:48.459793795 +0000 UTC m=+161.808423321" lastFinishedPulling="2026-04-17 16:33:50.62435532 +0000 UTC m=+163.972984847" observedRunningTime="2026-04-17 16:33:51.643341094 +0000 UTC m=+164.991970644" watchObservedRunningTime="2026-04-17 16:33:51.645268309 +0000 UTC m=+164.993897857"
Apr 17 16:33:53.632900 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:53.632868 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p5rm4" event={"ID":"bf83b7da-8d7a-47cb-873b-aa2f7b647ff9","Type":"ContainerStarted","Data":"d6f69ba9721734f16849adfc49b4118a143a7ad5dbb14b44d55277effe722022"}
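The durationBeforeRetry values in the mount failures above climb 500ms, 1s, 2s, 4s, 8s, 16s for the router-default volumes, and reach 1m4s and 2m2s for the registry, canary, DNS, and multus volumes: each failed MountVolume.SetUp roughly doubles the wait before the next attempt, and no delay in this log exceeds 2m2s. The mounts then succeed on later retries (16:33:50 through 16:33:53) once the missing secrets and the service-ca.crt configmap key have been published. A minimal Go sketch of that doubling-with-cap pattern as observed here; the 2m2s ceiling is inferred from these log lines, not taken from the kubelet's nestedpendingoperations source:

package main

import (
	"fmt"
	"time"
)

// backoff reproduces the retry delays observed in the log above:
// 500ms, 1s, 2s, 4s, 8s, 16s, ... up to an assumed cap of 2m2s
// (inferred from the durationBeforeRetry values, not kubelet source).
type backoff struct {
	current, max time.Duration
}

func (b *backoff) next() time.Duration {
	d := b.current
	b.current *= 2
	if b.current > b.max {
		b.current = b.max
	}
	return d
}

func main() {
	b := backoff{current: 500 * time.Millisecond, max: 2*time.Minute + 2*time.Second}
	for i := 0; i < 10; i++ {
		fmt.Println(b.next()) // 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s 2m2s
	}
}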
Apr 17 16:33:53.634302 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:53.634276 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lg6l4" event={"ID":"5b139190-3fc0-4860-9067-f972c93db541","Type":"ContainerStarted","Data":"1c5852e7b28c2f944a2aeaa1e9e30cc87a7f36b739a1837925c1e7e01295bfd0"}
Apr 17 16:33:53.634406 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:53.634310 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lg6l4" event={"ID":"5b139190-3fc0-4860-9067-f972c93db541","Type":"ContainerStarted","Data":"a76e33b2a9fee7b3ab35547c37b001667616fae21a41911c34ea3d058477f480"}
Apr 17 16:33:53.634463 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:53.634418 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lg6l4"
Apr 17 16:33:53.648692 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:53.648629 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p5rm4" podStartSLOduration=128.879447225 podStartE2EDuration="2m10.648616852s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:33:51.246723079 +0000 UTC m=+164.595352605" lastFinishedPulling="2026-04-17 16:33:53.015892702 +0000 UTC m=+166.364522232" observedRunningTime="2026-04-17 16:33:53.647815635 +0000 UTC m=+166.996445184" watchObservedRunningTime="2026-04-17 16:33:53.648616852 +0000 UTC m=+166.997246401"
Apr 17 16:33:53.665055 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:53.664995 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lg6l4" podStartSLOduration=128.916107955 podStartE2EDuration="2m10.664975677s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:33:51.263928108 +0000 UTC m=+164.612557635" lastFinishedPulling="2026-04-17 16:33:53.012795815 +0000 UTC m=+166.361425357" observedRunningTime="2026-04-17 16:33:53.664977349 +0000 UTC m=+167.013606900" watchObservedRunningTime="2026-04-17 16:33:53.664975677 +0000 UTC m=+167.013605229"
Apr 17 16:33:53.732420 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:53.732389 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:53.732571 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:53.732431 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:53.733511 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:53.733489 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2af6331b-785c-4d64-a991-6f486304bebf-service-ca-bundle\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:53.734721 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:53.734700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af6331b-785c-4d64-a991-6f486304bebf-metrics-certs\") pod \"router-default-66bdc45668-2hj6k\" (UID: \"2af6331b-785c-4d64-a991-6f486304bebf\") " pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:53.984882 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:53.984848 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:54.097723 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:54.097690 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-66bdc45668-2hj6k"]
Apr 17 16:33:54.100769 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:33:54.100737 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2af6331b_785c_4d64_a991_6f486304bebf.slice/crio-bfbd2d36c4c3c3526105bf61ec75ddb3c9bd31badc5629c941b0223cc70dc974 WatchSource:0}: Error finding container bfbd2d36c4c3c3526105bf61ec75ddb3c9bd31badc5629c941b0223cc70dc974: Status 404 returned error can't find the container with id bfbd2d36c4c3c3526105bf61ec75ddb3c9bd31badc5629c941b0223cc70dc974
Apr 17 16:33:54.638804 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:54.638765 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-66bdc45668-2hj6k" event={"ID":"2af6331b-785c-4d64-a991-6f486304bebf","Type":"ContainerStarted","Data":"680f0eea58c6ad81c44fc7550cc8c7cfc741e3759fed245687fa63e763d4b4b9"}
Apr 17 16:33:54.638804 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:54.638806 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-66bdc45668-2hj6k" event={"ID":"2af6331b-785c-4d64-a991-6f486304bebf","Type":"ContainerStarted","Data":"bfbd2d36c4c3c3526105bf61ec75ddb3c9bd31badc5629c941b0223cc70dc974"}
Apr 17 16:33:54.659937 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:54.659891 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-66bdc45668-2hj6k" podStartSLOduration=33.659878439 podStartE2EDuration="33.659878439s" podCreationTimestamp="2026-04-17 16:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:33:54.658619627 +0000 UTC m=+168.007249183" watchObservedRunningTime="2026-04-17 16:33:54.659878439 +0000 UTC m=+168.008507988"
Apr 17 16:33:54.985032 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:54.984996 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:54.987457 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:54.987433 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:55.641808 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:55.641774 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:55.643063 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:55.643040 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-66bdc45668-2hj6k"
Apr 17 16:33:57.223211 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:57.223176 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d"
Apr 17 16:33:57.225884 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:57.225862 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-zn7wr\""
Apr 17 16:33:57.233844 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:57.233829 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d"
Apr 17 16:33:57.347128 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:57.347086 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f7fcd7469-99n2d"]
Apr 17 16:33:57.349958 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:33:57.349927 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fc318ea_c2d5_4ad6_ad45_b06f054ca89d.slice/crio-2b0f9cf535ccda40ff221ca8f27af329e8c95e2b15d0da838367101e0f549222 WatchSource:0}: Error finding container 2b0f9cf535ccda40ff221ca8f27af329e8c95e2b15d0da838367101e0f549222: Status 404 returned error can't find the container with id 2b0f9cf535ccda40ff221ca8f27af329e8c95e2b15d0da838367101e0f549222
Apr 17 16:33:57.647934 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:57.647832 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" event={"ID":"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d","Type":"ContainerStarted","Data":"6b8b5ad29634c1f0bbfe1e21f072139bb127d450dc4b01c712d1b1ef5f9753a0"}
Apr 17 16:33:57.647934 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:57.647873 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" event={"ID":"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d","Type":"ContainerStarted","Data":"2b0f9cf535ccda40ff221ca8f27af329e8c95e2b15d0da838367101e0f549222"}
Apr 17 16:33:57.671380 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:57.671331 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" podStartSLOduration=170.671318786 podStartE2EDuration="2m50.671318786s" podCreationTimestamp="2026-04-17 16:31:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:33:57.67019967 +0000 UTC m=+171.018829219" watchObservedRunningTime="2026-04-17 16:33:57.671318786 +0000 UTC m=+171.019948334"
Apr 17 16:33:58.650989 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:33:58.650955 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d"
Apr 17 16:34:00.222564 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:00.222533 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr"
Apr 17 16:34:03.641184 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:03.641150 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lg6l4"
Apr 17 16:34:05.114818 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.114769 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm"]
Apr 17 16:34:05.120834 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.120809 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm"
Apr 17 16:34:05.123023 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.122994 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 16:34:05.123796 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.123775 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 16:34:05.124115 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.124099 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 16:34:05.124209 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.124189 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-vpnfj\""
Apr 17 16:34:05.124445 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.124428 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 16:34:05.124547 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.124528 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 16:34:05.134284 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.134261 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-sqfzq"]
Apr 17 16:34:05.137681 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.137637 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-j2cfd"]
Apr 17 16:34:05.137818 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.137801 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.140906 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.140888 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-p22nr\"" Apr 17 16:34:05.141370 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.141108 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 16:34:05.141483 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.141209 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 16:34:05.141607 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.141267 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 16:34:05.142246 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.142219 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm"] Apr 17 16:34:05.142364 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.142350 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.144985 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.144964 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 16:34:05.145258 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.145240 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 16:34:05.145910 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.145892 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-pqpkj\"" Apr 17 16:34:05.146242 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.146222 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 16:34:05.156281 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.156256 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-j2cfd"] Apr 17 16:34:05.218038 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218001 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-textfile\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.218038 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218043 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-wtmp\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.218280 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218084 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"root\" (UniqueName: \"kubernetes.io/host-path/6a033cfb-a232-46c8-8118-206ada51e43f-root\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.218280 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218110 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.218280 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218139 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84pjh\" (UniqueName: \"kubernetes.io/projected/a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4-kube-api-access-84pjh\") pod \"openshift-state-metrics-9d44df66c-l6rvm\" (UID: \"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" Apr 17 16:34:05.218280 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218164 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4mmx\" (UniqueName: \"kubernetes.io/projected/6a033cfb-a232-46c8-8118-206ada51e43f-kube-api-access-l4mmx\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.218280 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218211 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-l6rvm\" (UID: \"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" Apr 17 16:34:05.218280 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218234 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a033cfb-a232-46c8-8118-206ada51e43f-sys\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.218280 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218261 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-tls\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.218623 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218289 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.218623 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218326 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-l6rvm\" (UID: \"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" Apr 17 16:34:05.218623 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218352 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwkfd\" (UniqueName: \"kubernetes.io/projected/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-api-access-qwkfd\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.218623 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-l6rvm\" (UID: \"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" Apr 17 16:34:05.218623 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218398 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6a033cfb-a232-46c8-8118-206ada51e43f-metrics-client-ca\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.218623 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218424 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.218623 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218467 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.218623 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218513 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-accelerators-collector-config\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.218623 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218593 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e3d2b560-1dd2-4c10-adba-974388d9af49-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: 
\"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.219090 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.218643 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3d2b560-1dd2-4c10-adba-974388d9af49-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.319256 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.319205 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84pjh\" (UniqueName: \"kubernetes.io/projected/a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4-kube-api-access-84pjh\") pod \"openshift-state-metrics-9d44df66c-l6rvm\" (UID: \"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" Apr 17 16:34:05.319256 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.319255 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4mmx\" (UniqueName: \"kubernetes.io/projected/6a033cfb-a232-46c8-8118-206ada51e43f-kube-api-access-l4mmx\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.319513 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.319288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-l6rvm\" (UID: \"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" Apr 17 16:34:05.319513 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.319351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a033cfb-a232-46c8-8118-206ada51e43f-sys\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.319513 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.319492 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a033cfb-a232-46c8-8118-206ada51e43f-sys\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.319779 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:34:05.319757 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 16:34:05.320002 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:34:05.319991 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-tls podName:6a033cfb-a232-46c8-8118-206ada51e43f nodeName:}" failed. No retries permitted until 2026-04-17 16:34:05.819968906 +0000 UTC m=+179.168598440 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-tls") pod "node-exporter-sqfzq" (UID: "6a033cfb-a232-46c8-8118-206ada51e43f") : secret "node-exporter-tls" not found Apr 17 16:34:05.320106 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.319429 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-tls\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.320215 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.320128 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.320215 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.320165 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-l6rvm\" (UID: \"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" Apr 17 16:34:05.320215 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.320197 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwkfd\" (UniqueName: \"kubernetes.io/projected/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-api-access-qwkfd\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.320370 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.320222 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-l6rvm\" (UID: \"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" Apr 17 16:34:05.320370 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.320246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6a033cfb-a232-46c8-8118-206ada51e43f-metrics-client-ca\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.320370 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.320279 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.320370 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.320322 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.320370 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.320351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-accelerators-collector-config\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.320642 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.320383 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e3d2b560-1dd2-4c10-adba-974388d9af49-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.320642 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.320426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3d2b560-1dd2-4c10-adba-974388d9af49-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.320642 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.320467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-textfile\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.320642 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.320490 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-wtmp\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.320642 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.320541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6a033cfb-a232-46c8-8118-206ada51e43f-root\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.320642 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.320567 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.320977 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:34:05.320688 2572 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 17 
16:34:05.320977 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:34:05.320730 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-state-metrics-tls podName:e3d2b560-1dd2-4c10-adba-974388d9af49 nodeName:}" failed. No retries permitted until 2026-04-17 16:34:05.8207157 +0000 UTC m=+179.169345232 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-j2cfd" (UID: "e3d2b560-1dd2-4c10-adba-974388d9af49") : secret "kube-state-metrics-tls" not found Apr 17 16:34:05.322371 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.321152 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.322371 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:34:05.321214 2572 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 17 16:34:05.322371 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:34:05.321276 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4-openshift-state-metrics-tls podName:a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4 nodeName:}" failed. No retries permitted until 2026-04-17 16:34:05.82125792 +0000 UTC m=+179.169887452 (durationBeforeRetry 500ms). 
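
The nestedpendingoperations failures above are a startup race: the TLS secrets had not yet been created when the kubelet first tried to mount them, so each SetUp attempt is parked behind a 500ms backoff and retried (the same mounts succeed at 16:34:05.828 below). A minimal sketch of that retry arithmetic, assuming exponential growth and a cap, neither of which is stated in the log:

// Minimal sketch, not kubelet source: models the retry schedule implied by
// the "No retries permitted until ... (durationBeforeRetry 500ms)" entries.
// The 500ms initial delay comes from the log; the doubling factor and the
// cap are assumptions for illustration.
package main

import (
	"fmt"
	"time"
)

func main() {
	failedAt := time.Date(2026, 4, 17, 16, 34, 5, 320977000, time.UTC)
	delay := 500 * time.Millisecond                // durationBeforeRetry in the log
	const maxDelay = 2*time.Minute + 2*time.Second // assumed upper bound

	for attempt := 1; attempt <= 5; attempt++ {
		// A failed mount is blocked until failure time + current backoff,
		// which is what the log prints as "No retries permitted until".
		retryAt := failedAt.Add(delay)
		fmt.Printf("attempt %d blocked until %s\n", attempt, retryAt.Format("15:04:05.000000"))
		// Assume the retry itself fails immediately, re-arming the backoff.
		failedAt = retryAt
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
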
Apr 17 16:34:05.322371 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.321767 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-l6rvm\" (UID: \"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm"
Apr 17 16:34:05.322371 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.321877 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3d2b560-1dd2-4c10-adba-974388d9af49-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd"
Apr 17 16:34:05.322371 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.321955 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-wtmp\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq"
Apr 17 16:34:05.322371 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.322252 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6a033cfb-a232-46c8-8118-206ada51e43f-metrics-client-ca\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq"
Apr 17 16:34:05.322371 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.322308 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6a033cfb-a232-46c8-8118-206ada51e43f-root\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq"
Apr 17 16:34:05.322371 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.322314 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-accelerators-collector-config\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq"
Apr 17 16:34:05.322871 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.322540 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e3d2b560-1dd2-4c10-adba-974388d9af49-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd"
Apr 17 16:34:05.322955 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.322920 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName:
\"kubernetes.io/secret/a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-l6rvm\" (UID: \"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" Apr 17 16:34:05.323279 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.323238 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.323640 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.323623 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-textfile\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.325272 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.325251 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.330301 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.330264 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84pjh\" (UniqueName: \"kubernetes.io/projected/a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4-kube-api-access-84pjh\") pod \"openshift-state-metrics-9d44df66c-l6rvm\" (UID: \"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" Apr 17 16:34:05.334449 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.334406 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4mmx\" (UniqueName: \"kubernetes.io/projected/6a033cfb-a232-46c8-8118-206ada51e43f-kube-api-access-l4mmx\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.338514 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.338493 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwkfd\" (UniqueName: \"kubernetes.io/projected/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-api-access-qwkfd\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.826420 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.826389 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.826612 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.826434 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-tls\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.826612 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.826569 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-l6rvm\" (UID: \"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" Apr 17 16:34:05.828701 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.828677 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6a033cfb-a232-46c8-8118-206ada51e43f-node-exporter-tls\") pod \"node-exporter-sqfzq\" (UID: \"6a033cfb-a232-46c8-8118-206ada51e43f\") " pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:05.828813 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.828770 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3d2b560-1dd2-4c10-adba-974388d9af49-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-j2cfd\" (UID: \"e3d2b560-1dd2-4c10-adba-974388d9af49\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:05.828974 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:05.828954 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-l6rvm\" (UID: \"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" Apr 17 16:34:06.032214 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:06.032173 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" Apr 17 16:34:06.053681 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:06.050686 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-sqfzq" Apr 17 16:34:06.058269 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:06.058240 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" Apr 17 16:34:06.059995 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:34:06.059966 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a033cfb_a232_46c8_8118_206ada51e43f.slice/crio-689ea47ad6382b39f34b0901130fe80881b28a04f71b9fc95b2265cd37dce069 WatchSource:0}: Error finding container 689ea47ad6382b39f34b0901130fe80881b28a04f71b9fc95b2265cd37dce069: Status 404 returned error can't find the container with id 689ea47ad6382b39f34b0901130fe80881b28a04f71b9fc95b2265cd37dce069 Apr 17 16:34:06.176525 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:06.176472 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm"] Apr 17 16:34:06.179846 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:34:06.179814 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2d8a523_9ad8_4c35_ae64_4f46dfbb6ce4.slice/crio-c7ed28bd1364b6ebd59f52ab65e7f65471f7a7ff97e96363085f5ccb762f117a WatchSource:0}: Error finding container c7ed28bd1364b6ebd59f52ab65e7f65471f7a7ff97e96363085f5ccb762f117a: Status 404 returned error can't find the container with id c7ed28bd1364b6ebd59f52ab65e7f65471f7a7ff97e96363085f5ccb762f117a Apr 17 16:34:06.200423 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:06.200400 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-j2cfd"] Apr 17 16:34:06.215850 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:34:06.215820 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3d2b560_1dd2_4c10_adba_974388d9af49.slice/crio-e73ccbf4676202d83dbef10cd913b98ce6ab063765c21fd0c370ea3d8b799e08 WatchSource:0}: Error finding container e73ccbf4676202d83dbef10cd913b98ce6ab063765c21fd0c370ea3d8b799e08: Status 404 returned error can't find the container with id e73ccbf4676202d83dbef10cd913b98ce6ab063765c21fd0c370ea3d8b799e08 Apr 17 16:34:06.673056 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:06.673006 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" event={"ID":"e3d2b560-1dd2-4c10-adba-974388d9af49","Type":"ContainerStarted","Data":"e73ccbf4676202d83dbef10cd913b98ce6ab063765c21fd0c370ea3d8b799e08"} Apr 17 16:34:06.674226 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:06.674192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sqfzq" event={"ID":"6a033cfb-a232-46c8-8118-206ada51e43f","Type":"ContainerStarted","Data":"689ea47ad6382b39f34b0901130fe80881b28a04f71b9fc95b2265cd37dce069"} Apr 17 16:34:06.675991 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:06.675964 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" event={"ID":"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4","Type":"ContainerStarted","Data":"e74917a1b1fbdb3688ed457e4d64856708bf0516c258ff626780cf38945e096a"} Apr 17 16:34:06.676114 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:06.675998 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" 
event={"ID":"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4","Type":"ContainerStarted","Data":"88713bfaac32b5befac4735e664b4a95a90f0531aa9ede60294af6a8e2f375c5"} Apr 17 16:34:06.676114 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:06.676011 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" event={"ID":"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4","Type":"ContainerStarted","Data":"c7ed28bd1364b6ebd59f52ab65e7f65471f7a7ff97e96363085f5ccb762f117a"} Apr 17 16:34:07.680388 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:07.680350 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" event={"ID":"e3d2b560-1dd2-4c10-adba-974388d9af49","Type":"ContainerStarted","Data":"3fab6ab0bc1629a0e2e485baf7af0b7b3cdfa8b8d68df1c42adf3ae51df8c571"} Apr 17 16:34:07.681786 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:07.681675 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sqfzq" event={"ID":"6a033cfb-a232-46c8-8118-206ada51e43f","Type":"ContainerStarted","Data":"7ff691d5ecdc77da0aa52e16fdd8d80430e8df73fcc3cfc2bbb7db5c681aaa9f"} Apr 17 16:34:07.684553 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:07.684520 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" event={"ID":"a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4","Type":"ContainerStarted","Data":"8fe56e221a65b5c82fa512406e0039784049e67ca62a6a6a30657a82829e9dcf"} Apr 17 16:34:07.736959 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:07.736892 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l6rvm" podStartSLOduration=1.480779403 podStartE2EDuration="2.736870127s" podCreationTimestamp="2026-04-17 16:34:05 +0000 UTC" firstStartedPulling="2026-04-17 16:34:06.304902328 +0000 UTC m=+179.653531859" lastFinishedPulling="2026-04-17 16:34:07.560993052 +0000 UTC m=+180.909622583" observedRunningTime="2026-04-17 16:34:07.735680921 +0000 UTC m=+181.084310471" watchObservedRunningTime="2026-04-17 16:34:07.736870127 +0000 UTC m=+181.085499678" Apr 17 16:34:08.688629 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:08.688588 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" event={"ID":"e3d2b560-1dd2-4c10-adba-974388d9af49","Type":"ContainerStarted","Data":"202953e4b53853d7a285d66e789cfac6ff2016c9236323182bf89760fff6a32f"} Apr 17 16:34:08.688629 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:08.688635 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" event={"ID":"e3d2b560-1dd2-4c10-adba-974388d9af49","Type":"ContainerStarted","Data":"9be49ccde335ee4cd9d300104ef3062a8f2e686a6e6e5f665a1b8ffbbc119f62"} Apr 17 16:34:08.689993 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:08.689967 2572 generic.go:358] "Generic (PLEG): container finished" podID="6a033cfb-a232-46c8-8118-206ada51e43f" containerID="7ff691d5ecdc77da0aa52e16fdd8d80430e8df73fcc3cfc2bbb7db5c681aaa9f" exitCode=0 Apr 17 16:34:08.690103 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:08.690049 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sqfzq" event={"ID":"6a033cfb-a232-46c8-8118-206ada51e43f","Type":"ContainerDied","Data":"7ff691d5ecdc77da0aa52e16fdd8d80430e8df73fcc3cfc2bbb7db5c681aaa9f"} Apr 17 16:34:08.713452 
ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:08.713377 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-j2cfd" podStartSLOduration=2.367289484 podStartE2EDuration="3.713362634s" podCreationTimestamp="2026-04-17 16:34:05 +0000 UTC" firstStartedPulling="2026-04-17 16:34:06.217862586 +0000 UTC m=+179.566492116" lastFinishedPulling="2026-04-17 16:34:07.563935739 +0000 UTC m=+180.912565266" observedRunningTime="2026-04-17 16:34:08.711683306 +0000 UTC m=+182.060312856" watchObservedRunningTime="2026-04-17 16:34:08.713362634 +0000 UTC m=+182.061992182" Apr 17 16:34:09.554579 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.554537 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7b855889f-bcqdb"] Apr 17 16:34:09.557383 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.557361 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7b855889f-bcqdb" Apr 17 16:34:09.562117 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.562086 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 16:34:09.562117 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.562095 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-5lu1jv511n3vk\"" Apr 17 16:34:09.562310 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.562175 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-k7lzh\"" Apr 17 16:34:09.562310 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.562281 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 16:34:09.562494 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.562475 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 16:34:09.562553 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.562492 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 16:34:09.581025 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.580995 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7b855889f-bcqdb"] Apr 17 16:34:09.657502 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.657456 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-secret-metrics-server-tls\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb" Apr 17 16:34:09.657708 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.657567 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb" Apr 17 16:34:09.657708 ip-10-0-138-137 kubenswrapper[2572]: 
I0417 16:34:09.657598 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-client-ca-bundle\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb" Apr 17 16:34:09.657708 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.657672 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-metrics-server-audit-profiles\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb" Apr 17 16:34:09.657708 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.657703 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r4f6\" (UniqueName: \"kubernetes.io/projected/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-kube-api-access-5r4f6\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb" Apr 17 16:34:09.657861 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.657737 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-secret-metrics-server-client-certs\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb" Apr 17 16:34:09.657861 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.657774 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-audit-log\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb" Apr 17 16:34:09.696532 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.696498 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sqfzq" event={"ID":"6a033cfb-a232-46c8-8118-206ada51e43f","Type":"ContainerStarted","Data":"89552f3e707ff22da282652c3a2494c05795f32fec3379c494a79246c37df16b"} Apr 17 16:34:09.696532 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.696537 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sqfzq" event={"ID":"6a033cfb-a232-46c8-8118-206ada51e43f","Type":"ContainerStarted","Data":"5652366946fbc40418ce3988267b8b88179428fc5e18b7e9bf003e24d652a1d8"} Apr 17 16:34:09.718579 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.718530 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-sqfzq" podStartSLOduration=3.226275876 podStartE2EDuration="4.718514961s" podCreationTimestamp="2026-04-17 16:34:05 +0000 UTC" firstStartedPulling="2026-04-17 16:34:06.064825223 +0000 UTC m=+179.413454754" lastFinishedPulling="2026-04-17 16:34:07.557064309 +0000 UTC m=+180.905693839" observedRunningTime="2026-04-17 16:34:09.716993409 +0000 UTC m=+183.065622958" watchObservedRunningTime="2026-04-17 16:34:09.718514961 +0000 UTC m=+183.067144546" Apr 17 
16:34:09.758812 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.758774 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.758995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.758936 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-client-ca-bundle\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.759502 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.759010 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-metrics-server-audit-profiles\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.759502 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.759040 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r4f6\" (UniqueName: \"kubernetes.io/projected/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-kube-api-access-5r4f6\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.759502 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.759088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-secret-metrics-server-client-certs\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.759502 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.759160 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-audit-log\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.759502 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.759445 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-secret-metrics-server-tls\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.759870 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.759516 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.759870 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.759604 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-audit-log\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.760369 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.760347 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-metrics-server-audit-profiles\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.761691 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.761666 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-client-ca-bundle\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.762177 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.762157 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-secret-metrics-server-client-certs\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.762177 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.762172 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-secret-metrics-server-tls\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.766589 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.766569 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r4f6\" (UniqueName: \"kubernetes.io/projected/3f3c472e-29e8-4e7f-85de-f2381c6f9adc-kube-api-access-5r4f6\") pod \"metrics-server-7b855889f-bcqdb\" (UID: \"3f3c472e-29e8-4e7f-85de-f2381c6f9adc\") " pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.866874 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.866789 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:09.883937 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:09.878238 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5f7fcd7469-99n2d"]
Apr 17 16:34:10.001543 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:10.001509 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7b855889f-bcqdb"]
Apr 17 16:34:10.005071 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:34:10.005037 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f3c472e_29e8_4e7f_85de_f2381c6f9adc.slice/crio-e905af7b7e9020dce9ed27e5b3c469f0c359a007a68eef657ffc5685a397a881 WatchSource:0}: Error finding container e905af7b7e9020dce9ed27e5b3c469f0c359a007a68eef657ffc5685a397a881: Status 404 returned error can't find the container with id e905af7b7e9020dce9ed27e5b3c469f0c359a007a68eef657ffc5685a397a881
Apr 17 16:34:10.700529 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:10.700484 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b855889f-bcqdb" event={"ID":"3f3c472e-29e8-4e7f-85de-f2381c6f9adc","Type":"ContainerStarted","Data":"e905af7b7e9020dce9ed27e5b3c469f0c359a007a68eef657ffc5685a397a881"}
Apr 17 16:34:11.704461 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:11.704426 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b855889f-bcqdb" event={"ID":"3f3c472e-29e8-4e7f-85de-f2381c6f9adc","Type":"ContainerStarted","Data":"bf5d28d285b30c12531fa5863d957d21a8821287a5bf4bc140d4a82004a765a6"}
Apr 17 16:34:11.730499 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:11.730432 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7b855889f-bcqdb" podStartSLOduration=1.303107723 podStartE2EDuration="2.730416727s" podCreationTimestamp="2026-04-17 16:34:09 +0000 UTC" firstStartedPulling="2026-04-17 16:34:10.007471812 +0000 UTC m=+183.356101340" lastFinishedPulling="2026-04-17 16:34:11.434780817 +0000 UTC m=+184.783410344" observedRunningTime="2026-04-17 16:34:11.729840499 +0000 UTC m=+185.078470049" watchObservedRunningTime="2026-04-17 16:34:11.730416727 +0000 UTC m=+185.079046275"
Apr 17 16:34:19.884295 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:19.884260 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d"
Apr 17 16:34:29.867531 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:29.867494 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:29.868027 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:29.867565 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:34.905961 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:34.905891 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" podUID="1fc318ea-c2d5-4ad6-ad45-b06f054ca89d" containerName="registry" containerID="cri-o://6b8b5ad29634c1f0bbfe1e21f072139bb127d450dc4b01c712d1b1ef5f9753a0" gracePeriod=30
Apr 17 16:34:35.153642 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.153620 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d"
Apr 17 16:34:35.273520 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.273485 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-bound-sa-token\") pod \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") "
Apr 17 16:34:35.273520 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.273521 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-image-registry-private-configuration\") pod \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") "
Apr 17 16:34:35.273825 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.273552 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-certificates\") pod \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") "
Apr 17 16:34:35.273825 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.273572 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls\") pod \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") "
Apr 17 16:34:35.273825 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.273714 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz556\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-kube-api-access-qz556\") pod \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") "
Apr 17 16:34:35.273825 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.273769 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-trusted-ca\") pod \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") "
Apr 17 16:34:35.273825 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.273790 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-ca-trust-extracted\") pod \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") "
Apr 17 16:34:35.274077 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.273832 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-installation-pull-secrets\") pod \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\" (UID: \"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d\") "
Apr 17 16:34:35.274143 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.274117 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:34:35.274210 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.274182 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:34:35.276186 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.276125 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:34:35.276324 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.276224 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:34:35.276448 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.276412 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-kube-api-access-qz556" (OuterVolumeSpecName: "kube-api-access-qz556") pod "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d"). InnerVolumeSpecName "kube-api-access-qz556". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:34:35.276567 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.276523 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:34:35.276567 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.276540 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:34:35.283346 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.283318 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d" (UID: "1fc318ea-c2d5-4ad6-ad45-b06f054ca89d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:34:35.375436 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.375402 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-bound-sa-token\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:34:35.375436 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.375433 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-image-registry-private-configuration\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:34:35.375436 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.375443 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-certificates\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:34:35.375689 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.375455 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-registry-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:34:35.375689 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.375469 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qz556\" (UniqueName: \"kubernetes.io/projected/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-kube-api-access-qz556\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:34:35.375689 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.375478 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-trusted-ca\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:34:35.375689 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.375487 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-ca-trust-extracted\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:34:35.375689 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.375495 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d-installation-pull-secrets\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:34:35.767427 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.767394 2572 generic.go:358] "Generic (PLEG): container finished" podID="1fc318ea-c2d5-4ad6-ad45-b06f054ca89d" containerID="6b8b5ad29634c1f0bbfe1e21f072139bb127d450dc4b01c712d1b1ef5f9753a0" exitCode=0
Apr 17 16:34:35.767599 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.767448 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" event={"ID":"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d","Type":"ContainerDied","Data":"6b8b5ad29634c1f0bbfe1e21f072139bb127d450dc4b01c712d1b1ef5f9753a0"}
Apr 17 16:34:35.767599 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.767454 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d"
Apr 17 16:34:35.767599 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.767470 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f7fcd7469-99n2d" event={"ID":"1fc318ea-c2d5-4ad6-ad45-b06f054ca89d","Type":"ContainerDied","Data":"2b0f9cf535ccda40ff221ca8f27af329e8c95e2b15d0da838367101e0f549222"}
Apr 17 16:34:35.767599 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.767485 2572 scope.go:117] "RemoveContainer" containerID="6b8b5ad29634c1f0bbfe1e21f072139bb127d450dc4b01c712d1b1ef5f9753a0"
Apr 17 16:34:35.775320 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.775301 2572 scope.go:117] "RemoveContainer" containerID="6b8b5ad29634c1f0bbfe1e21f072139bb127d450dc4b01c712d1b1ef5f9753a0"
Apr 17 16:34:35.775602 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:34:35.775577 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b8b5ad29634c1f0bbfe1e21f072139bb127d450dc4b01c712d1b1ef5f9753a0\": container with ID starting with 6b8b5ad29634c1f0bbfe1e21f072139bb127d450dc4b01c712d1b1ef5f9753a0 not found: ID does not exist" containerID="6b8b5ad29634c1f0bbfe1e21f072139bb127d450dc4b01c712d1b1ef5f9753a0"
Apr 17 16:34:35.775687 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.775615 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b8b5ad29634c1f0bbfe1e21f072139bb127d450dc4b01c712d1b1ef5f9753a0"} err="failed to get container status \"6b8b5ad29634c1f0bbfe1e21f072139bb127d450dc4b01c712d1b1ef5f9753a0\": rpc error: code = NotFound desc = could not find container \"6b8b5ad29634c1f0bbfe1e21f072139bb127d450dc4b01c712d1b1ef5f9753a0\": container with ID starting with 6b8b5ad29634c1f0bbfe1e21f072139bb127d450dc4b01c712d1b1ef5f9753a0 not found: ID does not exist"
Apr 17 16:34:35.785779 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.785752 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5f7fcd7469-99n2d"]
Apr 17 16:34:35.788383 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:35.788365 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5f7fcd7469-99n2d"]
Apr 17 16:34:36.048664 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.048574 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5564cddf8d-k7hvz"]
Apr 17 16:34:36.049029 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.048872 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fc318ea-c2d5-4ad6-ad45-b06f054ca89d" containerName="registry"
Apr 17 16:34:36.049029 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.048886 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc318ea-c2d5-4ad6-ad45-b06f054ca89d" containerName="registry"
Apr 17 16:34:36.049029 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.048938 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fc318ea-c2d5-4ad6-ad45-b06f054ca89d" containerName="registry"
Apr 17 16:34:36.053809 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.053788 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.056353 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.056321 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 16:34:36.056494 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.056452 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 16:34:36.056494 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.056452 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 16:34:36.057142 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.057121 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 16:34:36.057266 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.057146 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 16:34:36.057266 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.057157 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-fl6zn\""
Apr 17 16:34:36.057266 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.057235 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 16:34:36.057420 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.057393 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 16:34:36.059563 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.059541 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5564cddf8d-k7hvz"]
Apr 17 16:34:36.063037 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.063019 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 16:34:36.182623 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.182581 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-service-ca\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.182828 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.182640 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-oauth-config\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.182828 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.182691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f54g\" (UniqueName: \"kubernetes.io/projected/d7fb3c3a-4947-459b-8b38-23bbbeaab354-kube-api-access-8f54g\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.182828 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.182763 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-serving-cert\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.182828 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.182806 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-trusted-ca-bundle\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.182828 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.182826 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-config\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.182995 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.182859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-oauth-serving-cert\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.284126 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.284089 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-serving-cert\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.284126 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.284132 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-trusted-ca-bundle\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.284377 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.284153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-config\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.284377 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.284178 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-oauth-serving-cert\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.284377 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.284292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-service-ca\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.284377 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.284341 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-oauth-config\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.284377 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.284368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8f54g\" (UniqueName: \"kubernetes.io/projected/d7fb3c3a-4947-459b-8b38-23bbbeaab354-kube-api-access-8f54g\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.285132 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.285108 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-service-ca\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.285244 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.285223 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-oauth-serving-cert\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.285345 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.285327 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-config\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.285378 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.285327 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-trusted-ca-bundle\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.286776 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.286749 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-serving-cert\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.286862 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.286773 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-oauth-config\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.292808 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.292790 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f54g\" (UniqueName: \"kubernetes.io/projected/d7fb3c3a-4947-459b-8b38-23bbbeaab354-kube-api-access-8f54g\") pod \"console-5564cddf8d-k7hvz\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") " pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.363952 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.363869 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:36.482474 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.482439 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5564cddf8d-k7hvz"]
Apr 17 16:34:36.485500 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:34:36.485472 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7fb3c3a_4947_459b_8b38_23bbbeaab354.slice/crio-60a45bfb481c936ebaaf69734330aef1a8cfc8166373472bd85dcb448bf3a0d9 WatchSource:0}: Error finding container 60a45bfb481c936ebaaf69734330aef1a8cfc8166373472bd85dcb448bf3a0d9: Status 404 returned error can't find the container with id 60a45bfb481c936ebaaf69734330aef1a8cfc8166373472bd85dcb448bf3a0d9
Apr 17 16:34:36.770997 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:36.770961 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5564cddf8d-k7hvz" event={"ID":"d7fb3c3a-4947-459b-8b38-23bbbeaab354","Type":"ContainerStarted","Data":"60a45bfb481c936ebaaf69734330aef1a8cfc8166373472bd85dcb448bf3a0d9"}
Apr 17 16:34:37.226568 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:37.226532 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc318ea-c2d5-4ad6-ad45-b06f054ca89d" path="/var/lib/kubelet/pods/1fc318ea-c2d5-4ad6-ad45-b06f054ca89d/volumes"
Apr 17 16:34:39.781399 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:39.781363 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5564cddf8d-k7hvz" event={"ID":"d7fb3c3a-4947-459b-8b38-23bbbeaab354","Type":"ContainerStarted","Data":"3081f2a2f8b539e6e85302aebefc98c0dadb74b446a07c79122228d802024ee0"}
Apr 17 16:34:39.800946 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:39.800899 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5564cddf8d-k7hvz" podStartSLOduration=1.377456502 podStartE2EDuration="3.800884886s" podCreationTimestamp="2026-04-17 16:34:36 +0000 UTC" firstStartedPulling="2026-04-17 16:34:36.487360262 +0000 UTC m=+209.835989789" lastFinishedPulling="2026-04-17 16:34:38.910788632 +0000 UTC m=+212.259418173" observedRunningTime="2026-04-17 16:34:39.799905325 +0000 UTC m=+213.148534875" watchObservedRunningTime="2026-04-17 16:34:39.800884886 +0000 UTC m=+213.149514435"
Apr 17 16:34:46.364289 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:46.364236 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:46.364289 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:46.364296 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:46.369223 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:46.369203 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:46.802126 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:46.802101 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:34:49.872398 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:49.872366 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:34:49.876384 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:34:49.876356 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7b855889f-bcqdb"
Apr 17 16:35:19.108339 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:35:19.108293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr"
Apr 17 16:35:19.110626 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:35:19.110602 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/000f5549-91dd-4651-b5a0-21769e3982f4-metrics-certs\") pod \"network-metrics-daemon-w6ttr\" (UID: \"000f5549-91dd-4651-b5a0-21769e3982f4\") " pod="openshift-multus/network-metrics-daemon-w6ttr"
Apr 17 16:35:19.125847 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:35:19.125823 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xlgkd\""
Apr 17 16:35:19.133900 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:35:19.133887 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w6ttr"
Apr 17 16:35:19.250209 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:35:19.250186 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-w6ttr"]
Apr 17 16:35:19.252884 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:35:19.252849 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod000f5549_91dd_4651_b5a0_21769e3982f4.slice/crio-38ba6ced8cd0714030d2e58b9c0923643abfe22a867bacfb14c7e9dbe33311f9 WatchSource:0}: Error finding container 38ba6ced8cd0714030d2e58b9c0923643abfe22a867bacfb14c7e9dbe33311f9: Status 404 returned error can't find the container with id 38ba6ced8cd0714030d2e58b9c0923643abfe22a867bacfb14c7e9dbe33311f9
Apr 17 16:35:19.882475 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:35:19.882441 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-w6ttr" event={"ID":"000f5549-91dd-4651-b5a0-21769e3982f4","Type":"ContainerStarted","Data":"38ba6ced8cd0714030d2e58b9c0923643abfe22a867bacfb14c7e9dbe33311f9"}
Apr 17 16:35:20.886521 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:35:20.886486 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-w6ttr" event={"ID":"000f5549-91dd-4651-b5a0-21769e3982f4","Type":"ContainerStarted","Data":"57173308e967dd3ac0895081e51c4645bd88fde83bd519792b2a74abdb8c719c"}
Apr 17 16:35:20.886521 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:35:20.886520 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-w6ttr" event={"ID":"000f5549-91dd-4651-b5a0-21769e3982f4","Type":"ContainerStarted","Data":"02246c35318397747ca142527e90e8e8827c8bba1f7ccab9eaf80a92c9d9b2bb"}
Apr 17 16:35:20.903593 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:35:20.903547 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-w6ttr" podStartSLOduration=252.830582187 podStartE2EDuration="4m13.903531618s" podCreationTimestamp="2026-04-17 16:31:07 +0000 UTC" firstStartedPulling="2026-04-17 16:35:19.254599098 +0000 UTC m=+252.603228628" lastFinishedPulling="2026-04-17 16:35:20.327548519 +0000 UTC m=+253.676178059" observedRunningTime="2026-04-17 16:35:20.901804849 +0000 UTC m=+254.250434397" watchObservedRunningTime="2026-04-17 16:35:20.903531618 +0000 UTC m=+254.252161166"
Apr 17 16:35:44.212968 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:35:44.212887 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5564cddf8d-k7hvz"]
Apr 17 16:36:07.146451 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:07.146427 2572 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 16:36:09.232928 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.232871 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5564cddf8d-k7hvz" podUID="d7fb3c3a-4947-459b-8b38-23bbbeaab354" containerName="console" containerID="cri-o://3081f2a2f8b539e6e85302aebefc98c0dadb74b446a07c79122228d802024ee0" gracePeriod=15
Apr 17 16:36:09.466012 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.465990 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5564cddf8d-k7hvz_d7fb3c3a-4947-459b-8b38-23bbbeaab354/console/0.log"
Apr 17 16:36:09.466114 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.466048 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:36:09.566867 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.566789 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-serving-cert\") pod \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") "
Apr 17 16:36:09.566867 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.566822 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-oauth-serving-cert\") pod \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") "
Apr 17 16:36:09.566867 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.566850 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-oauth-config\") pod \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") "
Apr 17 16:36:09.567116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.566880 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f54g\" (UniqueName: \"kubernetes.io/projected/d7fb3c3a-4947-459b-8b38-23bbbeaab354-kube-api-access-8f54g\") pod \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") "
Apr 17 16:36:09.567116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.566894 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-trusted-ca-bundle\") pod \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") "
Apr 17 16:36:09.567116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.566912 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-service-ca\") pod \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") "
Apr 17 16:36:09.567116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.566942 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-config\") pod \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\" (UID: \"d7fb3c3a-4947-459b-8b38-23bbbeaab354\") "
Apr 17 16:36:09.567422 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.567375 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-service-ca" (OuterVolumeSpecName: "service-ca") pod "d7fb3c3a-4947-459b-8b38-23bbbeaab354" (UID: "d7fb3c3a-4947-459b-8b38-23bbbeaab354"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:36:09.567422 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.567386 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d7fb3c3a-4947-459b-8b38-23bbbeaab354" (UID: "d7fb3c3a-4947-459b-8b38-23bbbeaab354"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:36:09.567422 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.567399 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d7fb3c3a-4947-459b-8b38-23bbbeaab354" (UID: "d7fb3c3a-4947-459b-8b38-23bbbeaab354"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:36:09.567422 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.567409 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-config" (OuterVolumeSpecName: "console-config") pod "d7fb3c3a-4947-459b-8b38-23bbbeaab354" (UID: "d7fb3c3a-4947-459b-8b38-23bbbeaab354"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:36:09.569102 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.569077 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d7fb3c3a-4947-459b-8b38-23bbbeaab354" (UID: "d7fb3c3a-4947-459b-8b38-23bbbeaab354"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:36:09.569179 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.569125 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7fb3c3a-4947-459b-8b38-23bbbeaab354-kube-api-access-8f54g" (OuterVolumeSpecName: "kube-api-access-8f54g") pod "d7fb3c3a-4947-459b-8b38-23bbbeaab354" (UID: "d7fb3c3a-4947-459b-8b38-23bbbeaab354"). InnerVolumeSpecName "kube-api-access-8f54g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:36:09.569223 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.569195 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d7fb3c3a-4947-459b-8b38-23bbbeaab354" (UID: "d7fb3c3a-4947-459b-8b38-23bbbeaab354"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:36:09.667907 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.667875 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8f54g\" (UniqueName: \"kubernetes.io/projected/d7fb3c3a-4947-459b-8b38-23bbbeaab354-kube-api-access-8f54g\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:36:09.667907 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.667901 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-trusted-ca-bundle\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:36:09.667907 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.667911 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-service-ca\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:36:09.668114 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.667921 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:36:09.668114 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.667929 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-serving-cert\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:36:09.668114 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.667938 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7fb3c3a-4947-459b-8b38-23bbbeaab354-oauth-serving-cert\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:36:09.668114 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:09.667947 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7fb3c3a-4947-459b-8b38-23bbbeaab354-console-oauth-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:36:10.015761 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:10.015733 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5564cddf8d-k7hvz_d7fb3c3a-4947-459b-8b38-23bbbeaab354/console/0.log"
Apr 17 16:36:10.015916 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:10.015771 2572 generic.go:358] "Generic (PLEG): container finished" podID="d7fb3c3a-4947-459b-8b38-23bbbeaab354" containerID="3081f2a2f8b539e6e85302aebefc98c0dadb74b446a07c79122228d802024ee0" exitCode=2
Apr 17 16:36:10.015916 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:10.015845 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5564cddf8d-k7hvz"
Apr 17 16:36:10.015916 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:10.015853 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5564cddf8d-k7hvz" event={"ID":"d7fb3c3a-4947-459b-8b38-23bbbeaab354","Type":"ContainerDied","Data":"3081f2a2f8b539e6e85302aebefc98c0dadb74b446a07c79122228d802024ee0"}
Apr 17 16:36:10.015916 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:10.015893 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5564cddf8d-k7hvz" event={"ID":"d7fb3c3a-4947-459b-8b38-23bbbeaab354","Type":"ContainerDied","Data":"60a45bfb481c936ebaaf69734330aef1a8cfc8166373472bd85dcb448bf3a0d9"}
Apr 17 16:36:10.015916 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:10.015912 2572 scope.go:117] "RemoveContainer" containerID="3081f2a2f8b539e6e85302aebefc98c0dadb74b446a07c79122228d802024ee0"
Apr 17 16:36:10.027154 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:10.027132 2572 scope.go:117] "RemoveContainer" containerID="3081f2a2f8b539e6e85302aebefc98c0dadb74b446a07c79122228d802024ee0"
Apr 17 16:36:10.027388 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:36:10.027367 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3081f2a2f8b539e6e85302aebefc98c0dadb74b446a07c79122228d802024ee0\": container with ID starting with 3081f2a2f8b539e6e85302aebefc98c0dadb74b446a07c79122228d802024ee0 not found: ID does not exist" containerID="3081f2a2f8b539e6e85302aebefc98c0dadb74b446a07c79122228d802024ee0"
Apr 17 16:36:10.027445 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:10.027398 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3081f2a2f8b539e6e85302aebefc98c0dadb74b446a07c79122228d802024ee0"} err="failed to get container status \"3081f2a2f8b539e6e85302aebefc98c0dadb74b446a07c79122228d802024ee0\": rpc error: code = NotFound desc = could not find container \"3081f2a2f8b539e6e85302aebefc98c0dadb74b446a07c79122228d802024ee0\": container with ID starting with 3081f2a2f8b539e6e85302aebefc98c0dadb74b446a07c79122228d802024ee0 not found: ID does not exist"
Apr 17 16:36:10.039183 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:10.039156 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5564cddf8d-k7hvz"]
Apr 17 16:36:10.043288 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:10.043261 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5564cddf8d-k7hvz"]
Apr 17 16:36:11.225903 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:11.225871 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7fb3c3a-4947-459b-8b38-23bbbeaab354" path="/var/lib/kubelet/pods/d7fb3c3a-4947-459b-8b38-23bbbeaab354/volumes"
Apr 17 16:36:54.345226 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.345194 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-78b47d8cf5-hflcj"]
Apr 17 16:36:54.345705 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.345435 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7fb3c3a-4947-459b-8b38-23bbbeaab354" containerName="console"
Apr 17 16:36:54.345705 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.345449 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7fb3c3a-4947-459b-8b38-23bbbeaab354" containerName="console"
Apr 17 16:36:54.345705 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.345499 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7fb3c3a-4947-459b-8b38-23bbbeaab354" containerName="console"
Apr 17 16:36:54.348243 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.348222 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.352639 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.352619 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 16:36:54.352932 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.352911 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 16:36:54.353032 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.352944 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 16:36:54.353032 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.353012 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-fl6zn\""
Apr 17 16:36:54.353149 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.353075 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 16:36:54.354005 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.353978 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 16:36:54.354005 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.353993 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 16:36:54.354164 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.354013 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 16:36:54.358224 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.358196 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 16:36:54.358759 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.358741 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78b47d8cf5-hflcj"]
Apr 17 16:36:54.479697 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.479662 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-oauth-config\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.479697 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.479694 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-oauth-serving-cert\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.479876 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.479711 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-trusted-ca-bundle\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.479876 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.479734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsj6h\" (UniqueName: \"kubernetes.io/projected/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-kube-api-access-nsj6h\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.479876 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.479760 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-config\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.479876 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.479794 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-service-ca\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.479876 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.479824 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-serving-cert\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.580722 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.580689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-oauth-config\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.580874 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.580724 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-oauth-serving-cert\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.580874 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.580750 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-trusted-ca-bundle\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.580874 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.580775 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsj6h\" (UniqueName: \"kubernetes.io/projected/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-kube-api-access-nsj6h\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.580874 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.580809 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-config\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.581057 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.580959 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-service-ca\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.581057 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.581009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-serving-cert\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.581529 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.581502 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-oauth-serving-cert\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.581644 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.581505 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-config\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.581644 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.581581 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-service-ca\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.581958 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.581939 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-trusted-ca-bundle\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.583080 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.583061 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-oauth-config\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.583293 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.583277 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-serving-cert\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.588798 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.588777 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsj6h\" (UniqueName: \"kubernetes.io/projected/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-kube-api-access-nsj6h\") pod \"console-78b47d8cf5-hflcj\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.659477 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.659415 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:36:54.780872 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.780846 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78b47d8cf5-hflcj"]
Apr 17 16:36:54.783399 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:36:54.783371 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode942f2e9_b0d4_4ca4_a20b_3c4156c33f2a.slice/crio-5d0934fce9010c2a1ff2288e8462942a7013afbb4e203c1d160527f4fa6202f5 WatchSource:0}: Error finding container 5d0934fce9010c2a1ff2288e8462942a7013afbb4e203c1d160527f4fa6202f5: Status 404 returned error can't find the container with id 5d0934fce9010c2a1ff2288e8462942a7013afbb4e203c1d160527f4fa6202f5
Apr 17 16:36:54.785255 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:54.785239 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:36:55.128801 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:55.128767 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78b47d8cf5-hflcj" event={"ID":"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a","Type":"ContainerStarted","Data":"47acb52a051725989d434796abe2549b362a287860fae1142258eb3daad712e8"}
Apr 17 16:36:55.128801 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:55.128805 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78b47d8cf5-hflcj" event={"ID":"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a","Type":"ContainerStarted","Data":"5d0934fce9010c2a1ff2288e8462942a7013afbb4e203c1d160527f4fa6202f5"}
Apr 17 16:36:55.146137 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:36:55.146087 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78b47d8cf5-hflcj" podStartSLOduration=1.146073559 podStartE2EDuration="1.146073559s" podCreationTimestamp="2026-04-17 16:36:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:36:55.144711218 +0000 UTC m=+348.493340814" watchObservedRunningTime="2026-04-17 16:36:55.146073559 +0000 UTC m=+348.494703108"
Apr 17 16:37:04.659968 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:04.659871 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:37:04.659968 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:04.659938 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:37:04.664722 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:04.664699 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:37:05.159847 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:05.159819 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78b47d8cf5-hflcj"
Apr 17 16:37:52.777432 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.777393 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t"]
Apr 17 16:37:52.780542 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.780525 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t"
Apr 17 16:37:52.785966 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.785917 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 16:37:52.786116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.785917 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 16:37:52.786116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.786021 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 16:37:52.786876 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.786854 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 17 16:37:52.800409 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.800383 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t"]
Apr 17 16:37:52.800409 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.800382 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ce8b9359-50be-4712-9665-fc5c377f90f3-klusterlet-config\") pod \"klusterlet-addon-workmgr-7dd98d886f-z5p5t\" (UID: \"ce8b9359-50be-4712-9665-fc5c377f90f3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t"
Apr 17 16:37:52.800552 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.800456 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce8b9359-50be-4712-9665-fc5c377f90f3-tmp\") pod \"klusterlet-addon-workmgr-7dd98d886f-z5p5t\" (UID: \"ce8b9359-50be-4712-9665-fc5c377f90f3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t"
Apr 17 16:37:52.800552 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.800493 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpb5k\" (UniqueName: \"kubernetes.io/projected/ce8b9359-50be-4712-9665-fc5c377f90f3-kube-api-access-wpb5k\") pod \"klusterlet-addon-workmgr-7dd98d886f-z5p5t\" (UID:
\"ce8b9359-50be-4712-9665-fc5c377f90f3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t" Apr 17 16:37:52.839258 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.839231 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t"] Apr 17 16:37:52.843289 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.843261 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:52.846517 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.846486 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 16:37:52.847628 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.847614 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 16:37:52.847727 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.847660 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 16:37:52.848578 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.848559 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 16:37:52.863171 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.863149 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t"] Apr 17 16:37:52.900844 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.900820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce8b9359-50be-4712-9665-fc5c377f90f3-tmp\") pod \"klusterlet-addon-workmgr-7dd98d886f-z5p5t\" (UID: \"ce8b9359-50be-4712-9665-fc5c377f90f3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t" Apr 17 16:37:52.900990 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.900855 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/537de27d-cb5a-4ec4-a989-1d5da633ea18-ca\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:52.900990 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.900887 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpb5k\" (UniqueName: \"kubernetes.io/projected/ce8b9359-50be-4712-9665-fc5c377f90f3-kube-api-access-wpb5k\") pod \"klusterlet-addon-workmgr-7dd98d886f-z5p5t\" (UID: \"ce8b9359-50be-4712-9665-fc5c377f90f3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t" Apr 17 16:37:52.900990 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.900948 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/537de27d-cb5a-4ec4-a989-1d5da633ea18-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:52.901163 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.901092 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/537de27d-cb5a-4ec4-a989-1d5da633ea18-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:52.901163 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.901135 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcs7c\" (UniqueName: \"kubernetes.io/projected/537de27d-cb5a-4ec4-a989-1d5da633ea18-kube-api-access-pcs7c\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:52.901251 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.901165 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/537de27d-cb5a-4ec4-a989-1d5da633ea18-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:52.901251 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.901218 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce8b9359-50be-4712-9665-fc5c377f90f3-tmp\") pod \"klusterlet-addon-workmgr-7dd98d886f-z5p5t\" (UID: \"ce8b9359-50be-4712-9665-fc5c377f90f3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t" Apr 17 16:37:52.901251 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.901235 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/537de27d-cb5a-4ec4-a989-1d5da633ea18-hub\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:52.901353 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.901271 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ce8b9359-50be-4712-9665-fc5c377f90f3-klusterlet-config\") pod \"klusterlet-addon-workmgr-7dd98d886f-z5p5t\" (UID: \"ce8b9359-50be-4712-9665-fc5c377f90f3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t" Apr 17 16:37:52.903468 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.903444 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ce8b9359-50be-4712-9665-fc5c377f90f3-klusterlet-config\") pod \"klusterlet-addon-workmgr-7dd98d886f-z5p5t\" (UID: \"ce8b9359-50be-4712-9665-fc5c377f90f3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t" Apr 17 16:37:52.913860 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:52.913842 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpb5k\" (UniqueName: 
\"kubernetes.io/projected/ce8b9359-50be-4712-9665-fc5c377f90f3-kube-api-access-wpb5k\") pod \"klusterlet-addon-workmgr-7dd98d886f-z5p5t\" (UID: \"ce8b9359-50be-4712-9665-fc5c377f90f3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t" Apr 17 16:37:53.002351 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.002310 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/537de27d-cb5a-4ec4-a989-1d5da633ea18-ca\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:53.002511 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.002361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/537de27d-cb5a-4ec4-a989-1d5da633ea18-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:53.002511 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.002482 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/537de27d-cb5a-4ec4-a989-1d5da633ea18-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:53.002624 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.002530 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcs7c\" (UniqueName: \"kubernetes.io/projected/537de27d-cb5a-4ec4-a989-1d5da633ea18-kube-api-access-pcs7c\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:53.002624 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.002568 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/537de27d-cb5a-4ec4-a989-1d5da633ea18-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:53.002763 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.002624 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/537de27d-cb5a-4ec4-a989-1d5da633ea18-hub\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:53.003073 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.003046 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/537de27d-cb5a-4ec4-a989-1d5da633ea18-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:53.004859 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.004838 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/537de27d-cb5a-4ec4-a989-1d5da633ea18-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:53.004980 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.004964 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/537de27d-cb5a-4ec4-a989-1d5da633ea18-hub\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:53.005075 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.005054 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/537de27d-cb5a-4ec4-a989-1d5da633ea18-ca\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:53.005149 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.005125 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/537de27d-cb5a-4ec4-a989-1d5da633ea18-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:53.015927 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.015905 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcs7c\" (UniqueName: \"kubernetes.io/projected/537de27d-cb5a-4ec4-a989-1d5da633ea18-kube-api-access-pcs7c\") pod \"cluster-proxy-proxy-agent-f77c89c75-f6h6t\" (UID: \"537de27d-cb5a-4ec4-a989-1d5da633ea18\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:53.089053 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.088986 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t" Apr 17 16:37:53.168060 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.167989 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" Apr 17 16:37:53.202225 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.202190 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t"] Apr 17 16:37:53.205048 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:37:53.205020 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce8b9359_50be_4712_9665_fc5c377f90f3.slice/crio-492e085a1facdce5641bf8d6e64c4f0671ef69def5367b12752cfc14628e69eb WatchSource:0}: Error finding container 492e085a1facdce5641bf8d6e64c4f0671ef69def5367b12752cfc14628e69eb: Status 404 returned error can't find the container with id 492e085a1facdce5641bf8d6e64c4f0671ef69def5367b12752cfc14628e69eb Apr 17 16:37:53.273455 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.273421 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t" event={"ID":"ce8b9359-50be-4712-9665-fc5c377f90f3","Type":"ContainerStarted","Data":"492e085a1facdce5641bf8d6e64c4f0671ef69def5367b12752cfc14628e69eb"} Apr 17 16:37:53.293917 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:53.293896 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t"] Apr 17 16:37:53.296109 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:37:53.296087 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod537de27d_cb5a_4ec4_a989_1d5da633ea18.slice/crio-b7528777012fae052cc28f8fd8577697a7c8ba015f4ecefb1da8219ded01070b WatchSource:0}: Error finding container b7528777012fae052cc28f8fd8577697a7c8ba015f4ecefb1da8219ded01070b: Status 404 returned error can't find the container with id b7528777012fae052cc28f8fd8577697a7c8ba015f4ecefb1da8219ded01070b Apr 17 16:37:54.279608 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:54.279567 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" event={"ID":"537de27d-cb5a-4ec4-a989-1d5da633ea18","Type":"ContainerStarted","Data":"b7528777012fae052cc28f8fd8577697a7c8ba015f4ecefb1da8219ded01070b"} Apr 17 16:37:58.295931 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:58.295878 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t" event={"ID":"ce8b9359-50be-4712-9665-fc5c377f90f3","Type":"ContainerStarted","Data":"6e01d92a3301e652f9a0be665b9f9cbbc7185c69e04ca339a0a20b5ed73dab16"} Apr 17 16:37:58.296366 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:58.296173 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t" Apr 17 16:37:58.297996 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:58.297976 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t" Apr 17 16:37:58.298244 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:58.298220 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" 
event={"ID":"537de27d-cb5a-4ec4-a989-1d5da633ea18","Type":"ContainerStarted","Data":"7f8d69cd408c821b3e73d247c0f7a2c4e4730ba27c770f2e55fbc42f9b7e9adb"} Apr 17 16:37:58.317055 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:37:58.316991 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7dd98d886f-z5p5t" podStartSLOduration=1.987778131 podStartE2EDuration="6.316973689s" podCreationTimestamp="2026-04-17 16:37:52 +0000 UTC" firstStartedPulling="2026-04-17 16:37:53.206893236 +0000 UTC m=+406.555522764" lastFinishedPulling="2026-04-17 16:37:57.536088795 +0000 UTC m=+410.884718322" observedRunningTime="2026-04-17 16:37:58.314591322 +0000 UTC m=+411.663220872" watchObservedRunningTime="2026-04-17 16:37:58.316973689 +0000 UTC m=+411.665603240" Apr 17 16:38:00.305735 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:38:00.305698 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" event={"ID":"537de27d-cb5a-4ec4-a989-1d5da633ea18","Type":"ContainerStarted","Data":"9ac45a3620994ac9e330c9e3168e66fd3687200128bd0155f21ad87a169de92a"} Apr 17 16:38:00.305735 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:38:00.305740 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" event={"ID":"537de27d-cb5a-4ec4-a989-1d5da633ea18","Type":"ContainerStarted","Data":"188d1496fabee84660cb62b39f298e0f0c462d587b1e4f3d4b1a4a10c0cd0d3b"} Apr 17 16:38:00.340264 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:38:00.340213 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f77c89c75-f6h6t" podStartSLOduration=1.948737558 podStartE2EDuration="8.340198481s" podCreationTimestamp="2026-04-17 16:37:52 +0000 UTC" firstStartedPulling="2026-04-17 16:37:53.297707204 +0000 UTC m=+406.646336735" lastFinishedPulling="2026-04-17 16:37:59.689168131 +0000 UTC m=+413.037797658" observedRunningTime="2026-04-17 16:38:00.338430332 +0000 UTC m=+413.687059881" watchObservedRunningTime="2026-04-17 16:38:00.340198481 +0000 UTC m=+413.688828041" Apr 17 16:40:50.141953 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:40:50.141911 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78b47d8cf5-hflcj"] Apr 17 16:41:10.120130 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.120101 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-pczd5"] Apr 17 16:41:10.123311 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.123294 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-pczd5" Apr 17 16:41:10.131809 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.131783 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 16:41:10.132516 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.132497 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-2hwvn\"" Apr 17 16:41:10.133716 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.133694 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 16:41:10.139840 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.139822 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 17 16:41:10.153806 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.153782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8djvw\" (UniqueName: \"kubernetes.io/projected/d1ed091d-6054-47d0-8770-ad78f8a1729e-kube-api-access-8djvw\") pod \"model-serving-api-86f7b4b499-pczd5\" (UID: \"d1ed091d-6054-47d0-8770-ad78f8a1729e\") " pod="kserve/model-serving-api-86f7b4b499-pczd5" Apr 17 16:41:10.154003 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.153985 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ed091d-6054-47d0-8770-ad78f8a1729e-tls-certs\") pod \"model-serving-api-86f7b4b499-pczd5\" (UID: \"d1ed091d-6054-47d0-8770-ad78f8a1729e\") " pod="kserve/model-serving-api-86f7b4b499-pczd5" Apr 17 16:41:10.155356 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.155329 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-pczd5"] Apr 17 16:41:10.254951 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.254924 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8djvw\" (UniqueName: \"kubernetes.io/projected/d1ed091d-6054-47d0-8770-ad78f8a1729e-kube-api-access-8djvw\") pod \"model-serving-api-86f7b4b499-pczd5\" (UID: \"d1ed091d-6054-47d0-8770-ad78f8a1729e\") " pod="kserve/model-serving-api-86f7b4b499-pczd5" Apr 17 16:41:10.255090 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.254959 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ed091d-6054-47d0-8770-ad78f8a1729e-tls-certs\") pod \"model-serving-api-86f7b4b499-pczd5\" (UID: \"d1ed091d-6054-47d0-8770-ad78f8a1729e\") " pod="kserve/model-serving-api-86f7b4b499-pczd5" Apr 17 16:41:10.257582 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.257554 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ed091d-6054-47d0-8770-ad78f8a1729e-tls-certs\") pod \"model-serving-api-86f7b4b499-pczd5\" (UID: \"d1ed091d-6054-47d0-8770-ad78f8a1729e\") " pod="kserve/model-serving-api-86f7b4b499-pczd5" Apr 17 16:41:10.265778 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.265755 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8djvw\" (UniqueName: \"kubernetes.io/projected/d1ed091d-6054-47d0-8770-ad78f8a1729e-kube-api-access-8djvw\") pod \"model-serving-api-86f7b4b499-pczd5\" (UID: \"d1ed091d-6054-47d0-8770-ad78f8a1729e\") " 
pod="kserve/model-serving-api-86f7b4b499-pczd5" Apr 17 16:41:10.435734 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.435638 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-pczd5" Apr 17 16:41:10.558003 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.557848 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-pczd5"] Apr 17 16:41:10.560488 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:41:10.560461 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1ed091d_6054_47d0_8770_ad78f8a1729e.slice/crio-37580c004007d05fe93cd8f21ef164ed28e02d55cddce54ad1682f1f298cbd1a WatchSource:0}: Error finding container 37580c004007d05fe93cd8f21ef164ed28e02d55cddce54ad1682f1f298cbd1a: Status 404 returned error can't find the container with id 37580c004007d05fe93cd8f21ef164ed28e02d55cddce54ad1682f1f298cbd1a Apr 17 16:41:10.814483 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:10.814446 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-pczd5" event={"ID":"d1ed091d-6054-47d0-8770-ad78f8a1729e","Type":"ContainerStarted","Data":"37580c004007d05fe93cd8f21ef164ed28e02d55cddce54ad1682f1f298cbd1a"} Apr 17 16:41:13.825102 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:13.825058 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-pczd5" event={"ID":"d1ed091d-6054-47d0-8770-ad78f8a1729e","Type":"ContainerStarted","Data":"698ce975c39f92b96d9596844f76190c31309110b4bfff92cc1e6b474e9aaed6"} Apr 17 16:41:13.825604 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:13.825319 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-pczd5" Apr 17 16:41:13.842897 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:13.842847 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-pczd5" podStartSLOduration=1.6332949970000001 podStartE2EDuration="3.842835048s" podCreationTimestamp="2026-04-17 16:41:10 +0000 UTC" firstStartedPulling="2026-04-17 16:41:10.562180235 +0000 UTC m=+603.910809762" lastFinishedPulling="2026-04-17 16:41:12.771720286 +0000 UTC m=+606.120349813" observedRunningTime="2026-04-17 16:41:13.841367507 +0000 UTC m=+607.189997057" watchObservedRunningTime="2026-04-17 16:41:13.842835048 +0000 UTC m=+607.191464596" Apr 17 16:41:15.161013 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.160966 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-78b47d8cf5-hflcj" podUID="e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a" containerName="console" containerID="cri-o://47acb52a051725989d434796abe2549b362a287860fae1142258eb3daad712e8" gracePeriod=15 Apr 17 16:41:15.397523 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.397503 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78b47d8cf5-hflcj_e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a/console/0.log" Apr 17 16:41:15.397628 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.397563 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78b47d8cf5-hflcj" Apr 17 16:41:15.497491 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.497454 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-config\") pod \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " Apr 17 16:41:15.497707 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.497514 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-oauth-config\") pod \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " Apr 17 16:41:15.497707 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.497534 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-trusted-ca-bundle\") pod \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " Apr 17 16:41:15.497707 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.497556 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-serving-cert\") pod \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " Apr 17 16:41:15.497707 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.497589 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-service-ca\") pod \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " Apr 17 16:41:15.497707 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.497604 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-oauth-serving-cert\") pod \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " Apr 17 16:41:15.497707 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.497620 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsj6h\" (UniqueName: \"kubernetes.io/projected/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-kube-api-access-nsj6h\") pod \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\" (UID: \"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a\") " Apr 17 16:41:15.498067 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.498042 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a" (UID: "e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:41:15.498129 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.498102 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a" (UID: "e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:41:15.498165 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.498116 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-service-ca" (OuterVolumeSpecName: "service-ca") pod "e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a" (UID: "e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:41:15.498287 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.498264 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-config" (OuterVolumeSpecName: "console-config") pod "e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a" (UID: "e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:41:15.499724 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.499689 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a" (UID: "e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:41:15.499811 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.499748 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a" (UID: "e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:41:15.499863 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.499823 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-kube-api-access-nsj6h" (OuterVolumeSpecName: "kube-api-access-nsj6h") pod "e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a" (UID: "e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a"). InnerVolumeSpecName "kube-api-access-nsj6h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:41:15.598870 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.598833 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-oauth-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:41:15.598870 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.598861 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-trusted-ca-bundle\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:41:15.598870 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.598871 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-serving-cert\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:41:15.599083 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.598880 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-service-ca\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:41:15.599083 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.598890 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-oauth-serving-cert\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:41:15.599083 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.598899 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nsj6h\" (UniqueName: \"kubernetes.io/projected/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-kube-api-access-nsj6h\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:41:15.599083 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.598908 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a-console-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:41:15.831820 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.831735 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78b47d8cf5-hflcj_e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a/console/0.log" Apr 17 16:41:15.831820 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.831776 2572 generic.go:358] "Generic (PLEG): container finished" podID="e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a" containerID="47acb52a051725989d434796abe2549b362a287860fae1142258eb3daad712e8" exitCode=2 Apr 17 16:41:15.831820 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.831807 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78b47d8cf5-hflcj" event={"ID":"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a","Type":"ContainerDied","Data":"47acb52a051725989d434796abe2549b362a287860fae1142258eb3daad712e8"} Apr 17 16:41:15.832090 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.831845 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78b47d8cf5-hflcj" event={"ID":"e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a","Type":"ContainerDied","Data":"5d0934fce9010c2a1ff2288e8462942a7013afbb4e203c1d160527f4fa6202f5"} Apr 17 16:41:15.832090 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.831865 2572 
scope.go:117] "RemoveContainer" containerID="47acb52a051725989d434796abe2549b362a287860fae1142258eb3daad712e8" Apr 17 16:41:15.832090 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.831866 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78b47d8cf5-hflcj" Apr 17 16:41:15.840432 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.840416 2572 scope.go:117] "RemoveContainer" containerID="47acb52a051725989d434796abe2549b362a287860fae1142258eb3daad712e8" Apr 17 16:41:15.840675 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:41:15.840637 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47acb52a051725989d434796abe2549b362a287860fae1142258eb3daad712e8\": container with ID starting with 47acb52a051725989d434796abe2549b362a287860fae1142258eb3daad712e8 not found: ID does not exist" containerID="47acb52a051725989d434796abe2549b362a287860fae1142258eb3daad712e8" Apr 17 16:41:15.840746 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.840675 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47acb52a051725989d434796abe2549b362a287860fae1142258eb3daad712e8"} err="failed to get container status \"47acb52a051725989d434796abe2549b362a287860fae1142258eb3daad712e8\": rpc error: code = NotFound desc = could not find container \"47acb52a051725989d434796abe2549b362a287860fae1142258eb3daad712e8\": container with ID starting with 47acb52a051725989d434796abe2549b362a287860fae1142258eb3daad712e8 not found: ID does not exist" Apr 17 16:41:15.854260 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.854240 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78b47d8cf5-hflcj"] Apr 17 16:41:15.857782 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:15.857758 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-78b47d8cf5-hflcj"] Apr 17 16:41:17.225427 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:17.225393 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a" path="/var/lib/kubelet/pods/e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a/volumes" Apr 17 16:41:24.831998 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:24.831972 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-pczd5" Apr 17 16:41:44.199572 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.199486 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz"] Apr 17 16:41:44.200109 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.199873 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a" containerName="console" Apr 17 16:41:44.200109 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.199888 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a" containerName="console" Apr 17 16:41:44.200109 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.199951 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e942f2e9-b0d4-4ca4-a20b-3c4156c33f2a" containerName="console" Apr 17 16:41:44.206450 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.206432 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:41:44.209563 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.209378 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-9e0ff-predictor-serving-cert\"" Apr 17 16:41:44.209563 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.209435 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-9e0ff-kube-rbac-proxy-sar-config\"" Apr 17 16:41:44.209563 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.209480 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 16:41:44.209563 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.209552 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 16:41:44.210167 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.210143 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tvtvv\"" Apr 17 16:41:44.210636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.210615 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz"] Apr 17 16:41:44.309678 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.309631 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-9e0ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/52b1d4cc-638e-41b1-9fdb-63687ef13969-success-200-isvc-9e0ff-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9e0ff-predictor-598855d998-zbbsz\" (UID: \"52b1d4cc-638e-41b1-9fdb-63687ef13969\") " pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:41:44.309836 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.309683 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52b1d4cc-638e-41b1-9fdb-63687ef13969-proxy-tls\") pod \"success-200-isvc-9e0ff-predictor-598855d998-zbbsz\" (UID: \"52b1d4cc-638e-41b1-9fdb-63687ef13969\") " pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:41:44.309836 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.309761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch42f\" (UniqueName: \"kubernetes.io/projected/52b1d4cc-638e-41b1-9fdb-63687ef13969-kube-api-access-ch42f\") pod \"success-200-isvc-9e0ff-predictor-598855d998-zbbsz\" (UID: \"52b1d4cc-638e-41b1-9fdb-63687ef13969\") " pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:41:44.410853 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.410818 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ch42f\" (UniqueName: \"kubernetes.io/projected/52b1d4cc-638e-41b1-9fdb-63687ef13969-kube-api-access-ch42f\") pod \"success-200-isvc-9e0ff-predictor-598855d998-zbbsz\" (UID: \"52b1d4cc-638e-41b1-9fdb-63687ef13969\") " pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:41:44.411042 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.410897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"success-200-isvc-9e0ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/52b1d4cc-638e-41b1-9fdb-63687ef13969-success-200-isvc-9e0ff-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9e0ff-predictor-598855d998-zbbsz\" (UID: \"52b1d4cc-638e-41b1-9fdb-63687ef13969\") " pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:41:44.411042 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.410939 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52b1d4cc-638e-41b1-9fdb-63687ef13969-proxy-tls\") pod \"success-200-isvc-9e0ff-predictor-598855d998-zbbsz\" (UID: \"52b1d4cc-638e-41b1-9fdb-63687ef13969\") " pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:41:44.412149 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.411689 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-9e0ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/52b1d4cc-638e-41b1-9fdb-63687ef13969-success-200-isvc-9e0ff-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9e0ff-predictor-598855d998-zbbsz\" (UID: \"52b1d4cc-638e-41b1-9fdb-63687ef13969\") " pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:41:44.413987 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.413953 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52b1d4cc-638e-41b1-9fdb-63687ef13969-proxy-tls\") pod \"success-200-isvc-9e0ff-predictor-598855d998-zbbsz\" (UID: \"52b1d4cc-638e-41b1-9fdb-63687ef13969\") " pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:41:44.431035 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.431010 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch42f\" (UniqueName: \"kubernetes.io/projected/52b1d4cc-638e-41b1-9fdb-63687ef13969-kube-api-access-ch42f\") pod \"success-200-isvc-9e0ff-predictor-598855d998-zbbsz\" (UID: \"52b1d4cc-638e-41b1-9fdb-63687ef13969\") " pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:41:44.482685 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.482600 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv"] Apr 17 16:41:44.486606 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.486590 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:44.488881 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.488856 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-predictor-serving-cert\"" Apr 17 16:41:44.488999 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.488920 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-kube-rbac-proxy-sar-config\"" Apr 17 16:41:44.496163 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.496139 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv"] Apr 17 16:41:44.518303 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.518278 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:41:44.612481 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.612411 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2806963c-78df-401c-b5e5-ab8166c86d7f-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qgmdv\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:44.612637 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.612500 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nxrg\" (UniqueName: \"kubernetes.io/projected/2806963c-78df-401c-b5e5-ab8166c86d7f-kube-api-access-8nxrg\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qgmdv\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:44.612637 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.612539 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2806963c-78df-401c-b5e5-ab8166c86d7f-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qgmdv\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:44.612637 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.612568 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2806963c-78df-401c-b5e5-ab8166c86d7f-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qgmdv\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:44.643613 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.643580 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz"] Apr 17 16:41:44.647786 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:41:44.647747 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52b1d4cc_638e_41b1_9fdb_63687ef13969.slice/crio-9032333b71eaa3e4e1715d3448b92df152867ff5b7235aef609585d570d107e3 WatchSource:0}: Error finding container 9032333b71eaa3e4e1715d3448b92df152867ff5b7235aef609585d570d107e3: Status 404 returned error can't find the container with id 9032333b71eaa3e4e1715d3448b92df152867ff5b7235aef609585d570d107e3 Apr 17 16:41:44.661618 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.661591 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l"] Apr 17 16:41:44.666991 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.666972 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:41:44.669610 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.669588 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-9e0ff-predictor-serving-cert\"" Apr 17 16:41:44.669766 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.669588 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-9e0ff-kube-rbac-proxy-sar-config\"" Apr 17 16:41:44.693070 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.693047 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l"] Apr 17 16:41:44.713743 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.713712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nxrg\" (UniqueName: \"kubernetes.io/projected/2806963c-78df-401c-b5e5-ab8166c86d7f-kube-api-access-8nxrg\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qgmdv\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:44.713872 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.713755 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mj82\" (UniqueName: \"kubernetes.io/projected/cca92b8b-ec77-44db-88bc-3cbada2e6604-kube-api-access-2mj82\") pod \"error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l\" (UID: \"cca92b8b-ec77-44db-88bc-3cbada2e6604\") " pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:41:44.713872 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.713788 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2806963c-78df-401c-b5e5-ab8166c86d7f-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qgmdv\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:44.713872 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.713818 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2806963c-78df-401c-b5e5-ab8166c86d7f-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qgmdv\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:44.714036 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.713924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-9e0ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cca92b8b-ec77-44db-88bc-3cbada2e6604-error-404-isvc-9e0ff-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l\" (UID: \"cca92b8b-ec77-44db-88bc-3cbada2e6604\") " pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:41:44.714036 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.713962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2806963c-78df-401c-b5e5-ab8166c86d7f-proxy-tls\") pod 
\"isvc-xgboost-graph-predictor-669d8d6456-qgmdv\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:44.714036 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.713987 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cca92b8b-ec77-44db-88bc-3cbada2e6604-proxy-tls\") pod \"error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l\" (UID: \"cca92b8b-ec77-44db-88bc-3cbada2e6604\") " pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:41:44.717725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.714469 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2806963c-78df-401c-b5e5-ab8166c86d7f-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qgmdv\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:44.717725 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:41:44.714526 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-predictor-serving-cert: secret "isvc-xgboost-graph-predictor-serving-cert" not found Apr 17 16:41:44.717725 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:41:44.714602 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2806963c-78df-401c-b5e5-ab8166c86d7f-proxy-tls podName:2806963c-78df-401c-b5e5-ab8166c86d7f nodeName:}" failed. No retries permitted until 2026-04-17 16:41:45.214583298 +0000 UTC m=+638.563212827 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2806963c-78df-401c-b5e5-ab8166c86d7f-proxy-tls") pod "isvc-xgboost-graph-predictor-669d8d6456-qgmdv" (UID: "2806963c-78df-401c-b5e5-ab8166c86d7f") : secret "isvc-xgboost-graph-predictor-serving-cert" not found Apr 17 16:41:44.717725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.714959 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2806963c-78df-401c-b5e5-ab8166c86d7f-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qgmdv\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:44.724986 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.724966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nxrg\" (UniqueName: \"kubernetes.io/projected/2806963c-78df-401c-b5e5-ab8166c86d7f-kube-api-access-8nxrg\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qgmdv\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:44.814868 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.814793 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-9e0ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cca92b8b-ec77-44db-88bc-3cbada2e6604-error-404-isvc-9e0ff-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l\" (UID: \"cca92b8b-ec77-44db-88bc-3cbada2e6604\") " pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:41:44.814868 
ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.814838 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cca92b8b-ec77-44db-88bc-3cbada2e6604-proxy-tls\") pod \"error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l\" (UID: \"cca92b8b-ec77-44db-88bc-3cbada2e6604\") " pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:41:44.815068 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.814874 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mj82\" (UniqueName: \"kubernetes.io/projected/cca92b8b-ec77-44db-88bc-3cbada2e6604-kube-api-access-2mj82\") pod \"error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l\" (UID: \"cca92b8b-ec77-44db-88bc-3cbada2e6604\") " pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:41:44.815423 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.815400 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-9e0ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cca92b8b-ec77-44db-88bc-3cbada2e6604-error-404-isvc-9e0ff-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l\" (UID: \"cca92b8b-ec77-44db-88bc-3cbada2e6604\") " pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:41:44.817203 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.817181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cca92b8b-ec77-44db-88bc-3cbada2e6604-proxy-tls\") pod \"error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l\" (UID: \"cca92b8b-ec77-44db-88bc-3cbada2e6604\") " pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:41:44.822336 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.822313 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mj82\" (UniqueName: \"kubernetes.io/projected/cca92b8b-ec77-44db-88bc-3cbada2e6604-kube-api-access-2mj82\") pod \"error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l\" (UID: \"cca92b8b-ec77-44db-88bc-3cbada2e6604\") " pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:41:44.909740 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.909700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" event={"ID":"52b1d4cc-638e-41b1-9fdb-63687ef13969","Type":"ContainerStarted","Data":"9032333b71eaa3e4e1715d3448b92df152867ff5b7235aef609585d570d107e3"} Apr 17 16:41:44.979846 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:44.979816 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:41:45.095028 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:45.094953 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l"] Apr 17 16:41:45.098838 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:41:45.098804 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcca92b8b_ec77_44db_88bc_3cbada2e6604.slice/crio-bed10f845ae2562ed615debb306588d19e67ac5fde0bd6e5803f5e73f24ad9ed WatchSource:0}: Error finding container bed10f845ae2562ed615debb306588d19e67ac5fde0bd6e5803f5e73f24ad9ed: Status 404 returned error can't find the container with id bed10f845ae2562ed615debb306588d19e67ac5fde0bd6e5803f5e73f24ad9ed Apr 17 16:41:45.218807 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:45.218776 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2806963c-78df-401c-b5e5-ab8166c86d7f-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qgmdv\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:45.221186 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:45.221161 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2806963c-78df-401c-b5e5-ab8166c86d7f-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qgmdv\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:45.400586 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:45.400504 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:41:45.583784 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:45.583481 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv"] Apr 17 16:41:45.584880 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:41:45.584843 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2806963c_78df_401c_b5e5_ab8166c86d7f.slice/crio-4519e2b6201e92d33e1fa542003e076eabb121ea30ea85c6df5a51c504f9ec47 WatchSource:0}: Error finding container 4519e2b6201e92d33e1fa542003e076eabb121ea30ea85c6df5a51c504f9ec47: Status 404 returned error can't find the container with id 4519e2b6201e92d33e1fa542003e076eabb121ea30ea85c6df5a51c504f9ec47 Apr 17 16:41:45.922731 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:45.922677 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" event={"ID":"2806963c-78df-401c-b5e5-ab8166c86d7f","Type":"ContainerStarted","Data":"4519e2b6201e92d33e1fa542003e076eabb121ea30ea85c6df5a51c504f9ec47"} Apr 17 16:41:45.928677 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:45.928575 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" event={"ID":"cca92b8b-ec77-44db-88bc-3cbada2e6604","Type":"ContainerStarted","Data":"bed10f845ae2562ed615debb306588d19e67ac5fde0bd6e5803f5e73f24ad9ed"} Apr 17 16:41:50.951961 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:50.951928 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" event={"ID":"2806963c-78df-401c-b5e5-ab8166c86d7f","Type":"ContainerStarted","Data":"edb56fe2417e0ead37179f31caefcd58aa4dc7e1ed4edbc28061dfdb0c26ffb1"} Apr 17 16:41:55.972838 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:55.972797 2572 generic.go:358] "Generic (PLEG): container finished" podID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerID="edb56fe2417e0ead37179f31caefcd58aa4dc7e1ed4edbc28061dfdb0c26ffb1" exitCode=0 Apr 17 16:41:55.973279 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:55.972876 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" event={"ID":"2806963c-78df-401c-b5e5-ab8166c86d7f","Type":"ContainerDied","Data":"edb56fe2417e0ead37179f31caefcd58aa4dc7e1ed4edbc28061dfdb0c26ffb1"} Apr 17 16:41:55.974379 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:55.974360 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:41:59.988870 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:59.988126 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" event={"ID":"cca92b8b-ec77-44db-88bc-3cbada2e6604","Type":"ContainerStarted","Data":"5929438717f1c9dbec64a82805e0b63aee30093f6de6a42bf605939f294f76bd"} Apr 17 16:41:59.990156 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:41:59.990129 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" event={"ID":"52b1d4cc-638e-41b1-9fdb-63687ef13969","Type":"ContainerStarted","Data":"e4e7c62586dd3071768f301b7564929f7181e02c53aa8d346f658fe96f0e16ac"} Apr 17 16:42:04.005106 ip-10-0-138-137 kubenswrapper[2572]: I0417 
16:42:04.005065 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" event={"ID":"52b1d4cc-638e-41b1-9fdb-63687ef13969","Type":"ContainerStarted","Data":"beb2e1f09066880903555783355e35a5cd734f39729ee32a3d8c88c9e5fe7c4b"} Apr 17 16:42:04.005564 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:04.005302 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:42:04.007024 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:04.006992 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" event={"ID":"cca92b8b-ec77-44db-88bc-3cbada2e6604","Type":"ContainerStarted","Data":"378f75a757c6f9128cb1afc01d45ba3da937191261618603ac10706f208a2e3d"} Apr 17 16:42:04.007167 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:04.007150 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:42:04.022502 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:04.021816 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" podStartSLOduration=1.142238167 podStartE2EDuration="20.02180217s" podCreationTimestamp="2026-04-17 16:41:44 +0000 UTC" firstStartedPulling="2026-04-17 16:41:44.649739651 +0000 UTC m=+637.998369184" lastFinishedPulling="2026-04-17 16:42:03.529303656 +0000 UTC m=+656.877933187" observedRunningTime="2026-04-17 16:42:04.020773653 +0000 UTC m=+657.369403203" watchObservedRunningTime="2026-04-17 16:42:04.02180217 +0000 UTC m=+657.370431721" Apr 17 16:42:04.036777 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:04.036452 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" podStartSLOduration=1.6074953939999999 podStartE2EDuration="20.036436184s" podCreationTimestamp="2026-04-17 16:41:44 +0000 UTC" firstStartedPulling="2026-04-17 16:41:45.101077254 +0000 UTC m=+638.449706786" lastFinishedPulling="2026-04-17 16:42:03.530018036 +0000 UTC m=+656.878647576" observedRunningTime="2026-04-17 16:42:04.035822713 +0000 UTC m=+657.384452274" watchObservedRunningTime="2026-04-17 16:42:04.036436184 +0000 UTC m=+657.385065736" Apr 17 16:42:05.009694 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:05.009641 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:42:05.010165 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:05.009882 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:42:05.011156 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:05.011093 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 16:42:05.011278 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:05.011093 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" 
podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 17 16:42:06.013341 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:06.013300 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 16:42:06.013796 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:06.013410 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 17 16:42:11.018717 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:11.018682 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:42:11.019231 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:11.018761 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:42:11.019322 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:11.019292 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 16:42:11.019436 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:11.019394 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 17 16:42:19.054866 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:19.054805 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" event={"ID":"2806963c-78df-401c-b5e5-ab8166c86d7f","Type":"ContainerStarted","Data":"1d6a0b3c6c5c18d97a535dbfedef803c0452feb4a9321019f55b3e7cb0b127b5"} Apr 17 16:42:19.054866 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:19.054839 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" event={"ID":"2806963c-78df-401c-b5e5-ab8166c86d7f","Type":"ContainerStarted","Data":"a6cecc7aa2a34ab682f4066cfc4cfdadf728b46e02562e929ed0739bef1a14d8"} Apr 17 16:42:20.058102 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:20.058063 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:42:20.058513 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:20.058211 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:42:20.059456 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:20.059430 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" 
podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 17 16:42:20.075964 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:20.075922 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" podStartSLOduration=2.867647079 podStartE2EDuration="36.075909759s" podCreationTimestamp="2026-04-17 16:41:44 +0000 UTC" firstStartedPulling="2026-04-17 16:41:45.589057644 +0000 UTC m=+638.937687179" lastFinishedPulling="2026-04-17 16:42:18.797320329 +0000 UTC m=+672.145949859" observedRunningTime="2026-04-17 16:42:20.074108191 +0000 UTC m=+673.422737739" watchObservedRunningTime="2026-04-17 16:42:20.075909759 +0000 UTC m=+673.424539309" Apr 17 16:42:21.019660 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:21.019599 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 16:42:21.019660 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:21.019623 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 17 16:42:21.060591 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:21.060557 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 17 16:42:26.065411 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:26.065381 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:42:26.065970 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:26.065946 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 17 16:42:31.019933 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:31.019879 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 16:42:31.020490 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:31.019886 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 17 16:42:36.066899 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:36.066856 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" 
podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 17 16:42:41.020011 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:41.019965 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 17 16:42:41.020839 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:41.020013 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 16:42:46.066436 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:46.066394 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 17 16:42:51.020741 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:51.020710 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:42:51.021176 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:51.020762 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:42:56.066134 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:42:56.066056 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 17 16:43:04.064091 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:04.064051 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb"] Apr 17 16:43:04.067234 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:04.067217 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" Apr 17 16:43:04.069491 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:04.069462 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-9e0ff-kube-rbac-proxy-sar-config\"" Apr 17 16:43:04.069491 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:04.069477 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-9e0ff-serving-cert\"" Apr 17 16:43:04.076133 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:04.076107 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb"] Apr 17 16:43:04.182857 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:04.182827 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-proxy-tls\") pod \"switch-graph-9e0ff-7bdf4d4867-lckwb\" (UID: \"1e7495f8-7dea-426d-94a0-0d3ccc87fbee\") " pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" Apr 17 16:43:04.183018 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:04.182888 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-openshift-service-ca-bundle\") pod \"switch-graph-9e0ff-7bdf4d4867-lckwb\" (UID: \"1e7495f8-7dea-426d-94a0-0d3ccc87fbee\") " pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" Apr 17 16:43:04.283751 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:04.283710 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-proxy-tls\") pod \"switch-graph-9e0ff-7bdf4d4867-lckwb\" (UID: \"1e7495f8-7dea-426d-94a0-0d3ccc87fbee\") " pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" Apr 17 16:43:04.283918 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:04.283796 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-openshift-service-ca-bundle\") pod \"switch-graph-9e0ff-7bdf4d4867-lckwb\" (UID: \"1e7495f8-7dea-426d-94a0-0d3ccc87fbee\") " pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" Apr 17 16:43:04.283918 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:43:04.283845 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-9e0ff-serving-cert: secret "switch-graph-9e0ff-serving-cert" not found Apr 17 16:43:04.283918 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:43:04.283914 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-proxy-tls podName:1e7495f8-7dea-426d-94a0-0d3ccc87fbee nodeName:}" failed. No retries permitted until 2026-04-17 16:43:04.783898801 +0000 UTC m=+718.132528327 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-proxy-tls") pod "switch-graph-9e0ff-7bdf4d4867-lckwb" (UID: "1e7495f8-7dea-426d-94a0-0d3ccc87fbee") : secret "switch-graph-9e0ff-serving-cert" not found Apr 17 16:43:04.284473 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:04.284456 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-openshift-service-ca-bundle\") pod \"switch-graph-9e0ff-7bdf4d4867-lckwb\" (UID: \"1e7495f8-7dea-426d-94a0-0d3ccc87fbee\") " pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" Apr 17 16:43:04.788285 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:04.788240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-proxy-tls\") pod \"switch-graph-9e0ff-7bdf4d4867-lckwb\" (UID: \"1e7495f8-7dea-426d-94a0-0d3ccc87fbee\") " pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" Apr 17 16:43:04.790691 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:04.790644 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-proxy-tls\") pod \"switch-graph-9e0ff-7bdf4d4867-lckwb\" (UID: \"1e7495f8-7dea-426d-94a0-0d3ccc87fbee\") " pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" Apr 17 16:43:04.977838 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:04.977803 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" Apr 17 16:43:05.094199 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:05.094171 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb"] Apr 17 16:43:05.096363 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:43:05.096338 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e7495f8_7dea_426d_94a0_0d3ccc87fbee.slice/crio-da52b2a87751324e6fdf39d62ab5ca7a0a3c14904e5c6b5255850a10a6b521b5 WatchSource:0}: Error finding container da52b2a87751324e6fdf39d62ab5ca7a0a3c14904e5c6b5255850a10a6b521b5: Status 404 returned error can't find the container with id da52b2a87751324e6fdf39d62ab5ca7a0a3c14904e5c6b5255850a10a6b521b5 Apr 17 16:43:05.186512 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:05.186471 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" event={"ID":"1e7495f8-7dea-426d-94a0-0d3ccc87fbee","Type":"ContainerStarted","Data":"da52b2a87751324e6fdf39d62ab5ca7a0a3c14904e5c6b5255850a10a6b521b5"} Apr 17 16:43:06.065916 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:06.065872 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 17 16:43:08.196529 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:08.196496 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" 
event={"ID":"1e7495f8-7dea-426d-94a0-0d3ccc87fbee","Type":"ContainerStarted","Data":"ba9008ac71d5e965cb34c43e98f52cf43eb439e6c6da6f1bf47d16db5ac565a5"} Apr 17 16:43:08.196942 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:08.196568 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" Apr 17 16:43:08.211532 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:08.211490 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" podStartSLOduration=1.995245159 podStartE2EDuration="4.211476887s" podCreationTimestamp="2026-04-17 16:43:04 +0000 UTC" firstStartedPulling="2026-04-17 16:43:05.09817349 +0000 UTC m=+718.446803018" lastFinishedPulling="2026-04-17 16:43:07.314405199 +0000 UTC m=+720.663034746" observedRunningTime="2026-04-17 16:43:08.210717471 +0000 UTC m=+721.559347018" watchObservedRunningTime="2026-04-17 16:43:08.211476887 +0000 UTC m=+721.560106437" Apr 17 16:43:14.204884 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:14.204853 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" Apr 17 16:43:16.066718 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:16.066678 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 17 16:43:18.311258 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.311213 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb"] Apr 17 16:43:18.311661 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.311429 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" podUID="1e7495f8-7dea-426d-94a0-0d3ccc87fbee" containerName="switch-graph-9e0ff" containerID="cri-o://ba9008ac71d5e965cb34c43e98f52cf43eb439e6c6da6f1bf47d16db5ac565a5" gracePeriod=30 Apr 17 16:43:18.426068 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.426037 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz"] Apr 17 16:43:18.426430 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.426397 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kserve-container" containerID="cri-o://e4e7c62586dd3071768f301b7564929f7181e02c53aa8d346f658fe96f0e16ac" gracePeriod=30 Apr 17 16:43:18.426561 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.426449 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kube-rbac-proxy" containerID="cri-o://beb2e1f09066880903555783355e35a5cd734f39729ee32a3d8c88c9e5fe7c4b" gracePeriod=30 Apr 17 16:43:18.493884 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.493852 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l"] Apr 17 16:43:18.494178 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.494151 
2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kserve-container" containerID="cri-o://5929438717f1c9dbec64a82805e0b63aee30093f6de6a42bf605939f294f76bd" gracePeriod=30 Apr 17 16:43:18.494271 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.494200 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kube-rbac-proxy" containerID="cri-o://378f75a757c6f9128cb1afc01d45ba3da937191261618603ac10706f208a2e3d" gracePeriod=30 Apr 17 16:43:18.507047 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.507023 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr"] Apr 17 16:43:18.510968 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.510948 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:43:18.513135 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.513113 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-ce4d4-kube-rbac-proxy-sar-config\"" Apr 17 16:43:18.513379 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.513360 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-ce4d4-predictor-serving-cert\"" Apr 17 16:43:18.527254 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.525382 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr"] Apr 17 16:43:18.577991 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.577909 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh"] Apr 17 16:43:18.581529 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.581508 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 16:43:18.583687 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.583633 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-ce4d4-kube-rbac-proxy-sar-config\"" Apr 17 16:43:18.583795 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.583702 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-ce4d4-predictor-serving-cert\"" Apr 17 16:43:18.591325 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.591304 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh"] Apr 17 16:43:18.598669 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.598624 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-ce4d4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-success-200-isvc-ce4d4-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ce4d4-predictor-5d646b757d-msdzr\" (UID: \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\") " pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:43:18.598774 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.598683 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqd96\" (UniqueName: \"kubernetes.io/projected/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-kube-api-access-wqd96\") pod \"success-200-isvc-ce4d4-predictor-5d646b757d-msdzr\" (UID: \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\") " pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:43:18.598838 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.598770 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-proxy-tls\") pod \"success-200-isvc-ce4d4-predictor-5d646b757d-msdzr\" (UID: \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\") " pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:43:18.700038 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.700007 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-ce4d4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-success-200-isvc-ce4d4-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ce4d4-predictor-5d646b757d-msdzr\" (UID: \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\") " pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:43:18.700038 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.700042 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqd96\" (UniqueName: \"kubernetes.io/projected/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-kube-api-access-wqd96\") pod \"success-200-isvc-ce4d4-predictor-5d646b757d-msdzr\" (UID: \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\") " pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:43:18.700267 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.700103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-proxy-tls\") pod 
\"success-200-isvc-ce4d4-predictor-5d646b757d-msdzr\" (UID: \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\") " pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:43:18.700267 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.700148 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c530e539-fe55-47db-ae30-3447fd68f304-proxy-tls\") pod \"error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh\" (UID: \"c530e539-fe55-47db-ae30-3447fd68f304\") " pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 16:43:18.700267 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.700181 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-ce4d4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c530e539-fe55-47db-ae30-3447fd68f304-error-404-isvc-ce4d4-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh\" (UID: \"c530e539-fe55-47db-ae30-3447fd68f304\") " pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 16:43:18.700267 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.700220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrtkc\" (UniqueName: \"kubernetes.io/projected/c530e539-fe55-47db-ae30-3447fd68f304-kube-api-access-zrtkc\") pod \"error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh\" (UID: \"c530e539-fe55-47db-ae30-3447fd68f304\") " pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 16:43:18.700790 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.700769 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-ce4d4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-success-200-isvc-ce4d4-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ce4d4-predictor-5d646b757d-msdzr\" (UID: \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\") " pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:43:18.702768 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.702745 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-proxy-tls\") pod \"success-200-isvc-ce4d4-predictor-5d646b757d-msdzr\" (UID: \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\") " pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:43:18.707982 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.707957 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqd96\" (UniqueName: \"kubernetes.io/projected/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-kube-api-access-wqd96\") pod \"success-200-isvc-ce4d4-predictor-5d646b757d-msdzr\" (UID: \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\") " pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:43:18.801272 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.801239 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c530e539-fe55-47db-ae30-3447fd68f304-proxy-tls\") pod \"error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh\" (UID: \"c530e539-fe55-47db-ae30-3447fd68f304\") " pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 
16:43:18.801432 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.801279 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-ce4d4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c530e539-fe55-47db-ae30-3447fd68f304-error-404-isvc-ce4d4-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh\" (UID: \"c530e539-fe55-47db-ae30-3447fd68f304\") " pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 16:43:18.801432 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.801304 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtkc\" (UniqueName: \"kubernetes.io/projected/c530e539-fe55-47db-ae30-3447fd68f304-kube-api-access-zrtkc\") pod \"error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh\" (UID: \"c530e539-fe55-47db-ae30-3447fd68f304\") " pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 16:43:18.801950 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.801920 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-ce4d4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c530e539-fe55-47db-ae30-3447fd68f304-error-404-isvc-ce4d4-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh\" (UID: \"c530e539-fe55-47db-ae30-3447fd68f304\") " pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 16:43:18.803628 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.803609 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c530e539-fe55-47db-ae30-3447fd68f304-proxy-tls\") pod \"error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh\" (UID: \"c530e539-fe55-47db-ae30-3447fd68f304\") " pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 16:43:18.809212 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.809180 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrtkc\" (UniqueName: \"kubernetes.io/projected/c530e539-fe55-47db-ae30-3447fd68f304-kube-api-access-zrtkc\") pod \"error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh\" (UID: \"c530e539-fe55-47db-ae30-3447fd68f304\") " pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 16:43:18.828901 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.828839 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:43:18.893752 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.893713 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 16:43:18.957717 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:18.957636 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr"] Apr 17 16:43:18.961543 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:43:18.961516 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb08b81ef_3e96_4cd5_ac27_13a36d7be0c6.slice/crio-054b7a81abae9acfebf09ebe6b3b557eb1d01aa0b8b33c8da95825b6b81afa90 WatchSource:0}: Error finding container 054b7a81abae9acfebf09ebe6b3b557eb1d01aa0b8b33c8da95825b6b81afa90: Status 404 returned error can't find the container with id 054b7a81abae9acfebf09ebe6b3b557eb1d01aa0b8b33c8da95825b6b81afa90 Apr 17 16:43:19.027572 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.027320 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh"] Apr 17 16:43:19.031155 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:43:19.031119 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc530e539_fe55_47db_ae30_3447fd68f304.slice/crio-2df9e9cf4bda669b3835601b6e75be16d6c4079052bc3598982e1cc11d500cd1 WatchSource:0}: Error finding container 2df9e9cf4bda669b3835601b6e75be16d6c4079052bc3598982e1cc11d500cd1: Status 404 returned error can't find the container with id 2df9e9cf4bda669b3835601b6e75be16d6c4079052bc3598982e1cc11d500cd1 Apr 17 16:43:19.203751 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.203717 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" podUID="1e7495f8-7dea-426d-94a0-0d3ccc87fbee" containerName="switch-graph-9e0ff" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:43:19.230856 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.230821 2572 generic.go:358] "Generic (PLEG): container finished" podID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerID="beb2e1f09066880903555783355e35a5cd734f39729ee32a3d8c88c9e5fe7c4b" exitCode=2 Apr 17 16:43:19.231029 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.230875 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" event={"ID":"52b1d4cc-638e-41b1-9fdb-63687ef13969","Type":"ContainerDied","Data":"beb2e1f09066880903555783355e35a5cd734f39729ee32a3d8c88c9e5fe7c4b"} Apr 17 16:43:19.232964 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.232900 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" event={"ID":"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6","Type":"ContainerStarted","Data":"54c54ec2a72c624c92fe1d1d4efdee1365e1badfd9862b8e4a93c438474a3e72"} Apr 17 16:43:19.232964 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.232954 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" event={"ID":"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6","Type":"ContainerStarted","Data":"311ba39526df52d34918a90d0097de0f02525e31141dcb4025b27968db8da1ad"} Apr 17 16:43:19.233141 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.232970 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" event={"ID":"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6","Type":"ContainerStarted","Data":"054b7a81abae9acfebf09ebe6b3b557eb1d01aa0b8b33c8da95825b6b81afa90"} Apr 17 16:43:19.233230 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.233204 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:43:19.233459 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.233438 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:43:19.234907 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.234871 2572 generic.go:358] "Generic (PLEG): container finished" podID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerID="378f75a757c6f9128cb1afc01d45ba3da937191261618603ac10706f208a2e3d" exitCode=2 Apr 17 16:43:19.234999 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.234902 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" event={"ID":"cca92b8b-ec77-44db-88bc-3cbada2e6604","Type":"ContainerDied","Data":"378f75a757c6f9128cb1afc01d45ba3da937191261618603ac10706f208a2e3d"} Apr 17 16:43:19.234999 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.234967 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 17 16:43:19.236767 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.236744 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" event={"ID":"c530e539-fe55-47db-ae30-3447fd68f304","Type":"ContainerStarted","Data":"80d8c0065fe702acccc1f1177fea2fe07d017c9ade66be66548479c8db6aa990"} Apr 17 16:43:19.236870 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.236770 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" event={"ID":"c530e539-fe55-47db-ae30-3447fd68f304","Type":"ContainerStarted","Data":"f7094a432a61067dec2680dbb95b34b93d2378b92088ba790fa9a9be3df8a7ff"} Apr 17 16:43:19.236870 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.236783 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" event={"ID":"c530e539-fe55-47db-ae30-3447fd68f304","Type":"ContainerStarted","Data":"2df9e9cf4bda669b3835601b6e75be16d6c4079052bc3598982e1cc11d500cd1"} Apr 17 16:43:19.237003 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.236872 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 16:43:19.249044 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.248991 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" podStartSLOduration=1.248975932 podStartE2EDuration="1.248975932s" podCreationTimestamp="2026-04-17 16:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:43:19.248417534 +0000 UTC 
m=+732.597047087" watchObservedRunningTime="2026-04-17 16:43:19.248975932 +0000 UTC m=+732.597605485" Apr 17 16:43:19.265879 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:19.265828 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" podStartSLOduration=1.265810096 podStartE2EDuration="1.265810096s" podCreationTimestamp="2026-04-17 16:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:43:19.264583652 +0000 UTC m=+732.613213202" watchObservedRunningTime="2026-04-17 16:43:19.265810096 +0000 UTC m=+732.614439643" Apr 17 16:43:20.239199 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:20.239155 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 17 16:43:20.239677 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:20.239330 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 16:43:20.240704 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:20.240676 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 17 16:43:21.013979 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.013935 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 17 16:43:21.013979 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.013939 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.20:8643/healthz\": dial tcp 10.132.0.20:8643: connect: connection refused" Apr 17 16:43:21.019271 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.019238 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 17 16:43:21.019417 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.019247 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 17 16:43:21.241973 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.241931 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" 
podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 17 16:43:21.242365 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.242005 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 17 16:43:21.676611 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.676580 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:43:21.722197 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.722167 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-9e0ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/52b1d4cc-638e-41b1-9fdb-63687ef13969-success-200-isvc-9e0ff-kube-rbac-proxy-sar-config\") pod \"52b1d4cc-638e-41b1-9fdb-63687ef13969\" (UID: \"52b1d4cc-638e-41b1-9fdb-63687ef13969\") " Apr 17 16:43:21.722594 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.722279 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch42f\" (UniqueName: \"kubernetes.io/projected/52b1d4cc-638e-41b1-9fdb-63687ef13969-kube-api-access-ch42f\") pod \"52b1d4cc-638e-41b1-9fdb-63687ef13969\" (UID: \"52b1d4cc-638e-41b1-9fdb-63687ef13969\") " Apr 17 16:43:21.722594 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.722330 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52b1d4cc-638e-41b1-9fdb-63687ef13969-proxy-tls\") pod \"52b1d4cc-638e-41b1-9fdb-63687ef13969\" (UID: \"52b1d4cc-638e-41b1-9fdb-63687ef13969\") " Apr 17 16:43:21.722742 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.722613 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52b1d4cc-638e-41b1-9fdb-63687ef13969-success-200-isvc-9e0ff-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-9e0ff-kube-rbac-proxy-sar-config") pod "52b1d4cc-638e-41b1-9fdb-63687ef13969" (UID: "52b1d4cc-638e-41b1-9fdb-63687ef13969"). InnerVolumeSpecName "success-200-isvc-9e0ff-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:43:21.724628 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.724597 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b1d4cc-638e-41b1-9fdb-63687ef13969-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "52b1d4cc-638e-41b1-9fdb-63687ef13969" (UID: "52b1d4cc-638e-41b1-9fdb-63687ef13969"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:43:21.724987 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.724961 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b1d4cc-638e-41b1-9fdb-63687ef13969-kube-api-access-ch42f" (OuterVolumeSpecName: "kube-api-access-ch42f") pod "52b1d4cc-638e-41b1-9fdb-63687ef13969" (UID: "52b1d4cc-638e-41b1-9fdb-63687ef13969"). InnerVolumeSpecName "kube-api-access-ch42f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:43:21.823072 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.823041 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ch42f\" (UniqueName: \"kubernetes.io/projected/52b1d4cc-638e-41b1-9fdb-63687ef13969-kube-api-access-ch42f\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:43:21.823195 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.823075 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52b1d4cc-638e-41b1-9fdb-63687ef13969-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:43:21.823195 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.823092 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-9e0ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/52b1d4cc-638e-41b1-9fdb-63687ef13969-success-200-isvc-9e0ff-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:43:21.913941 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:21.913919 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:43:22.024939 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.024906 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mj82\" (UniqueName: \"kubernetes.io/projected/cca92b8b-ec77-44db-88bc-3cbada2e6604-kube-api-access-2mj82\") pod \"cca92b8b-ec77-44db-88bc-3cbada2e6604\" (UID: \"cca92b8b-ec77-44db-88bc-3cbada2e6604\") " Apr 17 16:43:22.025120 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.024951 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cca92b8b-ec77-44db-88bc-3cbada2e6604-proxy-tls\") pod \"cca92b8b-ec77-44db-88bc-3cbada2e6604\" (UID: \"cca92b8b-ec77-44db-88bc-3cbada2e6604\") " Apr 17 16:43:22.025120 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.025021 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-9e0ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cca92b8b-ec77-44db-88bc-3cbada2e6604-error-404-isvc-9e0ff-kube-rbac-proxy-sar-config\") pod \"cca92b8b-ec77-44db-88bc-3cbada2e6604\" (UID: \"cca92b8b-ec77-44db-88bc-3cbada2e6604\") " Apr 17 16:43:22.025410 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.025386 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cca92b8b-ec77-44db-88bc-3cbada2e6604-error-404-isvc-9e0ff-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-9e0ff-kube-rbac-proxy-sar-config") pod "cca92b8b-ec77-44db-88bc-3cbada2e6604" (UID: "cca92b8b-ec77-44db-88bc-3cbada2e6604"). InnerVolumeSpecName "error-404-isvc-9e0ff-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:43:22.026970 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.026941 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca92b8b-ec77-44db-88bc-3cbada2e6604-kube-api-access-2mj82" (OuterVolumeSpecName: "kube-api-access-2mj82") pod "cca92b8b-ec77-44db-88bc-3cbada2e6604" (UID: "cca92b8b-ec77-44db-88bc-3cbada2e6604"). InnerVolumeSpecName "kube-api-access-2mj82". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:43:22.027104 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.027075 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca92b8b-ec77-44db-88bc-3cbada2e6604-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cca92b8b-ec77-44db-88bc-3cbada2e6604" (UID: "cca92b8b-ec77-44db-88bc-3cbada2e6604"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:43:22.126122 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.126024 2572 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-9e0ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cca92b8b-ec77-44db-88bc-3cbada2e6604-error-404-isvc-9e0ff-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:43:22.126122 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.126067 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2mj82\" (UniqueName: \"kubernetes.io/projected/cca92b8b-ec77-44db-88bc-3cbada2e6604-kube-api-access-2mj82\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:43:22.126122 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.126097 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cca92b8b-ec77-44db-88bc-3cbada2e6604-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:43:22.245588 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.245547 2572 generic.go:358] "Generic (PLEG): container finished" podID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerID="5929438717f1c9dbec64a82805e0b63aee30093f6de6a42bf605939f294f76bd" exitCode=0 Apr 17 16:43:22.246017 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.245637 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" event={"ID":"cca92b8b-ec77-44db-88bc-3cbada2e6604","Type":"ContainerDied","Data":"5929438717f1c9dbec64a82805e0b63aee30093f6de6a42bf605939f294f76bd"} Apr 17 16:43:22.246017 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.245639 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" Apr 17 16:43:22.246017 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.245701 2572 scope.go:117] "RemoveContainer" containerID="378f75a757c6f9128cb1afc01d45ba3da937191261618603ac10706f208a2e3d" Apr 17 16:43:22.246017 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.245690 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l" event={"ID":"cca92b8b-ec77-44db-88bc-3cbada2e6604","Type":"ContainerDied","Data":"bed10f845ae2562ed615debb306588d19e67ac5fde0bd6e5803f5e73f24ad9ed"} Apr 17 16:43:22.247085 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.247053 2572 generic.go:358] "Generic (PLEG): container finished" podID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerID="e4e7c62586dd3071768f301b7564929f7181e02c53aa8d346f658fe96f0e16ac" exitCode=0 Apr 17 16:43:22.247177 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.247118 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" event={"ID":"52b1d4cc-638e-41b1-9fdb-63687ef13969","Type":"ContainerDied","Data":"e4e7c62586dd3071768f301b7564929f7181e02c53aa8d346f658fe96f0e16ac"} Apr 17 16:43:22.247177 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.247139 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" event={"ID":"52b1d4cc-638e-41b1-9fdb-63687ef13969","Type":"ContainerDied","Data":"9032333b71eaa3e4e1715d3448b92df152867ff5b7235aef609585d570d107e3"} Apr 17 16:43:22.247177 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.247160 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz" Apr 17 16:43:22.255359 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.255339 2572 scope.go:117] "RemoveContainer" containerID="5929438717f1c9dbec64a82805e0b63aee30093f6de6a42bf605939f294f76bd" Apr 17 16:43:22.262508 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.262489 2572 scope.go:117] "RemoveContainer" containerID="378f75a757c6f9128cb1afc01d45ba3da937191261618603ac10706f208a2e3d" Apr 17 16:43:22.262754 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:43:22.262736 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"378f75a757c6f9128cb1afc01d45ba3da937191261618603ac10706f208a2e3d\": container with ID starting with 378f75a757c6f9128cb1afc01d45ba3da937191261618603ac10706f208a2e3d not found: ID does not exist" containerID="378f75a757c6f9128cb1afc01d45ba3da937191261618603ac10706f208a2e3d" Apr 17 16:43:22.262809 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.262760 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"378f75a757c6f9128cb1afc01d45ba3da937191261618603ac10706f208a2e3d"} err="failed to get container status \"378f75a757c6f9128cb1afc01d45ba3da937191261618603ac10706f208a2e3d\": rpc error: code = NotFound desc = could not find container \"378f75a757c6f9128cb1afc01d45ba3da937191261618603ac10706f208a2e3d\": container with ID starting with 378f75a757c6f9128cb1afc01d45ba3da937191261618603ac10706f208a2e3d not found: ID does not exist" Apr 17 16:43:22.262809 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.262776 2572 scope.go:117] "RemoveContainer" containerID="5929438717f1c9dbec64a82805e0b63aee30093f6de6a42bf605939f294f76bd" Apr 17 16:43:22.263003 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:43:22.262986 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5929438717f1c9dbec64a82805e0b63aee30093f6de6a42bf605939f294f76bd\": container with ID starting with 5929438717f1c9dbec64a82805e0b63aee30093f6de6a42bf605939f294f76bd not found: ID does not exist" containerID="5929438717f1c9dbec64a82805e0b63aee30093f6de6a42bf605939f294f76bd" Apr 17 16:43:22.263062 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.263012 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5929438717f1c9dbec64a82805e0b63aee30093f6de6a42bf605939f294f76bd"} err="failed to get container status \"5929438717f1c9dbec64a82805e0b63aee30093f6de6a42bf605939f294f76bd\": rpc error: code = NotFound desc = could not find container \"5929438717f1c9dbec64a82805e0b63aee30093f6de6a42bf605939f294f76bd\": container with ID starting with 5929438717f1c9dbec64a82805e0b63aee30093f6de6a42bf605939f294f76bd not found: ID does not exist" Apr 17 16:43:22.263062 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.263035 2572 scope.go:117] "RemoveContainer" containerID="beb2e1f09066880903555783355e35a5cd734f39729ee32a3d8c88c9e5fe7c4b" Apr 17 16:43:22.266843 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.266822 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l"] Apr 17 16:43:22.270441 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.270419 2572 scope.go:117] "RemoveContainer" containerID="e4e7c62586dd3071768f301b7564929f7181e02c53aa8d346f658fe96f0e16ac" Apr 17 16:43:22.270529 ip-10-0-138-137 kubenswrapper[2572]: 
I0417 16:43:22.270511 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l"] Apr 17 16:43:22.276957 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.276942 2572 scope.go:117] "RemoveContainer" containerID="beb2e1f09066880903555783355e35a5cd734f39729ee32a3d8c88c9e5fe7c4b" Apr 17 16:43:22.277195 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:43:22.277176 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beb2e1f09066880903555783355e35a5cd734f39729ee32a3d8c88c9e5fe7c4b\": container with ID starting with beb2e1f09066880903555783355e35a5cd734f39729ee32a3d8c88c9e5fe7c4b not found: ID does not exist" containerID="beb2e1f09066880903555783355e35a5cd734f39729ee32a3d8c88c9e5fe7c4b" Apr 17 16:43:22.277240 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.277201 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb2e1f09066880903555783355e35a5cd734f39729ee32a3d8c88c9e5fe7c4b"} err="failed to get container status \"beb2e1f09066880903555783355e35a5cd734f39729ee32a3d8c88c9e5fe7c4b\": rpc error: code = NotFound desc = could not find container \"beb2e1f09066880903555783355e35a5cd734f39729ee32a3d8c88c9e5fe7c4b\": container with ID starting with beb2e1f09066880903555783355e35a5cd734f39729ee32a3d8c88c9e5fe7c4b not found: ID does not exist" Apr 17 16:43:22.277240 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.277218 2572 scope.go:117] "RemoveContainer" containerID="e4e7c62586dd3071768f301b7564929f7181e02c53aa8d346f658fe96f0e16ac" Apr 17 16:43:22.277450 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:43:22.277434 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4e7c62586dd3071768f301b7564929f7181e02c53aa8d346f658fe96f0e16ac\": container with ID starting with e4e7c62586dd3071768f301b7564929f7181e02c53aa8d346f658fe96f0e16ac not found: ID does not exist" containerID="e4e7c62586dd3071768f301b7564929f7181e02c53aa8d346f658fe96f0e16ac" Apr 17 16:43:22.277492 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.277457 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e7c62586dd3071768f301b7564929f7181e02c53aa8d346f658fe96f0e16ac"} err="failed to get container status \"e4e7c62586dd3071768f301b7564929f7181e02c53aa8d346f658fe96f0e16ac\": rpc error: code = NotFound desc = could not find container \"e4e7c62586dd3071768f301b7564929f7181e02c53aa8d346f658fe96f0e16ac\": container with ID starting with e4e7c62586dd3071768f301b7564929f7181e02c53aa8d346f658fe96f0e16ac not found: ID does not exist" Apr 17 16:43:22.282782 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.282764 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz"] Apr 17 16:43:22.288760 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:22.288739 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz"] Apr 17 16:43:23.227239 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:23.227210 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" path="/var/lib/kubelet/pods/52b1d4cc-638e-41b1-9fdb-63687ef13969/volumes" Apr 17 16:43:23.227609 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:23.227596 2572 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" path="/var/lib/kubelet/pods/cca92b8b-ec77-44db-88bc-3cbada2e6604/volumes" Apr 17 16:43:24.203840 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:24.203807 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" podUID="1e7495f8-7dea-426d-94a0-0d3ccc87fbee" containerName="switch-graph-9e0ff" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:43:26.066835 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:26.066805 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:43:26.246818 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:26.246468 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:43:26.247019 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:26.246999 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 16:43:26.247511 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:26.247474 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 17 16:43:26.247811 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:26.247787 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 17 16:43:29.203229 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:29.203182 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" podUID="1e7495f8-7dea-426d-94a0-0d3ccc87fbee" containerName="switch-graph-9e0ff" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:43:29.203594 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:29.203292 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" Apr 17 16:43:34.203958 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:34.203916 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" podUID="1e7495f8-7dea-426d-94a0-0d3ccc87fbee" containerName="switch-graph-9e0ff" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:43:36.248519 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:36.248477 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 17 16:43:36.248907 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:36.248489 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 17 16:43:39.203863 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:39.203825 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" podUID="1e7495f8-7dea-426d-94a0-0d3ccc87fbee" containerName="switch-graph-9e0ff" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:43:44.030809 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.030775 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8"] Apr 17 16:43:44.031259 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.031241 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kserve-container" Apr 17 16:43:44.031325 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.031262 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kserve-container" Apr 17 16:43:44.031325 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.031279 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kube-rbac-proxy" Apr 17 16:43:44.031325 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.031288 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kube-rbac-proxy" Apr 17 16:43:44.031325 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.031302 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kserve-container" Apr 17 16:43:44.031325 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.031311 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kserve-container" Apr 17 16:43:44.031325 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.031321 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kube-rbac-proxy" Apr 17 16:43:44.031682 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.031330 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kube-rbac-proxy" Apr 17 16:43:44.031682 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.031399 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kserve-container" Apr 17 16:43:44.031682 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.031415 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cca92b8b-ec77-44db-88bc-3cbada2e6604" containerName="kube-rbac-proxy" Apr 17 16:43:44.031682 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.031425 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kube-rbac-proxy" Apr 17 16:43:44.031682 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.031437 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="52b1d4cc-638e-41b1-9fdb-63687ef13969" containerName="kserve-container" Apr 17 16:43:44.037388 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.037366 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" Apr 17 16:43:44.039921 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.039898 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 17 16:43:44.040193 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.040174 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 17 16:43:44.041468 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.041447 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8"] Apr 17 16:43:44.096770 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.096730 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671eac1a-ae21-4662-83f9-a14856f158a1-openshift-service-ca-bundle\") pod \"model-chainer-86b99c7c88-lngx8\" (UID: \"671eac1a-ae21-4662-83f9-a14856f158a1\") " pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" Apr 17 16:43:44.096897 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.096785 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/671eac1a-ae21-4662-83f9-a14856f158a1-proxy-tls\") pod \"model-chainer-86b99c7c88-lngx8\" (UID: \"671eac1a-ae21-4662-83f9-a14856f158a1\") " pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" Apr 17 16:43:44.197576 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.197541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671eac1a-ae21-4662-83f9-a14856f158a1-openshift-service-ca-bundle\") pod \"model-chainer-86b99c7c88-lngx8\" (UID: \"671eac1a-ae21-4662-83f9-a14856f158a1\") " pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" Apr 17 16:43:44.197781 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.197585 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/671eac1a-ae21-4662-83f9-a14856f158a1-proxy-tls\") pod \"model-chainer-86b99c7c88-lngx8\" (UID: \"671eac1a-ae21-4662-83f9-a14856f158a1\") " pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" Apr 17 16:43:44.198196 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.198172 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671eac1a-ae21-4662-83f9-a14856f158a1-openshift-service-ca-bundle\") pod \"model-chainer-86b99c7c88-lngx8\" (UID: \"671eac1a-ae21-4662-83f9-a14856f158a1\") " pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" Apr 17 16:43:44.200031 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.200009 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/671eac1a-ae21-4662-83f9-a14856f158a1-proxy-tls\") pod \"model-chainer-86b99c7c88-lngx8\" (UID: \"671eac1a-ae21-4662-83f9-a14856f158a1\") " pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" Apr 17 16:43:44.203476 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.203439 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" podUID="1e7495f8-7dea-426d-94a0-0d3ccc87fbee" 
containerName="switch-graph-9e0ff" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:43:44.348926 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.348846 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" Apr 17 16:43:44.464423 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:44.464398 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8"] Apr 17 16:43:44.466685 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:43:44.466655 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671eac1a_ae21_4662_83f9_a14856f158a1.slice/crio-14fe2e48485c6dc5fb03de9be9932f1ec6b1cece6eb2ab322ea69131777cdb97 WatchSource:0}: Error finding container 14fe2e48485c6dc5fb03de9be9932f1ec6b1cece6eb2ab322ea69131777cdb97: Status 404 returned error can't find the container with id 14fe2e48485c6dc5fb03de9be9932f1ec6b1cece6eb2ab322ea69131777cdb97 Apr 17 16:43:45.319335 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:45.319299 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" event={"ID":"671eac1a-ae21-4662-83f9-a14856f158a1","Type":"ContainerStarted","Data":"34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c"} Apr 17 16:43:45.319335 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:45.319339 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" event={"ID":"671eac1a-ae21-4662-83f9-a14856f158a1","Type":"ContainerStarted","Data":"14fe2e48485c6dc5fb03de9be9932f1ec6b1cece6eb2ab322ea69131777cdb97"} Apr 17 16:43:45.319761 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:45.319397 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" Apr 17 16:43:45.334672 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:45.334605 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" podStartSLOduration=1.3345878 podStartE2EDuration="1.3345878s" podCreationTimestamp="2026-04-17 16:43:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:43:45.334091756 +0000 UTC m=+758.682721316" watchObservedRunningTime="2026-04-17 16:43:45.3345878 +0000 UTC m=+758.683217349" Apr 17 16:43:46.248157 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:46.248118 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 17 16:43:46.248327 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:46.248123 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 17 16:43:48.329379 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:48.329352 2572 generic.go:358] "Generic (PLEG): container finished" podID="1e7495f8-7dea-426d-94a0-0d3ccc87fbee" 
containerID="ba9008ac71d5e965cb34c43e98f52cf43eb439e6c6da6f1bf47d16db5ac565a5" exitCode=0 Apr 17 16:43:48.329636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:48.329396 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" event={"ID":"1e7495f8-7dea-426d-94a0-0d3ccc87fbee","Type":"ContainerDied","Data":"ba9008ac71d5e965cb34c43e98f52cf43eb439e6c6da6f1bf47d16db5ac565a5"} Apr 17 16:43:48.450382 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:48.450358 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" Apr 17 16:43:48.534011 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:48.533979 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-openshift-service-ca-bundle\") pod \"1e7495f8-7dea-426d-94a0-0d3ccc87fbee\" (UID: \"1e7495f8-7dea-426d-94a0-0d3ccc87fbee\") " Apr 17 16:43:48.534167 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:48.534015 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-proxy-tls\") pod \"1e7495f8-7dea-426d-94a0-0d3ccc87fbee\" (UID: \"1e7495f8-7dea-426d-94a0-0d3ccc87fbee\") " Apr 17 16:43:48.534335 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:48.534310 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "1e7495f8-7dea-426d-94a0-0d3ccc87fbee" (UID: "1e7495f8-7dea-426d-94a0-0d3ccc87fbee"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:43:48.535994 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:48.535974 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1e7495f8-7dea-426d-94a0-0d3ccc87fbee" (UID: "1e7495f8-7dea-426d-94a0-0d3ccc87fbee"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:43:48.635143 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:48.635067 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-openshift-service-ca-bundle\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:43:48.635143 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:48.635095 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e7495f8-7dea-426d-94a0-0d3ccc87fbee-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:43:49.337926 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:49.337891 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" event={"ID":"1e7495f8-7dea-426d-94a0-0d3ccc87fbee","Type":"ContainerDied","Data":"da52b2a87751324e6fdf39d62ab5ca7a0a3c14904e5c6b5255850a10a6b521b5"} Apr 17 16:43:49.338363 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:49.337943 2572 scope.go:117] "RemoveContainer" containerID="ba9008ac71d5e965cb34c43e98f52cf43eb439e6c6da6f1bf47d16db5ac565a5" Apr 17 16:43:49.338363 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:49.337962 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb" Apr 17 16:43:49.353355 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:49.353331 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb"] Apr 17 16:43:49.357299 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:49.357274 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb"] Apr 17 16:43:51.227189 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:51.227148 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e7495f8-7dea-426d-94a0-0d3ccc87fbee" path="/var/lib/kubelet/pods/1e7495f8-7dea-426d-94a0-0d3ccc87fbee/volumes" Apr 17 16:43:51.328424 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:51.328390 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" Apr 17 16:43:54.217585 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.217553 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8"] Apr 17 16:43:54.218028 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.217768 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" podUID="671eac1a-ae21-4662-83f9-a14856f158a1" containerName="model-chainer" containerID="cri-o://34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c" gracePeriod=30 Apr 17 16:43:54.287082 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.287048 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv"] Apr 17 16:43:54.287404 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.287377 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kserve-container" containerID="cri-o://a6cecc7aa2a34ab682f4066cfc4cfdadf728b46e02562e929ed0739bef1a14d8" gracePeriod=30 Apr 17 
16:43:54.287491 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.287469 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kube-rbac-proxy" containerID="cri-o://1d6a0b3c6c5c18d97a535dbfedef803c0452feb4a9321019f55b3e7cb0b127b5" gracePeriod=30 Apr 17 16:43:54.380472 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.380435 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm"] Apr 17 16:43:54.380788 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.380775 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e7495f8-7dea-426d-94a0-0d3ccc87fbee" containerName="switch-graph-9e0ff" Apr 17 16:43:54.380788 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.380789 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7495f8-7dea-426d-94a0-0d3ccc87fbee" containerName="switch-graph-9e0ff" Apr 17 16:43:54.380904 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.380842 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e7495f8-7dea-426d-94a0-0d3ccc87fbee" containerName="switch-graph-9e0ff" Apr 17 16:43:54.385273 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.385256 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:43:54.387323 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.387304 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-8b822-kube-rbac-proxy-sar-config\"" Apr 17 16:43:54.387514 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.387501 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-8b822-predictor-serving-cert\"" Apr 17 16:43:54.396711 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.396685 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm"] Apr 17 16:43:54.482753 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.482679 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-8b822-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/432dc8ff-e1ad-414b-a836-0f16a600f03e-success-200-isvc-8b822-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-8b822-predictor-74448fdcdb-6sqxm\" (UID: \"432dc8ff-e1ad-414b-a836-0f16a600f03e\") " pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:43:54.482753 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.482736 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/432dc8ff-e1ad-414b-a836-0f16a600f03e-proxy-tls\") pod \"success-200-isvc-8b822-predictor-74448fdcdb-6sqxm\" (UID: \"432dc8ff-e1ad-414b-a836-0f16a600f03e\") " pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:43:54.483189 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.482791 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4h84\" (UniqueName: \"kubernetes.io/projected/432dc8ff-e1ad-414b-a836-0f16a600f03e-kube-api-access-k4h84\") pod 
\"success-200-isvc-8b822-predictor-74448fdcdb-6sqxm\" (UID: \"432dc8ff-e1ad-414b-a836-0f16a600f03e\") " pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:43:54.497769 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.496718 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5"] Apr 17 16:43:54.501780 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.501756 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:43:54.505122 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.505096 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-8b822-predictor-serving-cert\"" Apr 17 16:43:54.505267 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.505246 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-8b822-kube-rbac-proxy-sar-config\"" Apr 17 16:43:54.509888 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.509498 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5"] Apr 17 16:43:54.584214 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.584177 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-8b822-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/432dc8ff-e1ad-414b-a836-0f16a600f03e-success-200-isvc-8b822-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-8b822-predictor-74448fdcdb-6sqxm\" (UID: \"432dc8ff-e1ad-414b-a836-0f16a600f03e\") " pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:43:54.584396 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.584231 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/432dc8ff-e1ad-414b-a836-0f16a600f03e-proxy-tls\") pod \"success-200-isvc-8b822-predictor-74448fdcdb-6sqxm\" (UID: \"432dc8ff-e1ad-414b-a836-0f16a600f03e\") " pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:43:54.584396 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.584297 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1417668-656a-445d-89b2-53c582a08559-proxy-tls\") pod \"error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5\" (UID: \"c1417668-656a-445d-89b2-53c582a08559\") " pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:43:54.584396 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.584336 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4h84\" (UniqueName: \"kubernetes.io/projected/432dc8ff-e1ad-414b-a836-0f16a600f03e-kube-api-access-k4h84\") pod \"success-200-isvc-8b822-predictor-74448fdcdb-6sqxm\" (UID: \"432dc8ff-e1ad-414b-a836-0f16a600f03e\") " pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:43:54.584396 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.584371 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dk88\" (UniqueName: \"kubernetes.io/projected/c1417668-656a-445d-89b2-53c582a08559-kube-api-access-7dk88\") pod 
\"error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5\" (UID: \"c1417668-656a-445d-89b2-53c582a08559\") " pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:43:54.584574 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.584414 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-8b822-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c1417668-656a-445d-89b2-53c582a08559-error-404-isvc-8b822-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5\" (UID: \"c1417668-656a-445d-89b2-53c582a08559\") " pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:43:54.584884 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.584851 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-8b822-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/432dc8ff-e1ad-414b-a836-0f16a600f03e-success-200-isvc-8b822-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-8b822-predictor-74448fdcdb-6sqxm\" (UID: \"432dc8ff-e1ad-414b-a836-0f16a600f03e\") " pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:43:54.586744 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.586722 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/432dc8ff-e1ad-414b-a836-0f16a600f03e-proxy-tls\") pod \"success-200-isvc-8b822-predictor-74448fdcdb-6sqxm\" (UID: \"432dc8ff-e1ad-414b-a836-0f16a600f03e\") " pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:43:54.591589 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.591563 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4h84\" (UniqueName: \"kubernetes.io/projected/432dc8ff-e1ad-414b-a836-0f16a600f03e-kube-api-access-k4h84\") pod \"success-200-isvc-8b822-predictor-74448fdcdb-6sqxm\" (UID: \"432dc8ff-e1ad-414b-a836-0f16a600f03e\") " pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:43:54.685042 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.685015 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1417668-656a-445d-89b2-53c582a08559-proxy-tls\") pod \"error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5\" (UID: \"c1417668-656a-445d-89b2-53c582a08559\") " pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:43:54.685187 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.685057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dk88\" (UniqueName: \"kubernetes.io/projected/c1417668-656a-445d-89b2-53c582a08559-kube-api-access-7dk88\") pod \"error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5\" (UID: \"c1417668-656a-445d-89b2-53c582a08559\") " pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:43:54.685187 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.685081 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-8b822-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c1417668-656a-445d-89b2-53c582a08559-error-404-isvc-8b822-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5\" (UID: \"c1417668-656a-445d-89b2-53c582a08559\") " 
pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:43:54.685187 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:43:54.685154 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-8b822-predictor-serving-cert: secret "error-404-isvc-8b822-predictor-serving-cert" not found Apr 17 16:43:54.685322 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:43:54.685235 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1417668-656a-445d-89b2-53c582a08559-proxy-tls podName:c1417668-656a-445d-89b2-53c582a08559 nodeName:}" failed. No retries permitted until 2026-04-17 16:43:55.185213388 +0000 UTC m=+768.533842915 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c1417668-656a-445d-89b2-53c582a08559-proxy-tls") pod "error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" (UID: "c1417668-656a-445d-89b2-53c582a08559") : secret "error-404-isvc-8b822-predictor-serving-cert" not found Apr 17 16:43:54.685680 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.685639 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-8b822-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c1417668-656a-445d-89b2-53c582a08559-error-404-isvc-8b822-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5\" (UID: \"c1417668-656a-445d-89b2-53c582a08559\") " pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:43:54.693751 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.693729 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dk88\" (UniqueName: \"kubernetes.io/projected/c1417668-656a-445d-89b2-53c582a08559-kube-api-access-7dk88\") pod \"error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5\" (UID: \"c1417668-656a-445d-89b2-53c582a08559\") " pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:43:54.697620 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.697603 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:43:54.819475 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:54.819436 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm"] Apr 17 16:43:54.823172 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:43:54.823139 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod432dc8ff_e1ad_414b_a836_0f16a600f03e.slice/crio-223697c1e294b645fefdc43904080be84eb3095e30d2076e9cea3946eb61184a WatchSource:0}: Error finding container 223697c1e294b645fefdc43904080be84eb3095e30d2076e9cea3946eb61184a: Status 404 returned error can't find the container with id 223697c1e294b645fefdc43904080be84eb3095e30d2076e9cea3946eb61184a Apr 17 16:43:55.190010 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:55.189919 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1417668-656a-445d-89b2-53c582a08559-proxy-tls\") pod \"error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5\" (UID: \"c1417668-656a-445d-89b2-53c582a08559\") " pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:43:55.192300 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:55.192272 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1417668-656a-445d-89b2-53c582a08559-proxy-tls\") pod \"error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5\" (UID: \"c1417668-656a-445d-89b2-53c582a08559\") " pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:43:55.357406 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:55.357375 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" event={"ID":"432dc8ff-e1ad-414b-a836-0f16a600f03e","Type":"ContainerStarted","Data":"fbc4ca633b639a8179f4d47b7c6130e98da382f682078a038c98d918c88a967b"} Apr 17 16:43:55.357887 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:55.357413 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" event={"ID":"432dc8ff-e1ad-414b-a836-0f16a600f03e","Type":"ContainerStarted","Data":"eddd6918a40f9814d394aa4e337367964506b3736e75550981aa9d4fe72afbe6"} Apr 17 16:43:55.357887 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:55.357426 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" event={"ID":"432dc8ff-e1ad-414b-a836-0f16a600f03e","Type":"ContainerStarted","Data":"223697c1e294b645fefdc43904080be84eb3095e30d2076e9cea3946eb61184a"} Apr 17 16:43:55.357887 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:55.357558 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:43:55.357887 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:55.357587 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:43:55.359009 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:55.358987 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" 
podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 16:43:55.359366 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:55.359347 2572 generic.go:358] "Generic (PLEG): container finished" podID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerID="1d6a0b3c6c5c18d97a535dbfedef803c0452feb4a9321019f55b3e7cb0b127b5" exitCode=2 Apr 17 16:43:55.359433 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:55.359414 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" event={"ID":"2806963c-78df-401c-b5e5-ab8166c86d7f","Type":"ContainerDied","Data":"1d6a0b3c6c5c18d97a535dbfedef803c0452feb4a9321019f55b3e7cb0b127b5"} Apr 17 16:43:55.375248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:55.375210 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" podStartSLOduration=1.375196972 podStartE2EDuration="1.375196972s" podCreationTimestamp="2026-04-17 16:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:43:55.373982129 +0000 UTC m=+768.722611677" watchObservedRunningTime="2026-04-17 16:43:55.375196972 +0000 UTC m=+768.723826521" Apr 17 16:43:55.414118 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:55.414095 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:43:55.537823 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:55.537788 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5"] Apr 17 16:43:55.540804 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:43:55.540782 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1417668_656a_445d_89b2_53c582a08559.slice/crio-1f276f264b0f95aa5b855b4f1e24902c1d5beca81b871515bcb67fdb18bba7ec WatchSource:0}: Error finding container 1f276f264b0f95aa5b855b4f1e24902c1d5beca81b871515bcb67fdb18bba7ec: Status 404 returned error can't find the container with id 1f276f264b0f95aa5b855b4f1e24902c1d5beca81b871515bcb67fdb18bba7ec Apr 17 16:43:56.061053 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:56.061015 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 17 16:43:56.066333 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:56.066302 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 17 16:43:56.247506 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:56.247468 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: 
connect: connection refused" Apr 17 16:43:56.247802 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:56.247779 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 17 16:43:56.328622 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:56.328527 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" podUID="671eac1a-ae21-4662-83f9-a14856f158a1" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:43:56.364501 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:56.364467 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" event={"ID":"c1417668-656a-445d-89b2-53c582a08559","Type":"ContainerStarted","Data":"d2ffb0a7e61b2046abcf243e436b9e84095ac5372a7428e83fa993a058b79275"} Apr 17 16:43:56.364501 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:56.364507 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" event={"ID":"c1417668-656a-445d-89b2-53c582a08559","Type":"ContainerStarted","Data":"f9583fe4c2fe445ac9b9558063ad26626dbee815d6b798f2236db9f0b0e1a3e4"} Apr 17 16:43:56.365006 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:56.364518 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" event={"ID":"c1417668-656a-445d-89b2-53c582a08559","Type":"ContainerStarted","Data":"1f276f264b0f95aa5b855b4f1e24902c1d5beca81b871515bcb67fdb18bba7ec"} Apr 17 16:43:56.365006 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:56.364670 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:43:56.365006 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:56.364744 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 16:43:56.383263 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:56.383204 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" podStartSLOduration=2.3831863269999998 podStartE2EDuration="2.383186327s" podCreationTimestamp="2026-04-17 16:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:43:56.381086156 +0000 UTC m=+769.729715705" watchObservedRunningTime="2026-04-17 16:43:56.383186327 +0000 UTC m=+769.731815876" Apr 17 16:43:57.367243 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:57.367211 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:43:57.368347 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:57.368320 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" 
podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 17 16:43:58.136422 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.136400 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:43:58.218559 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.218534 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nxrg\" (UniqueName: \"kubernetes.io/projected/2806963c-78df-401c-b5e5-ab8166c86d7f-kube-api-access-8nxrg\") pod \"2806963c-78df-401c-b5e5-ab8166c86d7f\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " Apr 17 16:43:58.218740 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.218593 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2806963c-78df-401c-b5e5-ab8166c86d7f-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"2806963c-78df-401c-b5e5-ab8166c86d7f\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " Apr 17 16:43:58.218740 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.218674 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2806963c-78df-401c-b5e5-ab8166c86d7f-proxy-tls\") pod \"2806963c-78df-401c-b5e5-ab8166c86d7f\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " Apr 17 16:43:58.218740 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.218729 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2806963c-78df-401c-b5e5-ab8166c86d7f-kserve-provision-location\") pod \"2806963c-78df-401c-b5e5-ab8166c86d7f\" (UID: \"2806963c-78df-401c-b5e5-ab8166c86d7f\") " Apr 17 16:43:58.219016 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.218990 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2806963c-78df-401c-b5e5-ab8166c86d7f-isvc-xgboost-graph-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-kube-rbac-proxy-sar-config") pod "2806963c-78df-401c-b5e5-ab8166c86d7f" (UID: "2806963c-78df-401c-b5e5-ab8166c86d7f"). InnerVolumeSpecName "isvc-xgboost-graph-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:43:58.219208 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.219182 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2806963c-78df-401c-b5e5-ab8166c86d7f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2806963c-78df-401c-b5e5-ab8166c86d7f" (UID: "2806963c-78df-401c-b5e5-ab8166c86d7f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:43:58.220898 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.220867 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2806963c-78df-401c-b5e5-ab8166c86d7f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2806963c-78df-401c-b5e5-ab8166c86d7f" (UID: "2806963c-78df-401c-b5e5-ab8166c86d7f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:43:58.221712 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.221685 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2806963c-78df-401c-b5e5-ab8166c86d7f-kube-api-access-8nxrg" (OuterVolumeSpecName: "kube-api-access-8nxrg") pod "2806963c-78df-401c-b5e5-ab8166c86d7f" (UID: "2806963c-78df-401c-b5e5-ab8166c86d7f"). InnerVolumeSpecName "kube-api-access-8nxrg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:43:58.320132 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.320092 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2806963c-78df-401c-b5e5-ab8166c86d7f-kserve-provision-location\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:43:58.320132 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.320121 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nxrg\" (UniqueName: \"kubernetes.io/projected/2806963c-78df-401c-b5e5-ab8166c86d7f-kube-api-access-8nxrg\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:43:58.320132 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.320132 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2806963c-78df-401c-b5e5-ab8166c86d7f-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:43:58.320417 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.320143 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2806963c-78df-401c-b5e5-ab8166c86d7f-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:43:58.372694 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.372660 2572 generic.go:358] "Generic (PLEG): container finished" podID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerID="a6cecc7aa2a34ab682f4066cfc4cfdadf728b46e02562e929ed0739bef1a14d8" exitCode=0 Apr 17 16:43:58.373114 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.372750 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" Apr 17 16:43:58.373114 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.372742 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" event={"ID":"2806963c-78df-401c-b5e5-ab8166c86d7f","Type":"ContainerDied","Data":"a6cecc7aa2a34ab682f4066cfc4cfdadf728b46e02562e929ed0739bef1a14d8"} Apr 17 16:43:58.373114 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.372861 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv" event={"ID":"2806963c-78df-401c-b5e5-ab8166c86d7f","Type":"ContainerDied","Data":"4519e2b6201e92d33e1fa542003e076eabb121ea30ea85c6df5a51c504f9ec47"} Apr 17 16:43:58.373114 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.372884 2572 scope.go:117] "RemoveContainer" containerID="1d6a0b3c6c5c18d97a535dbfedef803c0452feb4a9321019f55b3e7cb0b127b5" Apr 17 16:43:58.373328 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.373308 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 17 16:43:58.381231 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.381216 2572 scope.go:117] "RemoveContainer" containerID="a6cecc7aa2a34ab682f4066cfc4cfdadf728b46e02562e929ed0739bef1a14d8" Apr 17 16:43:58.387763 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.387747 2572 scope.go:117] "RemoveContainer" containerID="edb56fe2417e0ead37179f31caefcd58aa4dc7e1ed4edbc28061dfdb0c26ffb1" Apr 17 16:43:58.393253 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.393233 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv"] Apr 17 16:43:58.394607 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.394596 2572 scope.go:117] "RemoveContainer" containerID="1d6a0b3c6c5c18d97a535dbfedef803c0452feb4a9321019f55b3e7cb0b127b5" Apr 17 16:43:58.394949 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:43:58.394905 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d6a0b3c6c5c18d97a535dbfedef803c0452feb4a9321019f55b3e7cb0b127b5\": container with ID starting with 1d6a0b3c6c5c18d97a535dbfedef803c0452feb4a9321019f55b3e7cb0b127b5 not found: ID does not exist" containerID="1d6a0b3c6c5c18d97a535dbfedef803c0452feb4a9321019f55b3e7cb0b127b5" Apr 17 16:43:58.395029 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.394959 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6a0b3c6c5c18d97a535dbfedef803c0452feb4a9321019f55b3e7cb0b127b5"} err="failed to get container status \"1d6a0b3c6c5c18d97a535dbfedef803c0452feb4a9321019f55b3e7cb0b127b5\": rpc error: code = NotFound desc = could not find container \"1d6a0b3c6c5c18d97a535dbfedef803c0452feb4a9321019f55b3e7cb0b127b5\": container with ID starting with 1d6a0b3c6c5c18d97a535dbfedef803c0452feb4a9321019f55b3e7cb0b127b5 not found: ID does not exist" Apr 17 16:43:58.395029 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.394978 2572 scope.go:117] "RemoveContainer" containerID="a6cecc7aa2a34ab682f4066cfc4cfdadf728b46e02562e929ed0739bef1a14d8" Apr 17 16:43:58.395241 ip-10-0-138-137 kubenswrapper[2572]: E0417 
16:43:58.395200 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6cecc7aa2a34ab682f4066cfc4cfdadf728b46e02562e929ed0739bef1a14d8\": container with ID starting with a6cecc7aa2a34ab682f4066cfc4cfdadf728b46e02562e929ed0739bef1a14d8 not found: ID does not exist" containerID="a6cecc7aa2a34ab682f4066cfc4cfdadf728b46e02562e929ed0739bef1a14d8" Apr 17 16:43:58.395308 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.395250 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6cecc7aa2a34ab682f4066cfc4cfdadf728b46e02562e929ed0739bef1a14d8"} err="failed to get container status \"a6cecc7aa2a34ab682f4066cfc4cfdadf728b46e02562e929ed0739bef1a14d8\": rpc error: code = NotFound desc = could not find container \"a6cecc7aa2a34ab682f4066cfc4cfdadf728b46e02562e929ed0739bef1a14d8\": container with ID starting with a6cecc7aa2a34ab682f4066cfc4cfdadf728b46e02562e929ed0739bef1a14d8 not found: ID does not exist" Apr 17 16:43:58.395308 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.395293 2572 scope.go:117] "RemoveContainer" containerID="edb56fe2417e0ead37179f31caefcd58aa4dc7e1ed4edbc28061dfdb0c26ffb1" Apr 17 16:43:58.395608 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:43:58.395576 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb56fe2417e0ead37179f31caefcd58aa4dc7e1ed4edbc28061dfdb0c26ffb1\": container with ID starting with edb56fe2417e0ead37179f31caefcd58aa4dc7e1ed4edbc28061dfdb0c26ffb1 not found: ID does not exist" containerID="edb56fe2417e0ead37179f31caefcd58aa4dc7e1ed4edbc28061dfdb0c26ffb1" Apr 17 16:43:58.395770 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.395609 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb56fe2417e0ead37179f31caefcd58aa4dc7e1ed4edbc28061dfdb0c26ffb1"} err="failed to get container status \"edb56fe2417e0ead37179f31caefcd58aa4dc7e1ed4edbc28061dfdb0c26ffb1\": rpc error: code = NotFound desc = could not find container \"edb56fe2417e0ead37179f31caefcd58aa4dc7e1ed4edbc28061dfdb0c26ffb1\": container with ID starting with edb56fe2417e0ead37179f31caefcd58aa4dc7e1ed4edbc28061dfdb0c26ffb1 not found: ID does not exist" Apr 17 16:43:58.397006 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:58.396989 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv"] Apr 17 16:43:59.228705 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:43:59.228668 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" path="/var/lib/kubelet/pods/2806963c-78df-401c-b5e5-ab8166c86d7f/volumes" Apr 17 16:44:01.328523 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:01.328431 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" podUID="671eac1a-ae21-4662-83f9-a14856f158a1" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:44:01.369450 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:01.369426 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:44:01.369976 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:01.369953 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 16:44:03.379185 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:03.379161 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:44:03.379570 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:03.379545 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 17 16:44:06.248860 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:06.248828 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" Apr 17 16:44:06.249234 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:06.248931 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" Apr 17 16:44:06.326468 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:06.326423 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" podUID="671eac1a-ae21-4662-83f9-a14856f158a1" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:44:06.326674 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:06.326520 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" Apr 17 16:44:11.327699 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:11.327640 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" podUID="671eac1a-ae21-4662-83f9-a14856f158a1" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:44:11.370234 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:11.370200 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 16:44:13.380060 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:13.380022 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 17 16:44:16.326700 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:16.326638 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" podUID="671eac1a-ae21-4662-83f9-a14856f158a1" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:44:18.601664 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.601610 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6"] Apr 17 16:44:18.602114 ip-10-0-138-137 
kubenswrapper[2572]: I0417 16:44:18.602088 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="storage-initializer" Apr 17 16:44:18.602114 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.602110 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="storage-initializer" Apr 17 16:44:18.602252 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.602130 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kserve-container" Apr 17 16:44:18.602252 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.602139 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kserve-container" Apr 17 16:44:18.602252 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.602150 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kube-rbac-proxy" Apr 17 16:44:18.602252 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.602157 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kube-rbac-proxy" Apr 17 16:44:18.602252 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.602247 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kserve-container" Apr 17 16:44:18.602423 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.602259 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2806963c-78df-401c-b5e5-ab8166c86d7f" containerName="kube-rbac-proxy" Apr 17 16:44:18.606606 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.606585 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" Apr 17 16:44:18.608938 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.608918 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-ce4d4-kube-rbac-proxy-sar-config\"" Apr 17 16:44:18.609046 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.608919 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-ce4d4-serving-cert\"" Apr 17 16:44:18.613286 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.613262 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6"] Apr 17 16:44:18.678088 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.678056 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbc24bb6-ef54-446f-b031-2149f3fe5158-openshift-service-ca-bundle\") pod \"switch-graph-ce4d4-66ccfb86c4-sd5b6\" (UID: \"cbc24bb6-ef54-446f-b031-2149f3fe5158\") " pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" Apr 17 16:44:18.678249 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.678098 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls\") pod \"switch-graph-ce4d4-66ccfb86c4-sd5b6\" (UID: \"cbc24bb6-ef54-446f-b031-2149f3fe5158\") " pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" Apr 17 16:44:18.779299 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.779266 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls\") pod \"switch-graph-ce4d4-66ccfb86c4-sd5b6\" (UID: \"cbc24bb6-ef54-446f-b031-2149f3fe5158\") " pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" Apr 17 16:44:18.779460 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.779365 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbc24bb6-ef54-446f-b031-2149f3fe5158-openshift-service-ca-bundle\") pod \"switch-graph-ce4d4-66ccfb86c4-sd5b6\" (UID: \"cbc24bb6-ef54-446f-b031-2149f3fe5158\") " pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" Apr 17 16:44:18.779460 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:44:18.779424 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-ce4d4-serving-cert: secret "switch-graph-ce4d4-serving-cert" not found Apr 17 16:44:18.779534 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:44:18.779496 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls podName:cbc24bb6-ef54-446f-b031-2149f3fe5158 nodeName:}" failed. No retries permitted until 2026-04-17 16:44:19.279475705 +0000 UTC m=+792.628105253 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls") pod "switch-graph-ce4d4-66ccfb86c4-sd5b6" (UID: "cbc24bb6-ef54-446f-b031-2149f3fe5158") : secret "switch-graph-ce4d4-serving-cert" not found Apr 17 16:44:18.780023 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:18.780005 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbc24bb6-ef54-446f-b031-2149f3fe5158-openshift-service-ca-bundle\") pod \"switch-graph-ce4d4-66ccfb86c4-sd5b6\" (UID: \"cbc24bb6-ef54-446f-b031-2149f3fe5158\") " pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" Apr 17 16:44:19.282172 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:19.282140 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls\") pod \"switch-graph-ce4d4-66ccfb86c4-sd5b6\" (UID: \"cbc24bb6-ef54-446f-b031-2149f3fe5158\") " pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" Apr 17 16:44:19.284403 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:19.284381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls\") pod \"switch-graph-ce4d4-66ccfb86c4-sd5b6\" (UID: \"cbc24bb6-ef54-446f-b031-2149f3fe5158\") " pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" Apr 17 16:44:19.517484 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:19.517443 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" Apr 17 16:44:19.637509 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:19.637477 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6"] Apr 17 16:44:20.437718 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:20.437681 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" event={"ID":"cbc24bb6-ef54-446f-b031-2149f3fe5158","Type":"ContainerStarted","Data":"1ce2fefde51a10b19e6c171106026a9c25d796f13e9a2cd530f669d7a3f3a852"} Apr 17 16:44:20.437884 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:20.437725 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" event={"ID":"cbc24bb6-ef54-446f-b031-2149f3fe5158","Type":"ContainerStarted","Data":"262b1f70156d480797c0dab474df5429735a171827867863724dad05b347567d"} Apr 17 16:44:20.437884 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:20.437779 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" Apr 17 16:44:20.454840 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:20.454800 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" podStartSLOduration=2.454787183 podStartE2EDuration="2.454787183s" podCreationTimestamp="2026-04-17 16:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:44:20.452806203 +0000 UTC m=+793.801435753" watchObservedRunningTime="2026-04-17 16:44:20.454787183 +0000 UTC m=+793.803416731" Apr 17 16:44:21.326968 ip-10-0-138-137 
kubenswrapper[2572]: I0417 16:44:21.326926 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" podUID="671eac1a-ae21-4662-83f9-a14856f158a1" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:44:21.369945 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:21.369916 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 16:44:23.380241 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:23.380195 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 17 16:44:24.253325 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:44:24.253286 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671eac1a_ae21_4662_83f9_a14856f158a1.slice/crio-conmon-34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671eac1a_ae21_4662_83f9_a14856f158a1.slice/crio-34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c.scope\": RecentStats: unable to find data in memory cache]" Apr 17 16:44:24.253325 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:44:24.253307 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671eac1a_ae21_4662_83f9_a14856f158a1.slice/crio-conmon-34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671eac1a_ae21_4662_83f9_a14856f158a1.slice/crio-34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c.scope\": RecentStats: unable to find data in memory cache]" Apr 17 16:44:24.368468 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.368440 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" Apr 17 16:44:24.423517 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.423476 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/671eac1a-ae21-4662-83f9-a14856f158a1-proxy-tls\") pod \"671eac1a-ae21-4662-83f9-a14856f158a1\" (UID: \"671eac1a-ae21-4662-83f9-a14856f158a1\") " Apr 17 16:44:24.424022 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.423548 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671eac1a-ae21-4662-83f9-a14856f158a1-openshift-service-ca-bundle\") pod \"671eac1a-ae21-4662-83f9-a14856f158a1\" (UID: \"671eac1a-ae21-4662-83f9-a14856f158a1\") " Apr 17 16:44:24.424022 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.423967 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671eac1a-ae21-4662-83f9-a14856f158a1-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "671eac1a-ae21-4662-83f9-a14856f158a1" (UID: "671eac1a-ae21-4662-83f9-a14856f158a1"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:44:24.425694 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.425664 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671eac1a-ae21-4662-83f9-a14856f158a1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "671eac1a-ae21-4662-83f9-a14856f158a1" (UID: "671eac1a-ae21-4662-83f9-a14856f158a1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:44:24.455061 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.455020 2572 generic.go:358] "Generic (PLEG): container finished" podID="671eac1a-ae21-4662-83f9-a14856f158a1" containerID="34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c" exitCode=0 Apr 17 16:44:24.455168 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.455108 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" Apr 17 16:44:24.455168 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.455107 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" event={"ID":"671eac1a-ae21-4662-83f9-a14856f158a1","Type":"ContainerDied","Data":"34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c"} Apr 17 16:44:24.455168 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.455151 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8" event={"ID":"671eac1a-ae21-4662-83f9-a14856f158a1","Type":"ContainerDied","Data":"14fe2e48485c6dc5fb03de9be9932f1ec6b1cece6eb2ab322ea69131777cdb97"} Apr 17 16:44:24.455279 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.455170 2572 scope.go:117] "RemoveContainer" containerID="34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c" Apr 17 16:44:24.463624 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.463608 2572 scope.go:117] "RemoveContainer" containerID="34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c" Apr 17 16:44:24.463989 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:44:24.463956 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c\": container with ID starting with 34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c not found: ID does not exist" containerID="34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c" Apr 17 16:44:24.464214 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.463988 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c"} err="failed to get container status \"34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c\": rpc error: code = NotFound desc = could not find container \"34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c\": container with ID starting with 34602c8e65404f7db3a44ee67549523a4390268cc655ba2a08a1f625424fac3c not found: ID does not exist" Apr 17 16:44:24.477341 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.477286 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8"] Apr 17 16:44:24.483579 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.483558 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8"] Apr 17 16:44:24.524453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.524411 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671eac1a-ae21-4662-83f9-a14856f158a1-openshift-service-ca-bundle\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:44:24.524453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:24.524445 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/671eac1a-ae21-4662-83f9-a14856f158a1-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:44:25.227106 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:25.227064 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="671eac1a-ae21-4662-83f9-a14856f158a1" path="/var/lib/kubelet/pods/671eac1a-ae21-4662-83f9-a14856f158a1/volumes" Apr 
17 16:44:26.445443 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:26.445417 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" Apr 17 16:44:31.370763 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:31.370719 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 16:44:33.379630 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:33.379591 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 17 16:44:41.370609 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:41.370572 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:44:43.380510 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:43.380478 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:44:54.370552 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.370519 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx"] Apr 17 16:44:54.371177 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.370882 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="671eac1a-ae21-4662-83f9-a14856f158a1" containerName="model-chainer" Apr 17 16:44:54.371177 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.370895 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="671eac1a-ae21-4662-83f9-a14856f158a1" containerName="model-chainer" Apr 17 16:44:54.371177 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.370945 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="671eac1a-ae21-4662-83f9-a14856f158a1" containerName="model-chainer" Apr 17 16:44:54.374072 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.374054 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" Apr 17 16:44:54.376392 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.376364 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-8b822-serving-cert\"" Apr 17 16:44:54.376504 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.376443 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-8b822-kube-rbac-proxy-sar-config\"" Apr 17 16:44:54.381478 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.381456 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx"] Apr 17 16:44:54.460554 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.460519 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/174c6520-09a4-48e5-a2b4-4ed9e1d3f556-openshift-service-ca-bundle\") pod \"sequence-graph-8b822-89658595d-trzvx\" (UID: \"174c6520-09a4-48e5-a2b4-4ed9e1d3f556\") " pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" Apr 17 16:44:54.460754 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.460578 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/174c6520-09a4-48e5-a2b4-4ed9e1d3f556-proxy-tls\") pod \"sequence-graph-8b822-89658595d-trzvx\" (UID: \"174c6520-09a4-48e5-a2b4-4ed9e1d3f556\") " pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" Apr 17 16:44:54.561755 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.561723 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/174c6520-09a4-48e5-a2b4-4ed9e1d3f556-openshift-service-ca-bundle\") pod \"sequence-graph-8b822-89658595d-trzvx\" (UID: \"174c6520-09a4-48e5-a2b4-4ed9e1d3f556\") " pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" Apr 17 16:44:54.561911 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.561764 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/174c6520-09a4-48e5-a2b4-4ed9e1d3f556-proxy-tls\") pod \"sequence-graph-8b822-89658595d-trzvx\" (UID: \"174c6520-09a4-48e5-a2b4-4ed9e1d3f556\") " pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" Apr 17 16:44:54.562340 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.562320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/174c6520-09a4-48e5-a2b4-4ed9e1d3f556-openshift-service-ca-bundle\") pod \"sequence-graph-8b822-89658595d-trzvx\" (UID: \"174c6520-09a4-48e5-a2b4-4ed9e1d3f556\") " pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" Apr 17 16:44:54.564052 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.564030 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/174c6520-09a4-48e5-a2b4-4ed9e1d3f556-proxy-tls\") pod \"sequence-graph-8b822-89658595d-trzvx\" (UID: \"174c6520-09a4-48e5-a2b4-4ed9e1d3f556\") " pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" Apr 17 16:44:54.684505 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.684429 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" Apr 17 16:44:54.825907 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:54.825878 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx"] Apr 17 16:44:54.828720 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:44:54.828689 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod174c6520_09a4_48e5_a2b4_4ed9e1d3f556.slice/crio-5d008f6c8585b62e7fc7b28051081dd489d9fd77d07d98ae59c09837c2510d5d WatchSource:0}: Error finding container 5d008f6c8585b62e7fc7b28051081dd489d9fd77d07d98ae59c09837c2510d5d: Status 404 returned error can't find the container with id 5d008f6c8585b62e7fc7b28051081dd489d9fd77d07d98ae59c09837c2510d5d Apr 17 16:44:55.551945 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:55.551911 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" event={"ID":"174c6520-09a4-48e5-a2b4-4ed9e1d3f556","Type":"ContainerStarted","Data":"f6eee4f43fbd2e3471937e1bebdc380068ffd9bc80f12453de0a8566635bcb55"} Apr 17 16:44:55.551945 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:55.551948 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" event={"ID":"174c6520-09a4-48e5-a2b4-4ed9e1d3f556","Type":"ContainerStarted","Data":"5d008f6c8585b62e7fc7b28051081dd489d9fd77d07d98ae59c09837c2510d5d"} Apr 17 16:44:55.552364 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:55.552020 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" Apr 17 16:44:55.568232 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:44:55.568187 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" podStartSLOduration=1.568176115 podStartE2EDuration="1.568176115s" podCreationTimestamp="2026-04-17 16:44:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:44:55.566668717 +0000 UTC m=+828.915298260" watchObservedRunningTime="2026-04-17 16:44:55.568176115 +0000 UTC m=+828.916805665" Apr 17 16:45:01.561096 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:45:01.561060 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" Apr 17 16:52:33.277980 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.277945 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6"] Apr 17 16:52:33.321746 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:33.321713 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-ce4d4-serving-cert: secret "switch-graph-ce4d4-serving-cert" not found Apr 17 16:52:33.321931 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:33.321800 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls podName:cbc24bb6-ef54-446f-b031-2149f3fe5158 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:33.821779096 +0000 UTC m=+1287.170408624 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls") pod "switch-graph-ce4d4-66ccfb86c4-sd5b6" (UID: "cbc24bb6-ef54-446f-b031-2149f3fe5158") : secret "switch-graph-ce4d4-serving-cert" not found Apr 17 16:52:33.404890 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.404857 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr"] Apr 17 16:52:33.405158 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.405132 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kserve-container" containerID="cri-o://311ba39526df52d34918a90d0097de0f02525e31141dcb4025b27968db8da1ad" gracePeriod=30 Apr 17 16:52:33.405231 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.405191 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kube-rbac-proxy" containerID="cri-o://54c54ec2a72c624c92fe1d1d4efdee1365e1badfd9862b8e4a93c438474a3e72" gracePeriod=30 Apr 17 16:52:33.555918 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.555828 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh"] Apr 17 16:52:33.556156 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.556128 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kserve-container" containerID="cri-o://f7094a432a61067dec2680dbb95b34b93d2378b92088ba790fa9a9be3df8a7ff" gracePeriod=30 Apr 17 16:52:33.556251 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.556145 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kube-rbac-proxy" containerID="cri-o://80d8c0065fe702acccc1f1177fea2fe07d017c9ade66be66548479c8db6aa990" gracePeriod=30 Apr 17 16:52:33.572277 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.572250 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b"] Apr 17 16:52:33.575537 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.575520 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:52:33.577962 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.577939 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-ac602-predictor-serving-cert\"" Apr 17 16:52:33.578123 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.577946 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-ac602-kube-rbac-proxy-sar-config\"" Apr 17 16:52:33.592238 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.592184 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b"] Apr 17 16:52:33.624415 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.624377 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-ac602-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8eef08cd-81aa-4be2-9abc-b72a91020ae2-success-200-isvc-ac602-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ac602-predictor-757bd75446-krh9b\" (UID: \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\") " pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:52:33.624584 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.624443 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8eef08cd-81aa-4be2-9abc-b72a91020ae2-proxy-tls\") pod \"success-200-isvc-ac602-predictor-757bd75446-krh9b\" (UID: \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\") " pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:52:33.624584 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.624503 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzpbx\" (UniqueName: \"kubernetes.io/projected/8eef08cd-81aa-4be2-9abc-b72a91020ae2-kube-api-access-qzpbx\") pod \"success-200-isvc-ac602-predictor-757bd75446-krh9b\" (UID: \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\") " pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:52:33.635491 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.635458 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w"] Apr 17 16:52:33.638850 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.638828 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 16:52:33.641044 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.641019 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-ac602-kube-rbac-proxy-sar-config\"" Apr 17 16:52:33.641153 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.641130 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-ac602-predictor-serving-cert\"" Apr 17 16:52:33.648091 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.648064 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w"] Apr 17 16:52:33.725309 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.725271 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-ac602-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fd4eaf4-3bae-47ab-853c-dc736ae40134-error-404-isvc-ac602-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w\" (UID: \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\") " pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 16:52:33.725309 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.725317 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw76q\" (UniqueName: \"kubernetes.io/projected/2fd4eaf4-3bae-47ab-853c-dc736ae40134-kube-api-access-zw76q\") pod \"error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w\" (UID: \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\") " pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 16:52:33.725568 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.725354 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-ac602-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8eef08cd-81aa-4be2-9abc-b72a91020ae2-success-200-isvc-ac602-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ac602-predictor-757bd75446-krh9b\" (UID: \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\") " pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:52:33.725568 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.725451 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8eef08cd-81aa-4be2-9abc-b72a91020ae2-proxy-tls\") pod \"success-200-isvc-ac602-predictor-757bd75446-krh9b\" (UID: \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\") " pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:52:33.725568 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.725513 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzpbx\" (UniqueName: \"kubernetes.io/projected/8eef08cd-81aa-4be2-9abc-b72a91020ae2-kube-api-access-qzpbx\") pod \"success-200-isvc-ac602-predictor-757bd75446-krh9b\" (UID: \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\") " pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:52:33.725748 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.725577 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fd4eaf4-3bae-47ab-853c-dc736ae40134-proxy-tls\") pod 
\"error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w\" (UID: \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\") " pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 16:52:33.725748 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:33.725633 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-ac602-predictor-serving-cert: secret "success-200-isvc-ac602-predictor-serving-cert" not found Apr 17 16:52:33.725748 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:33.725720 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eef08cd-81aa-4be2-9abc-b72a91020ae2-proxy-tls podName:8eef08cd-81aa-4be2-9abc-b72a91020ae2 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:34.225698565 +0000 UTC m=+1287.574328106 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8eef08cd-81aa-4be2-9abc-b72a91020ae2-proxy-tls") pod "success-200-isvc-ac602-predictor-757bd75446-krh9b" (UID: "8eef08cd-81aa-4be2-9abc-b72a91020ae2") : secret "success-200-isvc-ac602-predictor-serving-cert" not found Apr 17 16:52:33.726128 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.726108 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-ac602-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8eef08cd-81aa-4be2-9abc-b72a91020ae2-success-200-isvc-ac602-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ac602-predictor-757bd75446-krh9b\" (UID: \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\") " pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:52:33.738116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.738095 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzpbx\" (UniqueName: \"kubernetes.io/projected/8eef08cd-81aa-4be2-9abc-b72a91020ae2-kube-api-access-qzpbx\") pod \"success-200-isvc-ac602-predictor-757bd75446-krh9b\" (UID: \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\") " pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:52:33.826557 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.826458 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-ac602-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fd4eaf4-3bae-47ab-853c-dc736ae40134-error-404-isvc-ac602-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w\" (UID: \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\") " pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 16:52:33.826557 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.826502 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw76q\" (UniqueName: \"kubernetes.io/projected/2fd4eaf4-3bae-47ab-853c-dc736ae40134-kube-api-access-zw76q\") pod \"error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w\" (UID: \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\") " pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 16:52:33.826822 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.826592 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fd4eaf4-3bae-47ab-853c-dc736ae40134-proxy-tls\") pod \"error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w\" (UID: \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\") " pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 
16:52:33.826822 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:33.826725 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-ac602-predictor-serving-cert: secret "error-404-isvc-ac602-predictor-serving-cert" not found Apr 17 16:52:33.826822 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:33.826785 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fd4eaf4-3bae-47ab-853c-dc736ae40134-proxy-tls podName:2fd4eaf4-3bae-47ab-853c-dc736ae40134 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:34.326767412 +0000 UTC m=+1287.675396940 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2fd4eaf4-3bae-47ab-853c-dc736ae40134-proxy-tls") pod "error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" (UID: "2fd4eaf4-3bae-47ab-853c-dc736ae40134") : secret "error-404-isvc-ac602-predictor-serving-cert" not found Apr 17 16:52:33.826988 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:33.826897 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-ce4d4-serving-cert: secret "switch-graph-ce4d4-serving-cert" not found Apr 17 16:52:33.826988 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:33.826953 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls podName:cbc24bb6-ef54-446f-b031-2149f3fe5158 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:34.826940689 +0000 UTC m=+1288.175570217 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls") pod "switch-graph-ce4d4-66ccfb86c4-sd5b6" (UID: "cbc24bb6-ef54-446f-b031-2149f3fe5158") : secret "switch-graph-ce4d4-serving-cert" not found Apr 17 16:52:33.827256 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.827223 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-ac602-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fd4eaf4-3bae-47ab-853c-dc736ae40134-error-404-isvc-ac602-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w\" (UID: \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\") " pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 16:52:33.832411 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.832384 2572 generic.go:358] "Generic (PLEG): container finished" podID="c530e539-fe55-47db-ae30-3447fd68f304" containerID="80d8c0065fe702acccc1f1177fea2fe07d017c9ade66be66548479c8db6aa990" exitCode=2 Apr 17 16:52:33.832543 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.832443 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" event={"ID":"c530e539-fe55-47db-ae30-3447fd68f304","Type":"ContainerDied","Data":"80d8c0065fe702acccc1f1177fea2fe07d017c9ade66be66548479c8db6aa990"} Apr 17 16:52:33.834201 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.834177 2572 generic.go:358] "Generic (PLEG): container finished" podID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerID="54c54ec2a72c624c92fe1d1d4efdee1365e1badfd9862b8e4a93c438474a3e72" exitCode=2 Apr 17 16:52:33.834308 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.834204 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" 
event={"ID":"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6","Type":"ContainerDied","Data":"54c54ec2a72c624c92fe1d1d4efdee1365e1badfd9862b8e4a93c438474a3e72"} Apr 17 16:52:33.834411 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.834393 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" podUID="cbc24bb6-ef54-446f-b031-2149f3fe5158" containerName="switch-graph-ce4d4" containerID="cri-o://1ce2fefde51a10b19e6c171106026a9c25d796f13e9a2cd530f669d7a3f3a852" gracePeriod=30 Apr 17 16:52:33.836372 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:33.836347 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw76q\" (UniqueName: \"kubernetes.io/projected/2fd4eaf4-3bae-47ab-853c-dc736ae40134-kube-api-access-zw76q\") pod \"error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w\" (UID: \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\") " pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 16:52:34.229892 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.229854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8eef08cd-81aa-4be2-9abc-b72a91020ae2-proxy-tls\") pod \"success-200-isvc-ac602-predictor-757bd75446-krh9b\" (UID: \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\") " pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:52:34.232260 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.232234 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8eef08cd-81aa-4be2-9abc-b72a91020ae2-proxy-tls\") pod \"success-200-isvc-ac602-predictor-757bd75446-krh9b\" (UID: \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\") " pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:52:34.330775 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.330735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fd4eaf4-3bae-47ab-853c-dc736ae40134-proxy-tls\") pod \"error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w\" (UID: \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\") " pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 16:52:34.333169 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.333146 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fd4eaf4-3bae-47ab-853c-dc736ae40134-proxy-tls\") pod \"error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w\" (UID: \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\") " pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 16:52:34.489875 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.489780 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:52:34.551481 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.551434 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 16:52:34.615739 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.615707 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b"] Apr 17 16:52:34.620046 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:52:34.620013 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eef08cd_81aa_4be2_9abc_b72a91020ae2.slice/crio-944d82d1b0c06d05d85748ae20efa1fa8e1816d6f72a57ed7368a0e2d77e4823 WatchSource:0}: Error finding container 944d82d1b0c06d05d85748ae20efa1fa8e1816d6f72a57ed7368a0e2d77e4823: Status 404 returned error can't find the container with id 944d82d1b0c06d05d85748ae20efa1fa8e1816d6f72a57ed7368a0e2d77e4823 Apr 17 16:52:34.621428 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.621404 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:52:34.684832 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.684798 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w"] Apr 17 16:52:34.688594 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:52:34.688566 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fd4eaf4_3bae_47ab_853c_dc736ae40134.slice/crio-5aec054d52a9f44bc8b494eb3d41ea06cc11f536ceabc5d9c933b458c1e50268 WatchSource:0}: Error finding container 5aec054d52a9f44bc8b494eb3d41ea06cc11f536ceabc5d9c933b458c1e50268: Status 404 returned error can't find the container with id 5aec054d52a9f44bc8b494eb3d41ea06cc11f536ceabc5d9c933b458c1e50268 Apr 17 16:52:34.835381 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:34.835348 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-ce4d4-serving-cert: secret "switch-graph-ce4d4-serving-cert" not found Apr 17 16:52:34.835513 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:34.835416 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls podName:cbc24bb6-ef54-446f-b031-2149f3fe5158 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:36.83539591 +0000 UTC m=+1290.184025438 (durationBeforeRetry 2s). 
Apr 17 16:52:34.839257 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.839231 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" event={"ID":"2fd4eaf4-3bae-47ab-853c-dc736ae40134","Type":"ContainerStarted","Data":"20a34d38a20b55b791d2d77de49f43922f034e658a7801268bedd29dd4aadf4f"}
Apr 17 16:52:34.839361 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.839267 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" event={"ID":"2fd4eaf4-3bae-47ab-853c-dc736ae40134","Type":"ContainerStarted","Data":"e164a07a835b7daee078a4099c7d65b9217795de873e396ba98e6849cd0ea777"}
Apr 17 16:52:34.839361 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.839280 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" event={"ID":"2fd4eaf4-3bae-47ab-853c-dc736ae40134","Type":"ContainerStarted","Data":"5aec054d52a9f44bc8b494eb3d41ea06cc11f536ceabc5d9c933b458c1e50268"}
Apr 17 16:52:34.840770 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.840743 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" event={"ID":"8eef08cd-81aa-4be2-9abc-b72a91020ae2","Type":"ContainerStarted","Data":"65a4893582a220b13579dd6f7dd6e902a7f098320f96bf7b7c70923e1cc3bdc2"}
Apr 17 16:52:34.840876 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.840778 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" event={"ID":"8eef08cd-81aa-4be2-9abc-b72a91020ae2","Type":"ContainerStarted","Data":"fc19769a475bc1a95cb835656deb9169e08fe12d8c3ea02dbc6133c01266541c"}
Apr 17 16:52:34.840876 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.840792 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" event={"ID":"8eef08cd-81aa-4be2-9abc-b72a91020ae2","Type":"ContainerStarted","Data":"944d82d1b0c06d05d85748ae20efa1fa8e1816d6f72a57ed7368a0e2d77e4823"}
Apr 17 16:52:34.840984 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.840948 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b"
Apr 17 16:52:34.857143 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:34.857096 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" podStartSLOduration=1.857080153 podStartE2EDuration="1.857080153s" podCreationTimestamp="2026-04-17 16:52:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:52:34.85570322 +0000 UTC m=+1288.204332778" watchObservedRunningTime="2026-04-17 16:52:34.857080153 +0000 UTC m=+1288.205709701"
Apr 17 16:52:35.844005 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:35.843966 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w"
Apr 17 16:52:35.844410 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:35.844030 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b"
Apr 17 16:52:35.845250 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:35.845225 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 17 16:52:35.860318 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:35.860277 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" podStartSLOduration=2.860263705 podStartE2EDuration="2.860263705s" podCreationTimestamp="2026-04-17 16:52:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:52:35.859318797 +0000 UTC m=+1289.207948345" watchObservedRunningTime="2026-04-17 16:52:35.860263705 +0000 UTC m=+1289.208893253"
Apr 17 16:52:36.243144 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.243104 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused"
Apr 17 16:52:36.243370 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.243111 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.25:8643/healthz\": dial tcp 10.132.0.25:8643: connect: connection refused"
Apr 17 16:52:36.247537 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.247507 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 17 16:52:36.247759 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.247735 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 17 16:52:36.444978 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.444945 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" podUID="cbc24bb6-ef54-446f-b031-2149f3fe5158" containerName="switch-graph-ce4d4" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:52:36.849202 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.849165 2572 generic.go:358] "Generic (PLEG): container finished" podID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerID="311ba39526df52d34918a90d0097de0f02525e31141dcb4025b27968db8da1ad" exitCode=0
Apr 17 16:52:36.849202 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.849196 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr"
Apr 17 16:52:36.849615 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.849226 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" event={"ID":"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6","Type":"ContainerDied","Data":"311ba39526df52d34918a90d0097de0f02525e31141dcb4025b27968db8da1ad"}
Apr 17 16:52:36.849615 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.849260 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr" event={"ID":"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6","Type":"ContainerDied","Data":"054b7a81abae9acfebf09ebe6b3b557eb1d01aa0b8b33c8da95825b6b81afa90"}
Apr 17 16:52:36.849615 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.849287 2572 scope.go:117] "RemoveContainer" containerID="54c54ec2a72c624c92fe1d1d4efdee1365e1badfd9862b8e4a93c438474a3e72"
Apr 17 16:52:36.849783 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.849608 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 17 16:52:36.849783 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.849683 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w"
Apr 17 16:52:36.850991 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.850966 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 17 16:52:36.853556 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:36.853538 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-ce4d4-serving-cert: secret "switch-graph-ce4d4-serving-cert" not found
Apr 17 16:52:36.853639 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:36.853592 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls podName:cbc24bb6-ef54-446f-b031-2149f3fe5158 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:40.853574362 +0000 UTC m=+1294.202203892 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls") pod "switch-graph-ce4d4-66ccfb86c4-sd5b6" (UID: "cbc24bb6-ef54-446f-b031-2149f3fe5158") : secret "switch-graph-ce4d4-serving-cert" not found
Apr 17 16:52:36.857395 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.857380 2572 scope.go:117] "RemoveContainer" containerID="311ba39526df52d34918a90d0097de0f02525e31141dcb4025b27968db8da1ad"
Apr 17 16:52:36.954596 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.954565 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-proxy-tls\") pod \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\" (UID: \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\") "
Apr 17 16:52:36.954741 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.954612 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqd96\" (UniqueName: \"kubernetes.io/projected/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-kube-api-access-wqd96\") pod \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\" (UID: \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\") "
Apr 17 16:52:36.954741 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.954715 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-ce4d4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-success-200-isvc-ce4d4-kube-rbac-proxy-sar-config\") pod \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\" (UID: \"b08b81ef-3e96-4cd5-ac27-13a36d7be0c6\") "
Apr 17 16:52:36.955109 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.955054 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-success-200-isvc-ce4d4-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-ce4d4-kube-rbac-proxy-sar-config") pod "b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" (UID: "b08b81ef-3e96-4cd5-ac27-13a36d7be0c6"). InnerVolumeSpecName "success-200-isvc-ce4d4-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:52:36.956758 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.956736 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-kube-api-access-wqd96" (OuterVolumeSpecName: "kube-api-access-wqd96") pod "b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" (UID: "b08b81ef-3e96-4cd5-ac27-13a36d7be0c6"). InnerVolumeSpecName "kube-api-access-wqd96". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:52:36.956926 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:36.956905 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" (UID: "b08b81ef-3e96-4cd5-ac27-13a36d7be0c6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:52:37.055417 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.055384 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-ce4d4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-success-200-isvc-ce4d4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:52:37.055417 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.055416 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:52:37.055616 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.055428 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wqd96\" (UniqueName: \"kubernetes.io/projected/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6-kube-api-access-wqd96\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:52:37.095475 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.095443 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh"
Apr 17 16:52:37.156268 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.156239 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c530e539-fe55-47db-ae30-3447fd68f304-proxy-tls\") pod \"c530e539-fe55-47db-ae30-3447fd68f304\" (UID: \"c530e539-fe55-47db-ae30-3447fd68f304\") "
Apr 17 16:52:37.156442 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.156299 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrtkc\" (UniqueName: \"kubernetes.io/projected/c530e539-fe55-47db-ae30-3447fd68f304-kube-api-access-zrtkc\") pod \"c530e539-fe55-47db-ae30-3447fd68f304\" (UID: \"c530e539-fe55-47db-ae30-3447fd68f304\") "
Apr 17 16:52:37.156442 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.156341 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-ce4d4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c530e539-fe55-47db-ae30-3447fd68f304-error-404-isvc-ce4d4-kube-rbac-proxy-sar-config\") pod \"c530e539-fe55-47db-ae30-3447fd68f304\" (UID: \"c530e539-fe55-47db-ae30-3447fd68f304\") "
Apr 17 16:52:37.156738 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.156706 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c530e539-fe55-47db-ae30-3447fd68f304-error-404-isvc-ce4d4-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-ce4d4-kube-rbac-proxy-sar-config") pod "c530e539-fe55-47db-ae30-3447fd68f304" (UID: "c530e539-fe55-47db-ae30-3447fd68f304"). InnerVolumeSpecName "error-404-isvc-ce4d4-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:52:37.158274 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.158247 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c530e539-fe55-47db-ae30-3447fd68f304-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c530e539-fe55-47db-ae30-3447fd68f304" (UID: "c530e539-fe55-47db-ae30-3447fd68f304"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:52:37.158347 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.158247 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c530e539-fe55-47db-ae30-3447fd68f304-kube-api-access-zrtkc" (OuterVolumeSpecName: "kube-api-access-zrtkc") pod "c530e539-fe55-47db-ae30-3447fd68f304" (UID: "c530e539-fe55-47db-ae30-3447fd68f304"). InnerVolumeSpecName "kube-api-access-zrtkc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:52:37.257032 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.256994 2572 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-ce4d4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c530e539-fe55-47db-ae30-3447fd68f304-error-404-isvc-ce4d4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:52:37.257032 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.257026 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c530e539-fe55-47db-ae30-3447fd68f304-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:52:37.257032 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.257036 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zrtkc\" (UniqueName: \"kubernetes.io/projected/c530e539-fe55-47db-ae30-3447fd68f304-kube-api-access-zrtkc\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 16:52:37.853240 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.853205 2572 generic.go:358] "Generic (PLEG): container finished" podID="c530e539-fe55-47db-ae30-3447fd68f304" containerID="f7094a432a61067dec2680dbb95b34b93d2378b92088ba790fa9a9be3df8a7ff" exitCode=0
Apr 17 16:52:37.853732 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.853284 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh"
Apr 17 16:52:37.853732 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.853288 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" event={"ID":"c530e539-fe55-47db-ae30-3447fd68f304","Type":"ContainerDied","Data":"f7094a432a61067dec2680dbb95b34b93d2378b92088ba790fa9a9be3df8a7ff"}
Apr 17 16:52:37.853732 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.853325 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh" event={"ID":"c530e539-fe55-47db-ae30-3447fd68f304","Type":"ContainerDied","Data":"2df9e9cf4bda669b3835601b6e75be16d6c4079052bc3598982e1cc11d500cd1"}
Apr 17 16:52:37.853732 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.853341 2572 scope.go:117] "RemoveContainer" containerID="80d8c0065fe702acccc1f1177fea2fe07d017c9ade66be66548479c8db6aa990"
Apr 17 16:52:37.854540 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.854242 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr"
Apr 17 16:52:37.854742 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.854720 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 17 16:52:37.861218 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.861199 2572 scope.go:117] "RemoveContainer" containerID="f7094a432a61067dec2680dbb95b34b93d2378b92088ba790fa9a9be3df8a7ff"
Apr 17 16:52:37.871059 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.870998 2572 scope.go:117] "RemoveContainer" containerID="80d8c0065fe702acccc1f1177fea2fe07d017c9ade66be66548479c8db6aa990"
Apr 17 16:52:37.871329 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:37.871309 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80d8c0065fe702acccc1f1177fea2fe07d017c9ade66be66548479c8db6aa990\": container with ID starting with 80d8c0065fe702acccc1f1177fea2fe07d017c9ade66be66548479c8db6aa990 not found: ID does not exist" containerID="80d8c0065fe702acccc1f1177fea2fe07d017c9ade66be66548479c8db6aa990"
Apr 17 16:52:37.871417 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.871342 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d8c0065fe702acccc1f1177fea2fe07d017c9ade66be66548479c8db6aa990"} err="failed to get container status \"80d8c0065fe702acccc1f1177fea2fe07d017c9ade66be66548479c8db6aa990\": rpc error: code = NotFound desc = could not find container \"80d8c0065fe702acccc1f1177fea2fe07d017c9ade66be66548479c8db6aa990\": container with ID starting with 80d8c0065fe702acccc1f1177fea2fe07d017c9ade66be66548479c8db6aa990 not found: ID does not exist"
Apr 17 16:52:37.871417 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.871367 2572 scope.go:117] "RemoveContainer" containerID="f7094a432a61067dec2680dbb95b34b93d2378b92088ba790fa9a9be3df8a7ff"
Apr 17 16:52:37.871721 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:37.871699 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7094a432a61067dec2680dbb95b34b93d2378b92088ba790fa9a9be3df8a7ff\": container with ID starting with f7094a432a61067dec2680dbb95b34b93d2378b92088ba790fa9a9be3df8a7ff not found: ID does not exist" containerID="f7094a432a61067dec2680dbb95b34b93d2378b92088ba790fa9a9be3df8a7ff"
Apr 17 16:52:37.871777 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.871729 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7094a432a61067dec2680dbb95b34b93d2378b92088ba790fa9a9be3df8a7ff"} err="failed to get container status \"f7094a432a61067dec2680dbb95b34b93d2378b92088ba790fa9a9be3df8a7ff\": rpc error: code = NotFound desc = could not find container \"f7094a432a61067dec2680dbb95b34b93d2378b92088ba790fa9a9be3df8a7ff\": container with ID starting with f7094a432a61067dec2680dbb95b34b93d2378b92088ba790fa9a9be3df8a7ff not found: ID does not exist"
Apr 17 16:52:37.872771 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.872754 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh"]
Apr 17 16:52:37.880249 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.880228 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh"]
Apr 17 16:52:37.918635 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.918559 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr"]
Apr 17 16:52:37.922868 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:37.922838 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr"]
Apr 17 16:52:39.227985 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:39.227942 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" path="/var/lib/kubelet/pods/b08b81ef-3e96-4cd5-ac27-13a36d7be0c6/volumes"
Apr 17 16:52:39.228548 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:39.228527 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c530e539-fe55-47db-ae30-3447fd68f304" path="/var/lib/kubelet/pods/c530e539-fe55-47db-ae30-3447fd68f304/volumes"
Apr 17 16:52:40.889358 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:40.889310 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-ce4d4-serving-cert: secret "switch-graph-ce4d4-serving-cert" not found
Apr 17 16:52:40.889865 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:40.889397 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls podName:cbc24bb6-ef54-446f-b031-2149f3fe5158 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:48.889377001 +0000 UTC m=+1302.238006545 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls") pod "switch-graph-ce4d4-66ccfb86c4-sd5b6" (UID: "cbc24bb6-ef54-446f-b031-2149f3fe5158") : secret "switch-graph-ce4d4-serving-cert" not found
Apr 17 16:52:41.444322 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:41.444277 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" podUID="cbc24bb6-ef54-446f-b031-2149f3fe5158" containerName="switch-graph-ce4d4" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:52:41.854176 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:41.854147 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b"
Apr 17 16:52:41.854695 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:41.854640 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 17 16:52:42.858667 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:42.858619 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w"
Apr 17 16:52:42.859120 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:42.859094 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 17 16:52:46.445006 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:46.444962 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" podUID="cbc24bb6-ef54-446f-b031-2149f3fe5158" containerName="switch-graph-ce4d4" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:52:46.445387 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:46.445071 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6"
Apr 17 16:52:48.960789 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:48.960757 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-ce4d4-serving-cert: secret "switch-graph-ce4d4-serving-cert" not found
Apr 17 16:52:48.961210 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:52:48.960829 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls podName:cbc24bb6-ef54-446f-b031-2149f3fe5158 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:04.960812802 +0000 UTC m=+1318.309442329 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls") pod "switch-graph-ce4d4-66ccfb86c4-sd5b6" (UID: "cbc24bb6-ef54-446f-b031-2149f3fe5158") : secret "switch-graph-ce4d4-serving-cert" not found
Apr 17 16:52:51.449717 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:51.449623 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" podUID="cbc24bb6-ef54-446f-b031-2149f3fe5158" containerName="switch-graph-ce4d4" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:52:51.855497 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:51.855402 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 17 16:52:52.859335 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:52.859297 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 17 16:52:56.444900 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:52:56.444852 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" podUID="cbc24bb6-ef54-446f-b031-2149f3fe5158" containerName="switch-graph-ce4d4" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:53:01.444253 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:01.444215 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" podUID="cbc24bb6-ef54-446f-b031-2149f3fe5158" containerName="switch-graph-ce4d4" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:53:01.854790 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:01.854746 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 16:53:02.859568 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:02.859527 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 17 16:53:03.935036 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:03.935003 2572 generic.go:358] "Generic (PLEG): container finished" podID="cbc24bb6-ef54-446f-b031-2149f3fe5158" containerID="1ce2fefde51a10b19e6c171106026a9c25d796f13e9a2cd530f669d7a3f3a852" exitCode=0 Apr 17 16:53:03.935386 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:03.935053 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" event={"ID":"cbc24bb6-ef54-446f-b031-2149f3fe5158","Type":"ContainerDied","Data":"1ce2fefde51a10b19e6c171106026a9c25d796f13e9a2cd530f669d7a3f3a852"} Apr 17 16:53:03.970624 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:03.970598 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" Apr 17 16:53:04.091600 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:04.091496 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls\") pod \"cbc24bb6-ef54-446f-b031-2149f3fe5158\" (UID: \"cbc24bb6-ef54-446f-b031-2149f3fe5158\") " Apr 17 16:53:04.091600 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:04.091556 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbc24bb6-ef54-446f-b031-2149f3fe5158-openshift-service-ca-bundle\") pod \"cbc24bb6-ef54-446f-b031-2149f3fe5158\" (UID: \"cbc24bb6-ef54-446f-b031-2149f3fe5158\") " Apr 17 16:53:04.091946 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:04.091922 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc24bb6-ef54-446f-b031-2149f3fe5158-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "cbc24bb6-ef54-446f-b031-2149f3fe5158" (UID: "cbc24bb6-ef54-446f-b031-2149f3fe5158"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:53:04.093575 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:04.093546 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cbc24bb6-ef54-446f-b031-2149f3fe5158" (UID: "cbc24bb6-ef54-446f-b031-2149f3fe5158"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:53:04.193235 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:04.193193 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbc24bb6-ef54-446f-b031-2149f3fe5158-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:04.193235 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:04.193231 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbc24bb6-ef54-446f-b031-2149f3fe5158-openshift-service-ca-bundle\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:04.940416 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:04.940376 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" event={"ID":"cbc24bb6-ef54-446f-b031-2149f3fe5158","Type":"ContainerDied","Data":"262b1f70156d480797c0dab474df5429735a171827867863724dad05b347567d"} Apr 17 16:53:04.940416 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:04.940388 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6" Apr 17 16:53:04.941002 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:04.940430 2572 scope.go:117] "RemoveContainer" containerID="1ce2fefde51a10b19e6c171106026a9c25d796f13e9a2cd530f669d7a3f3a852" Apr 17 16:53:04.961224 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:04.961197 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6"] Apr 17 16:53:04.966641 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:04.966620 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6"] Apr 17 16:53:05.226523 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:05.226475 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc24bb6-ef54-446f-b031-2149f3fe5158" path="/var/lib/kubelet/pods/cbc24bb6-ef54-446f-b031-2149f3fe5158/volumes" Apr 17 16:53:09.027959 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.027921 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx"] Apr 17 16:53:09.028336 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.028221 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" podUID="174c6520-09a4-48e5-a2b4-4ed9e1d3f556" containerName="sequence-graph-8b822" containerID="cri-o://f6eee4f43fbd2e3471937e1bebdc380068ffd9bc80f12453de0a8566635bcb55" gracePeriod=30 Apr 17 16:53:09.128785 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.128750 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm"] Apr 17 16:53:09.129174 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.129121 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kserve-container" containerID="cri-o://eddd6918a40f9814d394aa4e337367964506b3736e75550981aa9d4fe72afbe6" gracePeriod=30 Apr 17 16:53:09.129519 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.129456 2572 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kube-rbac-proxy" containerID="cri-o://fbc4ca633b639a8179f4d47b7c6130e98da382f682078a038c98d918c88a967b" gracePeriod=30 Apr 17 16:53:09.183709 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.183678 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr"] Apr 17 16:53:09.184191 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184168 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kube-rbac-proxy" Apr 17 16:53:09.184191 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184192 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kube-rbac-proxy" Apr 17 16:53:09.184376 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184206 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbc24bb6-ef54-446f-b031-2149f3fe5158" containerName="switch-graph-ce4d4" Apr 17 16:53:09.184376 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184214 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc24bb6-ef54-446f-b031-2149f3fe5158" containerName="switch-graph-ce4d4" Apr 17 16:53:09.184376 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184238 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kserve-container" Apr 17 16:53:09.184376 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184246 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kserve-container" Apr 17 16:53:09.184376 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184256 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kserve-container" Apr 17 16:53:09.184376 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184264 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kserve-container" Apr 17 16:53:09.184376 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184279 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kube-rbac-proxy" Apr 17 16:53:09.184376 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184289 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kube-rbac-proxy" Apr 17 16:53:09.184376 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184368 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbc24bb6-ef54-446f-b031-2149f3fe5158" containerName="switch-graph-ce4d4" Apr 17 16:53:09.184813 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184382 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kserve-container" Apr 17 16:53:09.184813 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184393 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kube-rbac-proxy" Apr 17 16:53:09.184813 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184406 2572 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="b08b81ef-3e96-4cd5-ac27-13a36d7be0c6" containerName="kube-rbac-proxy" Apr 17 16:53:09.184813 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.184416 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c530e539-fe55-47db-ae30-3447fd68f304" containerName="kserve-container" Apr 17 16:53:09.189567 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.189546 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:53:09.191778 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.191756 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d0d22-kube-rbac-proxy-sar-config\"" Apr 17 16:53:09.191778 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.191770 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d0d22-predictor-serving-cert\"" Apr 17 16:53:09.197893 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.197870 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr"] Apr 17 16:53:09.201939 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.201920 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5"] Apr 17 16:53:09.202195 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.202161 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kserve-container" containerID="cri-o://f9583fe4c2fe445ac9b9558063ad26626dbee815d6b798f2236db9f0b0e1a3e4" gracePeriod=30 Apr 17 16:53:09.202269 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.202243 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kube-rbac-proxy" containerID="cri-o://d2ffb0a7e61b2046abcf243e436b9e84095ac5372a7428e83fa993a058b79275" gracePeriod=30 Apr 17 16:53:09.229213 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.229149 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-proxy-tls\") pod \"success-200-isvc-d0d22-predictor-65f777864b-fb2rr\" (UID: \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\") " pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:53:09.229213 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.229204 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-d0d22-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-success-200-isvc-d0d22-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d0d22-predictor-65f777864b-fb2rr\" (UID: \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\") " pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:53:09.229553 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.229247 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d2wx\" (UniqueName: 
\"kubernetes.io/projected/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-kube-api-access-8d2wx\") pod \"success-200-isvc-d0d22-predictor-65f777864b-fb2rr\" (UID: \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\") " pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:53:09.274407 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.274372 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck"] Apr 17 16:53:09.277915 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.277894 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:53:09.280381 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.280323 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d0d22-predictor-serving-cert\"" Apr 17 16:53:09.280381 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.280348 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d0d22-kube-rbac-proxy-sar-config\"" Apr 17 16:53:09.286291 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.286270 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck"] Apr 17 16:53:09.329762 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.329732 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-proxy-tls\") pod \"success-200-isvc-d0d22-predictor-65f777864b-fb2rr\" (UID: \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\") " pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:53:09.329930 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.329774 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-d0d22-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-success-200-isvc-d0d22-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d0d22-predictor-65f777864b-fb2rr\" (UID: \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\") " pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:53:09.329930 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.329815 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-proxy-tls\") pod \"error-404-isvc-d0d22-predictor-76db56b8f-hz2ck\" (UID: \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\") " pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:53:09.329930 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.329846 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8d2wx\" (UniqueName: \"kubernetes.io/projected/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-kube-api-access-8d2wx\") pod \"success-200-isvc-d0d22-predictor-65f777864b-fb2rr\" (UID: \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\") " pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:53:09.330119 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.329964 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-d0d22-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-error-404-isvc-d0d22-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d0d22-predictor-76db56b8f-hz2ck\" (UID: \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\") " pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:53:09.330119 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.330020 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvxtj\" (UniqueName: \"kubernetes.io/projected/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-kube-api-access-wvxtj\") pod \"error-404-isvc-d0d22-predictor-76db56b8f-hz2ck\" (UID: \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\") " pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:53:09.330637 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.330614 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-d0d22-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-success-200-isvc-d0d22-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d0d22-predictor-65f777864b-fb2rr\" (UID: \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\") " pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:53:09.332205 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.332185 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-proxy-tls\") pod \"success-200-isvc-d0d22-predictor-65f777864b-fb2rr\" (UID: \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\") " pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:53:09.339540 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.339506 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d2wx\" (UniqueName: \"kubernetes.io/projected/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-kube-api-access-8d2wx\") pod \"success-200-isvc-d0d22-predictor-65f777864b-fb2rr\" (UID: \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\") " pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:53:09.431462 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.431426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-d0d22-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-error-404-isvc-d0d22-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d0d22-predictor-76db56b8f-hz2ck\" (UID: \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\") " pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:53:09.431675 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.431475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvxtj\" (UniqueName: \"kubernetes.io/projected/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-kube-api-access-wvxtj\") pod \"error-404-isvc-d0d22-predictor-76db56b8f-hz2ck\" (UID: \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\") " pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:53:09.431675 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.431528 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-proxy-tls\") pod \"error-404-isvc-d0d22-predictor-76db56b8f-hz2ck\" (UID: \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\") " 
pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:53:09.431808 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:53:09.431680 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-serving-cert: secret "error-404-isvc-d0d22-predictor-serving-cert" not found Apr 17 16:53:09.431808 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:53:09.431768 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-proxy-tls podName:e0f6cda1-d31a-47bf-bca2-34508ccf7d7e nodeName:}" failed. No retries permitted until 2026-04-17 16:53:09.931749968 +0000 UTC m=+1323.280379501 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-proxy-tls") pod "error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" (UID: "e0f6cda1-d31a-47bf-bca2-34508ccf7d7e") : secret "error-404-isvc-d0d22-predictor-serving-cert" not found Apr 17 16:53:09.432058 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.432035 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-d0d22-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-error-404-isvc-d0d22-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d0d22-predictor-76db56b8f-hz2ck\" (UID: \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\") " pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:53:09.440116 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.440086 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvxtj\" (UniqueName: \"kubernetes.io/projected/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-kube-api-access-wvxtj\") pod \"error-404-isvc-d0d22-predictor-76db56b8f-hz2ck\" (UID: \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\") " pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:53:09.500749 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.500715 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:53:09.620359 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.620318 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr"] Apr 17 16:53:09.624713 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:53:09.624677 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6f8ebf7_b46f_4015_b8d2_49f38ee1dceb.slice/crio-a628befeb1a8519cf28afcb784a8818b99e82c1a9b39dcd1501dd3cdd9beb645 WatchSource:0}: Error finding container a628befeb1a8519cf28afcb784a8818b99e82c1a9b39dcd1501dd3cdd9beb645: Status 404 returned error can't find the container with id a628befeb1a8519cf28afcb784a8818b99e82c1a9b39dcd1501dd3cdd9beb645 Apr 17 16:53:09.935998 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.935907 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-proxy-tls\") pod \"error-404-isvc-d0d22-predictor-76db56b8f-hz2ck\" (UID: \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\") " pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:53:09.938410 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.938383 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-proxy-tls\") pod \"error-404-isvc-d0d22-predictor-76db56b8f-hz2ck\" (UID: \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\") " pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:53:09.956996 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.956967 2572 generic.go:358] "Generic (PLEG): container finished" podID="c1417668-656a-445d-89b2-53c582a08559" containerID="d2ffb0a7e61b2046abcf243e436b9e84095ac5372a7428e83fa993a058b79275" exitCode=2 Apr 17 16:53:09.957115 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.957038 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" event={"ID":"c1417668-656a-445d-89b2-53c582a08559","Type":"ContainerDied","Data":"d2ffb0a7e61b2046abcf243e436b9e84095ac5372a7428e83fa993a058b79275"} Apr 17 16:53:09.958555 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.958534 2572 generic.go:358] "Generic (PLEG): container finished" podID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerID="fbc4ca633b639a8179f4d47b7c6130e98da382f682078a038c98d918c88a967b" exitCode=2 Apr 17 16:53:09.958680 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.958601 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" event={"ID":"432dc8ff-e1ad-414b-a836-0f16a600f03e","Type":"ContainerDied","Data":"fbc4ca633b639a8179f4d47b7c6130e98da382f682078a038c98d918c88a967b"} Apr 17 16:53:09.960042 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.960024 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" event={"ID":"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb","Type":"ContainerStarted","Data":"2551ac7d78a4435f9578db6bd1a942cde897c29337b4aa71c4b48e830eaca67f"} Apr 17 16:53:09.960122 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.960047 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" event={"ID":"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb","Type":"ContainerStarted","Data":"5a2b0fe91be070865f9fe021bda7927cd22460f70a55f91062d28bfbe507c686"} Apr 17 16:53:09.960122 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.960056 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" event={"ID":"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb","Type":"ContainerStarted","Data":"a628befeb1a8519cf28afcb784a8818b99e82c1a9b39dcd1501dd3cdd9beb645"} Apr 17 16:53:09.960199 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.960168 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:53:09.977356 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:09.977307 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" podStartSLOduration=0.977295709 podStartE2EDuration="977.295709ms" podCreationTimestamp="2026-04-17 16:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:53:09.97532172 +0000 UTC m=+1323.323951269" watchObservedRunningTime="2026-04-17 16:53:09.977295709 +0000 UTC m=+1323.325925257" Apr 17 16:53:10.189462 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:10.189358 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:53:10.314100 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:10.314076 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck"] Apr 17 16:53:10.316317 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:53:10.316286 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0f6cda1_d31a_47bf_bca2_34508ccf7d7e.slice/crio-1fc88a91bce45a6f8b59d1a56c1aa909828e8ec3e5672719b46eb019f2d31029 WatchSource:0}: Error finding container 1fc88a91bce45a6f8b59d1a56c1aa909828e8ec3e5672719b46eb019f2d31029: Status 404 returned error can't find the container with id 1fc88a91bce45a6f8b59d1a56c1aa909828e8ec3e5672719b46eb019f2d31029 Apr 17 16:53:10.965013 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:10.964972 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" event={"ID":"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e","Type":"ContainerStarted","Data":"69036c0542eec94e6d45ea1966c744e427a419c48ab5a7c4035f45842b8ba64f"} Apr 17 16:53:10.965013 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:10.965016 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" event={"ID":"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e","Type":"ContainerStarted","Data":"b3d78d1801b9c32ec29ae00067862253a0b8e7f5089af6d58bd91290a52825bf"} Apr 17 16:53:10.965291 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:10.965031 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" event={"ID":"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e","Type":"ContainerStarted","Data":"1fc88a91bce45a6f8b59d1a56c1aa909828e8ec3e5672719b46eb019f2d31029"} Apr 17 16:53:10.965512 
Apr 17 16:53:10.965512 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:10.965494 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr"
Apr 17 16:53:10.965602 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:10.965520 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck"
Apr 17 16:53:10.966556 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:10.966528 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 17 16:53:10.966673 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:10.966530 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 17 16:53:10.984439 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:10.984399 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" podStartSLOduration=1.98438322 podStartE2EDuration="1.98438322s" podCreationTimestamp="2026-04-17 16:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:53:10.982368437 +0000 UTC m=+1324.330997998" watchObservedRunningTime="2026-04-17 16:53:10.98438322 +0000 UTC m=+1324.333012769"
Apr 17 16:53:11.365831 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:11.365731 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.27:8643/healthz\": dial tcp 10.132.0.27:8643: connect: connection refused"
Apr 17 16:53:11.370555 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:11.370528 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 17 16:53:11.558963 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:11.558919 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" podUID="174c6520-09a4-48e5-a2b4-4ed9e1d3f556" containerName="sequence-graph-8b822" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:53:11.855117 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:11.855074 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 17 16:53:11.968837 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:11.968795 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 17 16:53:11.969024 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:11.968854 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck"
Apr 17 16:53:11.969166 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:11.969060 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 17 16:53:12.859848 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:12.859751 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 17 16:53:12.973339 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:12.973305 2572 generic.go:358] "Generic (PLEG): container finished" podID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerID="eddd6918a40f9814d394aa4e337367964506b3736e75550981aa9d4fe72afbe6" exitCode=0
Apr 17 16:53:12.973479 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:12.973381 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" event={"ID":"432dc8ff-e1ad-414b-a836-0f16a600f03e","Type":"ContainerDied","Data":"eddd6918a40f9814d394aa4e337367964506b3736e75550981aa9d4fe72afbe6"}
Apr 17 16:53:12.973787 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:12.973757 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:53:13.264022 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.263987 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/432dc8ff-e1ad-414b-a836-0f16a600f03e-proxy-tls\") pod \"432dc8ff-e1ad-414b-a836-0f16a600f03e\" (UID: \"432dc8ff-e1ad-414b-a836-0f16a600f03e\") " Apr 17 16:53:13.264182 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.264037 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4h84\" (UniqueName: \"kubernetes.io/projected/432dc8ff-e1ad-414b-a836-0f16a600f03e-kube-api-access-k4h84\") pod \"432dc8ff-e1ad-414b-a836-0f16a600f03e\" (UID: \"432dc8ff-e1ad-414b-a836-0f16a600f03e\") " Apr 17 16:53:13.264182 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.264102 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-8b822-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/432dc8ff-e1ad-414b-a836-0f16a600f03e-success-200-isvc-8b822-kube-rbac-proxy-sar-config\") pod \"432dc8ff-e1ad-414b-a836-0f16a600f03e\" (UID: \"432dc8ff-e1ad-414b-a836-0f16a600f03e\") " Apr 17 16:53:13.264506 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.264477 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/432dc8ff-e1ad-414b-a836-0f16a600f03e-success-200-isvc-8b822-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-8b822-kube-rbac-proxy-sar-config") pod "432dc8ff-e1ad-414b-a836-0f16a600f03e" (UID: "432dc8ff-e1ad-414b-a836-0f16a600f03e"). InnerVolumeSpecName "success-200-isvc-8b822-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:53:13.319440 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.319168 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/432dc8ff-e1ad-414b-a836-0f16a600f03e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "432dc8ff-e1ad-414b-a836-0f16a600f03e" (UID: "432dc8ff-e1ad-414b-a836-0f16a600f03e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:53:13.319440 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.319175 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/432dc8ff-e1ad-414b-a836-0f16a600f03e-kube-api-access-k4h84" (OuterVolumeSpecName: "kube-api-access-k4h84") pod "432dc8ff-e1ad-414b-a836-0f16a600f03e" (UID: "432dc8ff-e1ad-414b-a836-0f16a600f03e"). InnerVolumeSpecName "kube-api-access-k4h84". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:53:13.365449 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.365418 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-8b822-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/432dc8ff-e1ad-414b-a836-0f16a600f03e-success-200-isvc-8b822-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:13.365449 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.365445 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/432dc8ff-e1ad-414b-a836-0f16a600f03e-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:13.365449 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.365456 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k4h84\" (UniqueName: \"kubernetes.io/projected/432dc8ff-e1ad-414b-a836-0f16a600f03e-kube-api-access-k4h84\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:13.373524 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.373486 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.28:8643/healthz\": dial tcp 10.132.0.28:8643: connect: connection refused" Apr 17 16:53:13.380547 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.380524 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 17 16:53:13.844715 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.844694 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:53:13.868474 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.868447 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-8b822-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c1417668-656a-445d-89b2-53c582a08559-error-404-isvc-8b822-kube-rbac-proxy-sar-config\") pod \"c1417668-656a-445d-89b2-53c582a08559\" (UID: \"c1417668-656a-445d-89b2-53c582a08559\") " Apr 17 16:53:13.868913 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.868486 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dk88\" (UniqueName: \"kubernetes.io/projected/c1417668-656a-445d-89b2-53c582a08559-kube-api-access-7dk88\") pod \"c1417668-656a-445d-89b2-53c582a08559\" (UID: \"c1417668-656a-445d-89b2-53c582a08559\") " Apr 17 16:53:13.868913 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.868532 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1417668-656a-445d-89b2-53c582a08559-proxy-tls\") pod \"c1417668-656a-445d-89b2-53c582a08559\" (UID: \"c1417668-656a-445d-89b2-53c582a08559\") " Apr 17 16:53:13.868913 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.868778 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1417668-656a-445d-89b2-53c582a08559-error-404-isvc-8b822-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-8b822-kube-rbac-proxy-sar-config") pod "c1417668-656a-445d-89b2-53c582a08559" (UID: "c1417668-656a-445d-89b2-53c582a08559"). InnerVolumeSpecName "error-404-isvc-8b822-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:53:13.870674 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.870636 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1417668-656a-445d-89b2-53c582a08559-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c1417668-656a-445d-89b2-53c582a08559" (UID: "c1417668-656a-445d-89b2-53c582a08559"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:53:13.870674 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.870639 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1417668-656a-445d-89b2-53c582a08559-kube-api-access-7dk88" (OuterVolumeSpecName: "kube-api-access-7dk88") pod "c1417668-656a-445d-89b2-53c582a08559" (UID: "c1417668-656a-445d-89b2-53c582a08559"). InnerVolumeSpecName "kube-api-access-7dk88". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:53:13.970064 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.969971 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1417668-656a-445d-89b2-53c582a08559-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:13.970064 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.970003 2572 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-8b822-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c1417668-656a-445d-89b2-53c582a08559-error-404-isvc-8b822-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:13.970064 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.970014 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7dk88\" (UniqueName: \"kubernetes.io/projected/c1417668-656a-445d-89b2-53c582a08559-kube-api-access-7dk88\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:13.977942 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.977909 2572 generic.go:358] "Generic (PLEG): container finished" podID="c1417668-656a-445d-89b2-53c582a08559" containerID="f9583fe4c2fe445ac9b9558063ad26626dbee815d6b798f2236db9f0b0e1a3e4" exitCode=0 Apr 17 16:53:13.978071 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.977997 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" Apr 17 16:53:13.978071 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.977999 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" event={"ID":"c1417668-656a-445d-89b2-53c582a08559","Type":"ContainerDied","Data":"f9583fe4c2fe445ac9b9558063ad26626dbee815d6b798f2236db9f0b0e1a3e4"} Apr 17 16:53:13.978071 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.978039 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5" event={"ID":"c1417668-656a-445d-89b2-53c582a08559","Type":"ContainerDied","Data":"1f276f264b0f95aa5b855b4f1e24902c1d5beca81b871515bcb67fdb18bba7ec"} Apr 17 16:53:13.978071 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.978060 2572 scope.go:117] "RemoveContainer" containerID="d2ffb0a7e61b2046abcf243e436b9e84095ac5372a7428e83fa993a058b79275" Apr 17 16:53:13.979535 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.979512 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" event={"ID":"432dc8ff-e1ad-414b-a836-0f16a600f03e","Type":"ContainerDied","Data":"223697c1e294b645fefdc43904080be84eb3095e30d2076e9cea3946eb61184a"} Apr 17 16:53:13.979535 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.979530 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm" Apr 17 16:53:13.988322 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.988302 2572 scope.go:117] "RemoveContainer" containerID="f9583fe4c2fe445ac9b9558063ad26626dbee815d6b798f2236db9f0b0e1a3e4" Apr 17 16:53:13.995735 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.995719 2572 scope.go:117] "RemoveContainer" containerID="d2ffb0a7e61b2046abcf243e436b9e84095ac5372a7428e83fa993a058b79275" Apr 17 16:53:13.995984 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:53:13.995965 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2ffb0a7e61b2046abcf243e436b9e84095ac5372a7428e83fa993a058b79275\": container with ID starting with d2ffb0a7e61b2046abcf243e436b9e84095ac5372a7428e83fa993a058b79275 not found: ID does not exist" containerID="d2ffb0a7e61b2046abcf243e436b9e84095ac5372a7428e83fa993a058b79275" Apr 17 16:53:13.996048 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.995997 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ffb0a7e61b2046abcf243e436b9e84095ac5372a7428e83fa993a058b79275"} err="failed to get container status \"d2ffb0a7e61b2046abcf243e436b9e84095ac5372a7428e83fa993a058b79275\": rpc error: code = NotFound desc = could not find container \"d2ffb0a7e61b2046abcf243e436b9e84095ac5372a7428e83fa993a058b79275\": container with ID starting with d2ffb0a7e61b2046abcf243e436b9e84095ac5372a7428e83fa993a058b79275 not found: ID does not exist" Apr 17 16:53:13.996048 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.996020 2572 scope.go:117] "RemoveContainer" containerID="f9583fe4c2fe445ac9b9558063ad26626dbee815d6b798f2236db9f0b0e1a3e4" Apr 17 16:53:13.996255 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:53:13.996240 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9583fe4c2fe445ac9b9558063ad26626dbee815d6b798f2236db9f0b0e1a3e4\": container with ID starting with f9583fe4c2fe445ac9b9558063ad26626dbee815d6b798f2236db9f0b0e1a3e4 not found: ID does not exist" containerID="f9583fe4c2fe445ac9b9558063ad26626dbee815d6b798f2236db9f0b0e1a3e4" Apr 17 16:53:13.996290 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.996262 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9583fe4c2fe445ac9b9558063ad26626dbee815d6b798f2236db9f0b0e1a3e4"} err="failed to get container status \"f9583fe4c2fe445ac9b9558063ad26626dbee815d6b798f2236db9f0b0e1a3e4\": rpc error: code = NotFound desc = could not find container \"f9583fe4c2fe445ac9b9558063ad26626dbee815d6b798f2236db9f0b0e1a3e4\": container with ID starting with f9583fe4c2fe445ac9b9558063ad26626dbee815d6b798f2236db9f0b0e1a3e4 not found: ID does not exist" Apr 17 16:53:13.996290 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:13.996277 2572 scope.go:117] "RemoveContainer" containerID="fbc4ca633b639a8179f4d47b7c6130e98da382f682078a038c98d918c88a967b" Apr 17 16:53:14.001667 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:14.001624 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5"] Apr 17 16:53:14.004144 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:14.004123 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5"] Apr 17 16:53:14.005244 ip-10-0-138-137 
kubenswrapper[2572]: I0417 16:53:14.005226 2572 scope.go:117] "RemoveContainer" containerID="eddd6918a40f9814d394aa4e337367964506b3736e75550981aa9d4fe72afbe6" Apr 17 16:53:14.012783 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:14.012765 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm"] Apr 17 16:53:14.016520 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:14.016501 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm"] Apr 17 16:53:15.229408 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:15.229371 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" path="/var/lib/kubelet/pods/432dc8ff-e1ad-414b-a836-0f16a600f03e/volumes" Apr 17 16:53:15.229833 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:15.229818 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1417668-656a-445d-89b2-53c582a08559" path="/var/lib/kubelet/pods/c1417668-656a-445d-89b2-53c582a08559/volumes" Apr 17 16:53:16.559827 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:16.559784 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" podUID="174c6520-09a4-48e5-a2b4-4ed9e1d3f556" containerName="sequence-graph-8b822" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:53:16.973577 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:16.973547 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:53:16.974316 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:16.974281 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 17 16:53:17.978239 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:17.978206 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:53:17.978781 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:17.978751 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 17 16:53:21.559692 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:21.559638 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" podUID="174c6520-09a4-48e5-a2b4-4ed9e1d3f556" containerName="sequence-graph-8b822" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:53:21.560067 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:21.559766 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" Apr 17 16:53:21.855328 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:21.855240 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" 
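The "DeleteContainer returned error" entries above are a benign race: the container had already been removed, so the follow-up status lookup gets a gRPC NotFound from CRI-O. Cleanup code can safely treat that as success; a small sketch using the standard gRPC status helpers:

```go
// A NotFound from the runtime during cleanup means the container is
// already gone, so the delete has effectively succeeded.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

func alreadyGone(err error) bool {
	return status.Code(err) == codes.NotFound
}

func main() {
	// Shape of the runtime error seen in the log above (ID shortened).
	err := status.Error(codes.NotFound, `could not find container "d2ffb0a7..."`)
	if alreadyGone(err) {
		fmt.Println("container already removed; nothing to delete")
	}
}
```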
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 16:53:22.859850 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:22.859819 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 16:53:26.560133 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:26.560049 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" podUID="174c6520-09a4-48e5-a2b4-4ed9e1d3f556" containerName="sequence-graph-8b822" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:53:26.974756 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:26.974722 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 17 16:53:27.979638 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:27.979597 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 17 16:53:31.559318 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:31.559273 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" podUID="174c6520-09a4-48e5-a2b4-4ed9e1d3f556" containerName="sequence-graph-8b822" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:53:31.854864 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:31.854788 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:53:36.559597 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:36.559546 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" podUID="174c6520-09a4-48e5-a2b4-4ed9e1d3f556" containerName="sequence-graph-8b822" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:53:36.975201 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:36.975160 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 17 16:53:37.978987 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:37.978947 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 17 16:53:39.058903 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:39.058867 2572 generic.go:358] "Generic (PLEG): container finished" podID="174c6520-09a4-48e5-a2b4-4ed9e1d3f556" containerID="f6eee4f43fbd2e3471937e1bebdc380068ffd9bc80f12453de0a8566635bcb55" exitCode=0 Apr 17 16:53:39.059242 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:39.058926 2572 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" event={"ID":"174c6520-09a4-48e5-a2b4-4ed9e1d3f556","Type":"ContainerDied","Data":"f6eee4f43fbd2e3471937e1bebdc380068ffd9bc80f12453de0a8566635bcb55"} Apr 17 16:53:39.172342 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:39.172318 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" Apr 17 16:53:39.275362 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:39.275330 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/174c6520-09a4-48e5-a2b4-4ed9e1d3f556-proxy-tls\") pod \"174c6520-09a4-48e5-a2b4-4ed9e1d3f556\" (UID: \"174c6520-09a4-48e5-a2b4-4ed9e1d3f556\") " Apr 17 16:53:39.275550 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:39.275385 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/174c6520-09a4-48e5-a2b4-4ed9e1d3f556-openshift-service-ca-bundle\") pod \"174c6520-09a4-48e5-a2b4-4ed9e1d3f556\" (UID: \"174c6520-09a4-48e5-a2b4-4ed9e1d3f556\") " Apr 17 16:53:39.276130 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:39.275839 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/174c6520-09a4-48e5-a2b4-4ed9e1d3f556-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "174c6520-09a4-48e5-a2b4-4ed9e1d3f556" (UID: "174c6520-09a4-48e5-a2b4-4ed9e1d3f556"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:53:39.276130 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:39.275981 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/174c6520-09a4-48e5-a2b4-4ed9e1d3f556-openshift-service-ca-bundle\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:39.277586 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:39.277559 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/174c6520-09a4-48e5-a2b4-4ed9e1d3f556-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "174c6520-09a4-48e5-a2b4-4ed9e1d3f556" (UID: "174c6520-09a4-48e5-a2b4-4ed9e1d3f556"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:53:39.377228 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:39.377144 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/174c6520-09a4-48e5-a2b4-4ed9e1d3f556-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:40.062545 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:40.062517 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" Apr 17 16:53:40.063006 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:40.062515 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx" event={"ID":"174c6520-09a4-48e5-a2b4-4ed9e1d3f556","Type":"ContainerDied","Data":"5d008f6c8585b62e7fc7b28051081dd489d9fd77d07d98ae59c09837c2510d5d"} Apr 17 16:53:40.063006 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:40.062629 2572 scope.go:117] "RemoveContainer" containerID="f6eee4f43fbd2e3471937e1bebdc380068ffd9bc80f12453de0a8566635bcb55" Apr 17 16:53:40.081983 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:40.081958 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx"] Apr 17 16:53:40.086556 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:40.086532 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx"] Apr 17 16:53:41.231866 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:41.231060 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="174c6520-09a4-48e5-a2b4-4ed9e1d3f556" path="/var/lib/kubelet/pods/174c6520-09a4-48e5-a2b4-4ed9e1d3f556/volumes" Apr 17 16:53:43.531729 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.531639 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf"] Apr 17 16:53:43.532070 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.531992 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kserve-container" Apr 17 16:53:43.532070 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.532004 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kserve-container" Apr 17 16:53:43.532070 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.532015 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kube-rbac-proxy" Apr 17 16:53:43.532070 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.532020 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kube-rbac-proxy" Apr 17 16:53:43.532070 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.532026 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kserve-container" Apr 17 16:53:43.532070 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.532032 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kserve-container" Apr 17 16:53:43.532070 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.532040 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="174c6520-09a4-48e5-a2b4-4ed9e1d3f556" containerName="sequence-graph-8b822" Apr 17 16:53:43.532070 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.532046 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="174c6520-09a4-48e5-a2b4-4ed9e1d3f556" containerName="sequence-graph-8b822" Apr 17 16:53:43.532070 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.532051 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kube-rbac-proxy" Apr 17 
16:53:43.532070 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.532057 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kube-rbac-proxy" Apr 17 16:53:43.532375 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.532107 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kserve-container" Apr 17 16:53:43.532375 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.532116 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kserve-container" Apr 17 16:53:43.532375 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.532124 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="174c6520-09a4-48e5-a2b4-4ed9e1d3f556" containerName="sequence-graph-8b822" Apr 17 16:53:43.532375 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.532131 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="432dc8ff-e1ad-414b-a836-0f16a600f03e" containerName="kube-rbac-proxy" Apr 17 16:53:43.532375 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.532137 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1417668-656a-445d-89b2-53c582a08559" containerName="kube-rbac-proxy" Apr 17 16:53:43.536196 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.536178 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" Apr 17 16:53:43.538606 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.538579 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-ac602-serving-cert\"" Apr 17 16:53:43.538765 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.538677 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-ac602-kube-rbac-proxy-sar-config\"" Apr 17 16:53:43.543766 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.543227 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf"] Apr 17 16:53:43.612037 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.611997 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3c11182-aa87-45d2-af1e-3d53777fccea-openshift-service-ca-bundle\") pod \"ensemble-graph-ac602-7c945b5b88-wwhgf\" (UID: \"b3c11182-aa87-45d2-af1e-3d53777fccea\") " pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" Apr 17 16:53:43.612256 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.612065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3c11182-aa87-45d2-af1e-3d53777fccea-proxy-tls\") pod \"ensemble-graph-ac602-7c945b5b88-wwhgf\" (UID: \"b3c11182-aa87-45d2-af1e-3d53777fccea\") " pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" Apr 17 16:53:43.712854 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.712816 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3c11182-aa87-45d2-af1e-3d53777fccea-openshift-service-ca-bundle\") pod \"ensemble-graph-ac602-7c945b5b88-wwhgf\" (UID: \"b3c11182-aa87-45d2-af1e-3d53777fccea\") " 
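Before admitting the new ensemble-graph pod, the CPU and memory managers drop bookkeeping for containers of pods that no longer exist (the RemoveStaleState and "Deleted CPUSet assignment" entries above). In effect the state is a map keyed by pod UID and container name, pruned against the set of live pods; a toy Go sketch of that idea (types and values are invented for illustration):

```go
// Toy model of stale-state pruning: forget per-container resource
// assignments whose pod UID is no longer present on the node.
package main

import "fmt"

type key struct{ podUID, container string }

func removeStaleState(assignments map[key]string, livePods map[string]bool) {
	for k := range assignments {
		if !livePods[k.podUID] {
			fmt.Printf("removing stale state for pod %s container %s\n", k.podUID, k.container)
			delete(assignments, k) // deleting during range is safe in Go
		}
	}
}

func main() {
	assignments := map[key]string{
		{"432dc8ff-e1ad-414b-a836-0f16a600f03e", "kserve-container"}:     "placeholder assignment",
		{"b3c11182-aa87-45d2-af1e-3d53777fccea", "ensemble-graph-ac602"}: "placeholder assignment",
	}
	live := map[string]bool{"b3c11182-aa87-45d2-af1e-3d53777fccea": true}
	removeStaleState(assignments, live)
	fmt.Println(len(assignments), "assignment(s) remain")
}
```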
pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" Apr 17 16:53:43.713026 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.712871 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3c11182-aa87-45d2-af1e-3d53777fccea-proxy-tls\") pod \"ensemble-graph-ac602-7c945b5b88-wwhgf\" (UID: \"b3c11182-aa87-45d2-af1e-3d53777fccea\") " pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" Apr 17 16:53:43.713465 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.713443 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3c11182-aa87-45d2-af1e-3d53777fccea-openshift-service-ca-bundle\") pod \"ensemble-graph-ac602-7c945b5b88-wwhgf\" (UID: \"b3c11182-aa87-45d2-af1e-3d53777fccea\") " pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" Apr 17 16:53:43.715215 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.715195 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3c11182-aa87-45d2-af1e-3d53777fccea-proxy-tls\") pod \"ensemble-graph-ac602-7c945b5b88-wwhgf\" (UID: \"b3c11182-aa87-45d2-af1e-3d53777fccea\") " pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" Apr 17 16:53:43.848242 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.848153 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" Apr 17 16:53:43.965286 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:43.965253 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf"] Apr 17 16:53:43.967845 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:53:43.967818 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3c11182_aa87_45d2_af1e_3d53777fccea.slice/crio-27d8c07bf475b529771044ed6f77a7f55ee0f8a3e503a56aa95f419f234248f8 WatchSource:0}: Error finding container 27d8c07bf475b529771044ed6f77a7f55ee0f8a3e503a56aa95f419f234248f8: Status 404 returned error can't find the container with id 27d8c07bf475b529771044ed6f77a7f55ee0f8a3e503a56aa95f419f234248f8 Apr 17 16:53:44.075033 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:44.074988 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" event={"ID":"b3c11182-aa87-45d2-af1e-3d53777fccea","Type":"ContainerStarted","Data":"23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695"} Apr 17 16:53:44.075033 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:44.075039 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" event={"ID":"b3c11182-aa87-45d2-af1e-3d53777fccea","Type":"ContainerStarted","Data":"27d8c07bf475b529771044ed6f77a7f55ee0f8a3e503a56aa95f419f234248f8"} Apr 17 16:53:44.075262 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:44.075072 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" Apr 17 16:53:44.090853 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:44.090803 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" podStartSLOduration=1.090787416 podStartE2EDuration="1.090787416s" 
podCreationTimestamp="2026-04-17 16:53:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:53:44.089050798 +0000 UTC m=+1357.437680350" watchObservedRunningTime="2026-04-17 16:53:44.090787416 +0000 UTC m=+1357.439416965" Apr 17 16:53:46.974607 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:46.974570 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 17 16:53:47.979583 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:47.979547 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 17 16:53:50.084493 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:50.084464 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" Apr 17 16:53:53.585542 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.585509 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf"] Apr 17 16:53:53.586013 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.585745 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" podUID="b3c11182-aa87-45d2-af1e-3d53777fccea" containerName="ensemble-graph-ac602" containerID="cri-o://23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695" gracePeriod=30 Apr 17 16:53:53.691217 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.691185 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b"] Apr 17 16:53:53.691510 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.691466 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kserve-container" containerID="cri-o://fc19769a475bc1a95cb835656deb9169e08fe12d8c3ea02dbc6133c01266541c" gracePeriod=30 Apr 17 16:53:53.691590 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.691513 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kube-rbac-proxy" containerID="cri-o://65a4893582a220b13579dd6f7dd6e902a7f098320f96bf7b7c70923e1cc3bdc2" gracePeriod=30 Apr 17 16:53:53.726510 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.726476 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6"] Apr 17 16:53:53.731629 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.731605 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:53:53.734069 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.734049 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-c68dd-predictor-serving-cert\"" Apr 17 16:53:53.734232 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.734106 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-c68dd-kube-rbac-proxy-sar-config\"" Apr 17 16:53:53.740263 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.740238 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6"] Apr 17 16:53:53.762241 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.762215 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w"] Apr 17 16:53:53.762508 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.762483 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kserve-container" containerID="cri-o://e164a07a835b7daee078a4099c7d65b9217795de873e396ba98e6849cd0ea777" gracePeriod=30 Apr 17 16:53:53.762508 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.762497 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kube-rbac-proxy" containerID="cri-o://20a34d38a20b55b791d2d77de49f43922f034e658a7801268bedd29dd4aadf4f" gracePeriod=30 Apr 17 16:53:53.830764 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.830733 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq"] Apr 17 16:53:53.833982 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.833962 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:53:53.836239 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.836185 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-c68dd-kube-rbac-proxy-sar-config\"" Apr 17 16:53:53.836239 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.836184 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-c68dd-predictor-serving-cert\"" Apr 17 16:53:53.844217 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.844188 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq"] Apr 17 16:53:53.900902 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.900871 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-c68dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b10bfd64-2fd9-43bd-9507-303d487de5a1-success-200-isvc-c68dd-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c68dd-predictor-57f664bd48-xzcc6\" (UID: \"b10bfd64-2fd9-43bd-9507-303d487de5a1\") " pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:53:53.901027 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.900919 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b10bfd64-2fd9-43bd-9507-303d487de5a1-proxy-tls\") pod \"success-200-isvc-c68dd-predictor-57f664bd48-xzcc6\" (UID: \"b10bfd64-2fd9-43bd-9507-303d487de5a1\") " pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:53:53.901027 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:53.900986 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbqw2\" (UniqueName: \"kubernetes.io/projected/b10bfd64-2fd9-43bd-9507-303d487de5a1-kube-api-access-wbqw2\") pod \"success-200-isvc-c68dd-predictor-57f664bd48-xzcc6\" (UID: \"b10bfd64-2fd9-43bd-9507-303d487de5a1\") " pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:53:54.001678 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.001609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbqw2\" (UniqueName: \"kubernetes.io/projected/b10bfd64-2fd9-43bd-9507-303d487de5a1-kube-api-access-wbqw2\") pod \"success-200-isvc-c68dd-predictor-57f664bd48-xzcc6\" (UID: \"b10bfd64-2fd9-43bd-9507-303d487de5a1\") " pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:53:54.001678 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.001675 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7scb\" (UniqueName: \"kubernetes.io/projected/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-kube-api-access-w7scb\") pod \"error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq\" (UID: \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\") " pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:53:54.001934 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.001742 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-proxy-tls\") pod 
\"error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq\" (UID: \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\") " pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:53:54.001934 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.001852 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-c68dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b10bfd64-2fd9-43bd-9507-303d487de5a1-success-200-isvc-c68dd-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c68dd-predictor-57f664bd48-xzcc6\" (UID: \"b10bfd64-2fd9-43bd-9507-303d487de5a1\") " pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:53:54.001934 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.001905 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b10bfd64-2fd9-43bd-9507-303d487de5a1-proxy-tls\") pod \"success-200-isvc-c68dd-predictor-57f664bd48-xzcc6\" (UID: \"b10bfd64-2fd9-43bd-9507-303d487de5a1\") " pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:53:54.002099 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.001942 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-c68dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-error-404-isvc-c68dd-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq\" (UID: \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\") " pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:53:54.002561 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.002537 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-c68dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b10bfd64-2fd9-43bd-9507-303d487de5a1-success-200-isvc-c68dd-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c68dd-predictor-57f664bd48-xzcc6\" (UID: \"b10bfd64-2fd9-43bd-9507-303d487de5a1\") " pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:53:54.004929 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.004906 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b10bfd64-2fd9-43bd-9507-303d487de5a1-proxy-tls\") pod \"success-200-isvc-c68dd-predictor-57f664bd48-xzcc6\" (UID: \"b10bfd64-2fd9-43bd-9507-303d487de5a1\") " pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:53:54.009892 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.009867 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbqw2\" (UniqueName: \"kubernetes.io/projected/b10bfd64-2fd9-43bd-9507-303d487de5a1-kube-api-access-wbqw2\") pod \"success-200-isvc-c68dd-predictor-57f664bd48-xzcc6\" (UID: \"b10bfd64-2fd9-43bd-9507-303d487de5a1\") " pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:53:54.048337 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.048305 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:53:54.102572 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.102540 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7scb\" (UniqueName: \"kubernetes.io/projected/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-kube-api-access-w7scb\") pod \"error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq\" (UID: \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\") " pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:53:54.102737 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.102605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-proxy-tls\") pod \"error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq\" (UID: \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\") " pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:53:54.102737 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.102696 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-c68dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-error-404-isvc-c68dd-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq\" (UID: \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\") " pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:53:54.103273 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.103241 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-c68dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-error-404-isvc-c68dd-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq\" (UID: \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\") " pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:53:54.105585 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.105520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-proxy-tls\") pod \"error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq\" (UID: \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\") " pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:53:54.109596 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.109559 2572 generic.go:358] "Generic (PLEG): container finished" podID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerID="20a34d38a20b55b791d2d77de49f43922f034e658a7801268bedd29dd4aadf4f" exitCode=2 Apr 17 16:53:54.109727 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.109600 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" event={"ID":"2fd4eaf4-3bae-47ab-853c-dc736ae40134","Type":"ContainerDied","Data":"20a34d38a20b55b791d2d77de49f43922f034e658a7801268bedd29dd4aadf4f"} Apr 17 16:53:54.111745 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.111587 2572 generic.go:358] "Generic (PLEG): container finished" podID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerID="65a4893582a220b13579dd6f7dd6e902a7f098320f96bf7b7c70923e1cc3bdc2" exitCode=2 Apr 17 16:53:54.111745 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.111694 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" event={"ID":"8eef08cd-81aa-4be2-9abc-b72a91020ae2","Type":"ContainerDied","Data":"65a4893582a220b13579dd6f7dd6e902a7f098320f96bf7b7c70923e1cc3bdc2"} Apr 17 16:53:54.111911 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.111875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7scb\" (UniqueName: \"kubernetes.io/projected/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-kube-api-access-w7scb\") pod \"error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq\" (UID: \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\") " pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:53:54.147364 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.147325 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:53:54.176821 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.176795 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6"] Apr 17 16:53:54.179746 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:53:54.179715 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb10bfd64_2fd9_43bd_9507_303d487de5a1.slice/crio-ba2126324612064d940a558216f12eb2d011cdbf38d67d71fe16839aec9f59b5 WatchSource:0}: Error finding container ba2126324612064d940a558216f12eb2d011cdbf38d67d71fe16839aec9f59b5: Status 404 returned error can't find the container with id ba2126324612064d940a558216f12eb2d011cdbf38d67d71fe16839aec9f59b5 Apr 17 16:53:54.278894 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:54.278868 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq"] Apr 17 16:53:54.295899 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:53:54.295871 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd7ff5e_fc98_4023_8ffc_c410bedc4ae1.slice/crio-f496bf844e1ebc30d9b761f90373f776d6d8e62951efac19c72dfc77f5c21031 WatchSource:0}: Error finding container f496bf844e1ebc30d9b761f90373f776d6d8e62951efac19c72dfc77f5c21031: Status 404 returned error can't find the container with id f496bf844e1ebc30d9b761f90373f776d6d8e62951efac19c72dfc77f5c21031 Apr 17 16:53:55.082527 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:55.082480 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" podUID="b3c11182-aa87-45d2-af1e-3d53777fccea" containerName="ensemble-graph-ac602" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:53:55.118388 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:55.118353 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" event={"ID":"b10bfd64-2fd9-43bd-9507-303d487de5a1","Type":"ContainerStarted","Data":"27800c5e3088762427db1057d8a7c11536d7f11863973cce20e402ffac6e24ff"} Apr 17 16:53:55.118561 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:55.118395 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" event={"ID":"b10bfd64-2fd9-43bd-9507-303d487de5a1","Type":"ContainerStarted","Data":"0ce2802119bdfaef0c1f8a8614f61a13c2a94405f9a786e7d99a032de4e33f6e"} Apr 17 
16:53:55.118561 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:55.118410 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" event={"ID":"b10bfd64-2fd9-43bd-9507-303d487de5a1","Type":"ContainerStarted","Data":"ba2126324612064d940a558216f12eb2d011cdbf38d67d71fe16839aec9f59b5"} Apr 17 16:53:55.118561 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:55.118488 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:53:55.122214 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:55.120252 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" event={"ID":"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1","Type":"ContainerStarted","Data":"e84628da16b08f9f65fb787a7dd44bf1ce516c73d3cd7416064fbb97c6c254a7"} Apr 17 16:53:55.122214 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:55.120286 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" event={"ID":"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1","Type":"ContainerStarted","Data":"1b5cfed0c30b3425e49c23e3336a67241f99b2f06a521c289b58aeaf0091a43e"} Apr 17 16:53:55.122214 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:55.120299 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" event={"ID":"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1","Type":"ContainerStarted","Data":"f496bf844e1ebc30d9b761f90373f776d6d8e62951efac19c72dfc77f5c21031"} Apr 17 16:53:55.122214 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:55.120875 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:53:55.123679 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:55.122622 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 17 16:53:55.123890 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:55.123861 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:53:55.136617 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:55.136569 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" podStartSLOduration=2.136555003 podStartE2EDuration="2.136555003s" podCreationTimestamp="2026-04-17 16:53:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:53:55.135133722 +0000 UTC m=+1368.483763273" watchObservedRunningTime="2026-04-17 16:53:55.136555003 +0000 UTC m=+1368.485184552" Apr 17 16:53:55.152275 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:55.152214 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" podStartSLOduration=2.152199289 podStartE2EDuration="2.152199289s" podCreationTimestamp="2026-04-17 16:53:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:53:55.151162353 +0000 UTC m=+1368.499791901" watchObservedRunningTime="2026-04-17 16:53:55.152199289 +0000 UTC m=+1368.500828838" Apr 17 16:53:56.126153 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:56.126125 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:53:56.126724 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:56.126695 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 17 16:53:56.127628 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:56.127599 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 17 16:53:56.850090 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:56.850055 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.31:8643/healthz\": dial tcp 10.132.0.31:8643: connect: connection refused" Apr 17 16:53:56.974247 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:56.974210 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 17 16:53:57.129281 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:57.129191 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 17 16:53:57.129281 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:57.129265 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 17 16:53:57.452746 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:57.452722 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:53:57.632347 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:57.632309 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8eef08cd-81aa-4be2-9abc-b72a91020ae2-proxy-tls\") pod \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\" (UID: \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\") " Apr 17 16:53:57.632516 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:57.632402 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-ac602-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8eef08cd-81aa-4be2-9abc-b72a91020ae2-success-200-isvc-ac602-kube-rbac-proxy-sar-config\") pod \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\" (UID: \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\") " Apr 17 16:53:57.632516 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:57.632436 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzpbx\" (UniqueName: \"kubernetes.io/projected/8eef08cd-81aa-4be2-9abc-b72a91020ae2-kube-api-access-qzpbx\") pod \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\" (UID: \"8eef08cd-81aa-4be2-9abc-b72a91020ae2\") " Apr 17 16:53:57.632834 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:57.632794 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eef08cd-81aa-4be2-9abc-b72a91020ae2-success-200-isvc-ac602-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-ac602-kube-rbac-proxy-sar-config") pod "8eef08cd-81aa-4be2-9abc-b72a91020ae2" (UID: "8eef08cd-81aa-4be2-9abc-b72a91020ae2"). InnerVolumeSpecName "success-200-isvc-ac602-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:53:57.634477 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:57.634454 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eef08cd-81aa-4be2-9abc-b72a91020ae2-kube-api-access-qzpbx" (OuterVolumeSpecName: "kube-api-access-qzpbx") pod "8eef08cd-81aa-4be2-9abc-b72a91020ae2" (UID: "8eef08cd-81aa-4be2-9abc-b72a91020ae2"). InnerVolumeSpecName "kube-api-access-qzpbx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:53:57.634555 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:57.634534 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eef08cd-81aa-4be2-9abc-b72a91020ae2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8eef08cd-81aa-4be2-9abc-b72a91020ae2" (UID: "8eef08cd-81aa-4be2-9abc-b72a91020ae2"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:53:57.733611 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:57.733580 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8eef08cd-81aa-4be2-9abc-b72a91020ae2-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:57.733611 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:57.733611 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-ac602-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8eef08cd-81aa-4be2-9abc-b72a91020ae2-success-200-isvc-ac602-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:57.733777 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:57.733622 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qzpbx\" (UniqueName: \"kubernetes.io/projected/8eef08cd-81aa-4be2-9abc-b72a91020ae2-kube-api-access-qzpbx\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:57.891058 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:57.891032 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 16:53:57.979077 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:57.978988 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 17 16:53:58.035041 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.035006 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fd4eaf4-3bae-47ab-853c-dc736ae40134-proxy-tls\") pod \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\" (UID: \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\") " Apr 17 16:53:58.035196 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.035143 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw76q\" (UniqueName: \"kubernetes.io/projected/2fd4eaf4-3bae-47ab-853c-dc736ae40134-kube-api-access-zw76q\") pod \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\" (UID: \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\") " Apr 17 16:53:58.035238 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.035191 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-ac602-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fd4eaf4-3bae-47ab-853c-dc736ae40134-error-404-isvc-ac602-kube-rbac-proxy-sar-config\") pod \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\" (UID: \"2fd4eaf4-3bae-47ab-853c-dc736ae40134\") " Apr 17 16:53:58.035587 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.035562 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fd4eaf4-3bae-47ab-853c-dc736ae40134-error-404-isvc-ac602-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-ac602-kube-rbac-proxy-sar-config") pod "2fd4eaf4-3bae-47ab-853c-dc736ae40134" (UID: "2fd4eaf4-3bae-47ab-853c-dc736ae40134"). InnerVolumeSpecName "error-404-isvc-ac602-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:53:58.037099 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.037071 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd4eaf4-3bae-47ab-853c-dc736ae40134-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2fd4eaf4-3bae-47ab-853c-dc736ae40134" (UID: "2fd4eaf4-3bae-47ab-853c-dc736ae40134"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:53:58.037188 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.037113 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fd4eaf4-3bae-47ab-853c-dc736ae40134-kube-api-access-zw76q" (OuterVolumeSpecName: "kube-api-access-zw76q") pod "2fd4eaf4-3bae-47ab-853c-dc736ae40134" (UID: "2fd4eaf4-3bae-47ab-853c-dc736ae40134"). InnerVolumeSpecName "kube-api-access-zw76q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:53:58.133485 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.133450 2572 generic.go:358] "Generic (PLEG): container finished" podID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerID="e164a07a835b7daee078a4099c7d65b9217795de873e396ba98e6849cd0ea777" exitCode=0 Apr 17 16:53:58.133889 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.133520 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" Apr 17 16:53:58.133889 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.133527 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" event={"ID":"2fd4eaf4-3bae-47ab-853c-dc736ae40134","Type":"ContainerDied","Data":"e164a07a835b7daee078a4099c7d65b9217795de873e396ba98e6849cd0ea777"} Apr 17 16:53:58.133889 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.133563 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" event={"ID":"2fd4eaf4-3bae-47ab-853c-dc736ae40134","Type":"ContainerDied","Data":"5aec054d52a9f44bc8b494eb3d41ea06cc11f536ceabc5d9c933b458c1e50268"} Apr 17 16:53:58.133889 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.133584 2572 scope.go:117] "RemoveContainer" containerID="20a34d38a20b55b791d2d77de49f43922f034e658a7801268bedd29dd4aadf4f" Apr 17 16:53:58.135131 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.135104 2572 generic.go:358] "Generic (PLEG): container finished" podID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerID="fc19769a475bc1a95cb835656deb9169e08fe12d8c3ea02dbc6133c01266541c" exitCode=0 Apr 17 16:53:58.135224 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.135168 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" event={"ID":"8eef08cd-81aa-4be2-9abc-b72a91020ae2","Type":"ContainerDied","Data":"fc19769a475bc1a95cb835656deb9169e08fe12d8c3ea02dbc6133c01266541c"} Apr 17 16:53:58.135224 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.135194 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" Apr 17 16:53:58.135302 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.135197 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b" event={"ID":"8eef08cd-81aa-4be2-9abc-b72a91020ae2","Type":"ContainerDied","Data":"944d82d1b0c06d05d85748ae20efa1fa8e1816d6f72a57ed7368a0e2d77e4823"} Apr 17 16:53:58.135894 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.135878 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zw76q\" (UniqueName: \"kubernetes.io/projected/2fd4eaf4-3bae-47ab-853c-dc736ae40134-kube-api-access-zw76q\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:58.135944 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.135899 2572 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-ac602-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fd4eaf4-3bae-47ab-853c-dc736ae40134-error-404-isvc-ac602-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:58.135944 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.135912 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fd4eaf4-3bae-47ab-853c-dc736ae40134-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:53:58.143366 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.143351 2572 scope.go:117] "RemoveContainer" containerID="e164a07a835b7daee078a4099c7d65b9217795de873e396ba98e6849cd0ea777" Apr 17 16:53:58.150212 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.150190 2572 scope.go:117] "RemoveContainer" containerID="20a34d38a20b55b791d2d77de49f43922f034e658a7801268bedd29dd4aadf4f" Apr 17 16:53:58.150439 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:53:58.150421 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20a34d38a20b55b791d2d77de49f43922f034e658a7801268bedd29dd4aadf4f\": container with ID starting with 20a34d38a20b55b791d2d77de49f43922f034e658a7801268bedd29dd4aadf4f not found: ID does not exist" containerID="20a34d38a20b55b791d2d77de49f43922f034e658a7801268bedd29dd4aadf4f" Apr 17 16:53:58.150488 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.150448 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a34d38a20b55b791d2d77de49f43922f034e658a7801268bedd29dd4aadf4f"} err="failed to get container status \"20a34d38a20b55b791d2d77de49f43922f034e658a7801268bedd29dd4aadf4f\": rpc error: code = NotFound desc = could not find container \"20a34d38a20b55b791d2d77de49f43922f034e658a7801268bedd29dd4aadf4f\": container with ID starting with 20a34d38a20b55b791d2d77de49f43922f034e658a7801268bedd29dd4aadf4f not found: ID does not exist" Apr 17 16:53:58.150488 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.150465 2572 scope.go:117] "RemoveContainer" containerID="e164a07a835b7daee078a4099c7d65b9217795de873e396ba98e6849cd0ea777" Apr 17 16:53:58.150709 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:53:58.150689 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e164a07a835b7daee078a4099c7d65b9217795de873e396ba98e6849cd0ea777\": container with ID starting with e164a07a835b7daee078a4099c7d65b9217795de873e396ba98e6849cd0ea777 not found: ID does not 
exist" containerID="e164a07a835b7daee078a4099c7d65b9217795de873e396ba98e6849cd0ea777" Apr 17 16:53:58.150751 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.150717 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e164a07a835b7daee078a4099c7d65b9217795de873e396ba98e6849cd0ea777"} err="failed to get container status \"e164a07a835b7daee078a4099c7d65b9217795de873e396ba98e6849cd0ea777\": rpc error: code = NotFound desc = could not find container \"e164a07a835b7daee078a4099c7d65b9217795de873e396ba98e6849cd0ea777\": container with ID starting with e164a07a835b7daee078a4099c7d65b9217795de873e396ba98e6849cd0ea777 not found: ID does not exist" Apr 17 16:53:58.150751 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.150738 2572 scope.go:117] "RemoveContainer" containerID="65a4893582a220b13579dd6f7dd6e902a7f098320f96bf7b7c70923e1cc3bdc2" Apr 17 16:53:58.155709 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.155690 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w"] Apr 17 16:53:58.158226 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.158185 2572 scope.go:117] "RemoveContainer" containerID="fc19769a475bc1a95cb835656deb9169e08fe12d8c3ea02dbc6133c01266541c" Apr 17 16:53:58.159934 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.159915 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w"] Apr 17 16:53:58.165084 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.165067 2572 scope.go:117] "RemoveContainer" containerID="65a4893582a220b13579dd6f7dd6e902a7f098320f96bf7b7c70923e1cc3bdc2" Apr 17 16:53:58.165337 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:53:58.165318 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a4893582a220b13579dd6f7dd6e902a7f098320f96bf7b7c70923e1cc3bdc2\": container with ID starting with 65a4893582a220b13579dd6f7dd6e902a7f098320f96bf7b7c70923e1cc3bdc2 not found: ID does not exist" containerID="65a4893582a220b13579dd6f7dd6e902a7f098320f96bf7b7c70923e1cc3bdc2" Apr 17 16:53:58.165389 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.165342 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65a4893582a220b13579dd6f7dd6e902a7f098320f96bf7b7c70923e1cc3bdc2"} err="failed to get container status \"65a4893582a220b13579dd6f7dd6e902a7f098320f96bf7b7c70923e1cc3bdc2\": rpc error: code = NotFound desc = could not find container \"65a4893582a220b13579dd6f7dd6e902a7f098320f96bf7b7c70923e1cc3bdc2\": container with ID starting with 65a4893582a220b13579dd6f7dd6e902a7f098320f96bf7b7c70923e1cc3bdc2 not found: ID does not exist" Apr 17 16:53:58.165389 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.165360 2572 scope.go:117] "RemoveContainer" containerID="fc19769a475bc1a95cb835656deb9169e08fe12d8c3ea02dbc6133c01266541c" Apr 17 16:53:58.165565 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:53:58.165548 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc19769a475bc1a95cb835656deb9169e08fe12d8c3ea02dbc6133c01266541c\": container with ID starting with fc19769a475bc1a95cb835656deb9169e08fe12d8c3ea02dbc6133c01266541c not found: ID does not exist" containerID="fc19769a475bc1a95cb835656deb9169e08fe12d8c3ea02dbc6133c01266541c" Apr 17 16:53:58.165604 ip-10-0-138-137 kubenswrapper[2572]: 
I0417 16:53:58.165568 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc19769a475bc1a95cb835656deb9169e08fe12d8c3ea02dbc6133c01266541c"} err="failed to get container status \"fc19769a475bc1a95cb835656deb9169e08fe12d8c3ea02dbc6133c01266541c\": rpc error: code = NotFound desc = could not find container \"fc19769a475bc1a95cb835656deb9169e08fe12d8c3ea02dbc6133c01266541c\": container with ID starting with fc19769a475bc1a95cb835656deb9169e08fe12d8c3ea02dbc6133c01266541c not found: ID does not exist" Apr 17 16:53:58.170040 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.170020 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b"] Apr 17 16:53:58.173881 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.173861 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b"] Apr 17 16:53:58.855577 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:58.855527 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.32:8643/healthz\": context deadline exceeded" Apr 17 16:53:59.227602 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:59.227561 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" path="/var/lib/kubelet/pods/2fd4eaf4-3bae-47ab-853c-dc736ae40134/volumes" Apr 17 16:53:59.228237 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:53:59.228211 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" path="/var/lib/kubelet/pods/8eef08cd-81aa-4be2-9abc-b72a91020ae2/volumes" Apr 17 16:54:00.082631 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:00.082588 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" podUID="b3c11182-aa87-45d2-af1e-3d53777fccea" containerName="ensemble-graph-ac602" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:54:02.133557 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:02.133523 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:54:02.134012 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:02.133915 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:54:02.134012 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:02.133958 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 17 16:54:02.134382 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:02.134359 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 17 16:54:05.083036 ip-10-0-138-137 kubenswrapper[2572]: I0417 
16:54:05.082998 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" podUID="b3c11182-aa87-45d2-af1e-3d53777fccea" containerName="ensemble-graph-ac602" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:54:05.083484 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:05.083132 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" Apr 17 16:54:06.975764 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:06.975727 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:54:07.979728 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:07.979698 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:54:10.082796 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:10.082753 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" podUID="b3c11182-aa87-45d2-af1e-3d53777fccea" containerName="ensemble-graph-ac602" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:54:12.134497 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:12.134454 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 17 16:54:12.134902 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:12.134462 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 17 16:54:15.082630 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:15.082589 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" podUID="b3c11182-aa87-45d2-af1e-3d53777fccea" containerName="ensemble-graph-ac602" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:54:19.206256 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.206220 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b"] Apr 17 16:54:19.206692 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.206675 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kube-rbac-proxy" Apr 17 16:54:19.206692 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.206693 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kube-rbac-proxy" Apr 17 16:54:19.206782 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.206703 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kserve-container" Apr 17 16:54:19.206782 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.206709 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kserve-container" Apr 17 
16:54:19.206782 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.206715 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kserve-container" Apr 17 16:54:19.206782 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.206721 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kserve-container" Apr 17 16:54:19.206782 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.206733 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kube-rbac-proxy" Apr 17 16:54:19.206782 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.206738 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kube-rbac-proxy" Apr 17 16:54:19.206956 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.206790 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kube-rbac-proxy" Apr 17 16:54:19.206956 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.206801 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fd4eaf4-3bae-47ab-853c-dc736ae40134" containerName="kserve-container" Apr 17 16:54:19.206956 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.206807 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kserve-container" Apr 17 16:54:19.206956 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.206814 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8eef08cd-81aa-4be2-9abc-b72a91020ae2" containerName="kube-rbac-proxy" Apr 17 16:54:19.211102 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.211086 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" Apr 17 16:54:19.213247 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.213229 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-d0d22-serving-cert\"" Apr 17 16:54:19.213247 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.213241 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-d0d22-kube-rbac-proxy-sar-config\"" Apr 17 16:54:19.219421 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.219399 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b"] Apr 17 16:54:19.318509 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.318476 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c296e6d-7831-43eb-84f7-bfb00b09fee8-openshift-service-ca-bundle\") pod \"sequence-graph-d0d22-77bb574b5d-lkw4b\" (UID: \"7c296e6d-7831-43eb-84f7-bfb00b09fee8\") " pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" Apr 17 16:54:19.318509 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.318515 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c296e6d-7831-43eb-84f7-bfb00b09fee8-proxy-tls\") pod \"sequence-graph-d0d22-77bb574b5d-lkw4b\" (UID: \"7c296e6d-7831-43eb-84f7-bfb00b09fee8\") " pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" Apr 17 16:54:19.419959 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.419921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c296e6d-7831-43eb-84f7-bfb00b09fee8-proxy-tls\") pod \"sequence-graph-d0d22-77bb574b5d-lkw4b\" (UID: \"7c296e6d-7831-43eb-84f7-bfb00b09fee8\") " pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" Apr 17 16:54:19.420139 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.420042 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c296e6d-7831-43eb-84f7-bfb00b09fee8-openshift-service-ca-bundle\") pod \"sequence-graph-d0d22-77bb574b5d-lkw4b\" (UID: \"7c296e6d-7831-43eb-84f7-bfb00b09fee8\") " pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" Apr 17 16:54:19.420636 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.420611 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c296e6d-7831-43eb-84f7-bfb00b09fee8-openshift-service-ca-bundle\") pod \"sequence-graph-d0d22-77bb574b5d-lkw4b\" (UID: \"7c296e6d-7831-43eb-84f7-bfb00b09fee8\") " pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" Apr 17 16:54:19.422399 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.422369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c296e6d-7831-43eb-84f7-bfb00b09fee8-proxy-tls\") pod \"sequence-graph-d0d22-77bb574b5d-lkw4b\" (UID: \"7c296e6d-7831-43eb-84f7-bfb00b09fee8\") " pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" Apr 17 16:54:19.524089 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.524002 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" Apr 17 16:54:19.649324 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:19.649298 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b"] Apr 17 16:54:20.082502 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:20.082468 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" podUID="b3c11182-aa87-45d2-af1e-3d53777fccea" containerName="ensemble-graph-ac602" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:54:20.207568 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:20.207536 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" event={"ID":"7c296e6d-7831-43eb-84f7-bfb00b09fee8","Type":"ContainerStarted","Data":"8b45e5c5bb4c584219cc9129bf0f334974000c6ef68c28ea9ccac4077277046e"} Apr 17 16:54:20.207568 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:20.207571 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" event={"ID":"7c296e6d-7831-43eb-84f7-bfb00b09fee8","Type":"ContainerStarted","Data":"eda71c8c9a762cf84331337f7ed73a38dd4e34595658205f0e5edb2800111d07"} Apr 17 16:54:20.207987 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:20.207596 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" Apr 17 16:54:20.224904 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:20.224813 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" podStartSLOduration=1.224800091 podStartE2EDuration="1.224800091s" podCreationTimestamp="2026-04-17 16:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:54:20.22461234 +0000 UTC m=+1393.573241894" watchObservedRunningTime="2026-04-17 16:54:20.224800091 +0000 UTC m=+1393.573429639" Apr 17 16:54:22.133984 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:22.133948 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 17 16:54:22.134359 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:22.134339 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 17 16:54:23.645096 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:54:23.645063 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3c11182_aa87_45d2_af1e_3d53777fccea.slice/crio-23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3c11182_aa87_45d2_af1e_3d53777fccea.slice/crio-conmon-23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695.scope\": RecentStats: unable to find data in memory cache]" Apr 17 16:54:23.645470 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:54:23.645061 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3c11182_aa87_45d2_af1e_3d53777fccea.slice/crio-23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3c11182_aa87_45d2_af1e_3d53777fccea.slice/crio-27d8c07bf475b529771044ed6f77a7f55ee0f8a3e503a56aa95f419f234248f8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3c11182_aa87_45d2_af1e_3d53777fccea.slice/crio-conmon-23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695.scope\": RecentStats: unable to find data in memory cache]" Apr 17 16:54:23.774825 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:23.774795 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" Apr 17 16:54:23.858053 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:23.857960 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3c11182-aa87-45d2-af1e-3d53777fccea-openshift-service-ca-bundle\") pod \"b3c11182-aa87-45d2-af1e-3d53777fccea\" (UID: \"b3c11182-aa87-45d2-af1e-3d53777fccea\") " Apr 17 16:54:23.858053 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:23.858015 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3c11182-aa87-45d2-af1e-3d53777fccea-proxy-tls\") pod \"b3c11182-aa87-45d2-af1e-3d53777fccea\" (UID: \"b3c11182-aa87-45d2-af1e-3d53777fccea\") " Apr 17 16:54:23.858395 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:23.858365 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3c11182-aa87-45d2-af1e-3d53777fccea-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b3c11182-aa87-45d2-af1e-3d53777fccea" (UID: "b3c11182-aa87-45d2-af1e-3d53777fccea"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:54:23.860329 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:23.860305 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3c11182-aa87-45d2-af1e-3d53777fccea-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b3c11182-aa87-45d2-af1e-3d53777fccea" (UID: "b3c11182-aa87-45d2-af1e-3d53777fccea"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:54:23.959159 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:23.959128 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3c11182-aa87-45d2-af1e-3d53777fccea-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:54:23.959159 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:23.959157 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3c11182-aa87-45d2-af1e-3d53777fccea-openshift-service-ca-bundle\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:54:24.219963 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:24.219920 2572 generic.go:358] "Generic (PLEG): container finished" podID="b3c11182-aa87-45d2-af1e-3d53777fccea" containerID="23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695" exitCode=137 Apr 17 16:54:24.220154 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:24.219995 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" Apr 17 16:54:24.220154 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:24.219997 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" event={"ID":"b3c11182-aa87-45d2-af1e-3d53777fccea","Type":"ContainerDied","Data":"23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695"} Apr 17 16:54:24.220154 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:24.220038 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf" event={"ID":"b3c11182-aa87-45d2-af1e-3d53777fccea","Type":"ContainerDied","Data":"27d8c07bf475b529771044ed6f77a7f55ee0f8a3e503a56aa95f419f234248f8"} Apr 17 16:54:24.220154 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:24.220052 2572 scope.go:117] "RemoveContainer" containerID="23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695" Apr 17 16:54:24.228480 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:24.228385 2572 scope.go:117] "RemoveContainer" containerID="23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695" Apr 17 16:54:24.228914 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:54:24.228890 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695\": container with ID starting with 23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695 not found: ID does not exist" containerID="23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695" Apr 17 16:54:24.229004 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:24.228920 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695"} err="failed to get container status \"23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695\": rpc error: code = NotFound desc = could not find container \"23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695\": container with ID starting with 23ad4224df0092d0a8cfc2960705f4f2311d06aaa97471a14ed9ddb6bdbf7695 not found: ID does not exist" Apr 17 16:54:24.240463 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:24.240441 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf"] Apr 17 16:54:24.244767 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:24.244736 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf"] Apr 17 16:54:25.227743 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:25.227709 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3c11182-aa87-45d2-af1e-3d53777fccea" path="/var/lib/kubelet/pods/b3c11182-aa87-45d2-af1e-3d53777fccea/volumes" Apr 17 16:54:26.215882 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:26.215854 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" Apr 17 16:54:29.284634 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.284602 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b"] Apr 17 16:54:29.285033 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.284807 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" podUID="7c296e6d-7831-43eb-84f7-bfb00b09fee8" containerName="sequence-graph-d0d22" containerID="cri-o://8b45e5c5bb4c584219cc9129bf0f334974000c6ef68c28ea9ccac4077277046e" gracePeriod=30 Apr 17 16:54:29.401357 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.401324 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr"] Apr 17 16:54:29.401746 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.401719 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kserve-container" containerID="cri-o://5a2b0fe91be070865f9fe021bda7927cd22460f70a55f91062d28bfbe507c686" gracePeriod=30 Apr 17 16:54:29.401838 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.401753 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kube-rbac-proxy" containerID="cri-o://2551ac7d78a4435f9578db6bd1a942cde897c29337b4aa71c4b48e830eaca67f" gracePeriod=30 Apr 17 16:54:29.431322 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.431296 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw"] Apr 17 16:54:29.431672 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.431643 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3c11182-aa87-45d2-af1e-3d53777fccea" containerName="ensemble-graph-ac602" Apr 17 16:54:29.431821 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.431675 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3c11182-aa87-45d2-af1e-3d53777fccea" containerName="ensemble-graph-ac602" Apr 17 16:54:29.431821 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.431737 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3c11182-aa87-45d2-af1e-3d53777fccea" containerName="ensemble-graph-ac602" Apr 17 16:54:29.437425 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.437403 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:29.439872 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.439847 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f6018-predictor-serving-cert\"" Apr 17 16:54:29.440683 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.440640 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f6018-kube-rbac-proxy-sar-config\"" Apr 17 16:54:29.444483 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.444458 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw"] Apr 17 16:54:29.473751 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.473723 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck"] Apr 17 16:54:29.474014 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.473979 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kserve-container" containerID="cri-o://b3d78d1801b9c32ec29ae00067862253a0b8e7f5089af6d58bd91290a52825bf" gracePeriod=30 Apr 17 16:54:29.474124 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.474021 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kube-rbac-proxy" containerID="cri-o://69036c0542eec94e6d45ea1966c744e427a419c48ab5a7c4035f45842b8ba64f" gracePeriod=30 Apr 17 16:54:29.506365 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.506339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-f6018-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b7f5f6d-d978-4a83-8d99-40e024607e2b-success-200-isvc-f6018-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f6018-predictor-55f969cd7b-vv7hw\" (UID: \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\") " pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:29.506486 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.506447 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4hpr\" (UniqueName: \"kubernetes.io/projected/6b7f5f6d-d978-4a83-8d99-40e024607e2b-kube-api-access-w4hpr\") pod \"success-200-isvc-f6018-predictor-55f969cd7b-vv7hw\" (UID: \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\") " pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:29.506538 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.506508 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b7f5f6d-d978-4a83-8d99-40e024607e2b-proxy-tls\") pod \"success-200-isvc-f6018-predictor-55f969cd7b-vv7hw\" (UID: \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\") " pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:29.533309 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.533278 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl"] Apr 17 16:54:29.536988 
ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.536945 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:54:29.539299 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.539282 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-f6018-predictor-serving-cert\"" Apr 17 16:54:29.539405 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.539287 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-f6018-kube-rbac-proxy-sar-config\"" Apr 17 16:54:29.555068 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.555039 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl"] Apr 17 16:54:29.607339 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.607305 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b7f5f6d-d978-4a83-8d99-40e024607e2b-proxy-tls\") pod \"success-200-isvc-f6018-predictor-55f969cd7b-vv7hw\" (UID: \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\") " pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:29.607478 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.607360 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-f6018-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b7f5f6d-d978-4a83-8d99-40e024607e2b-success-200-isvc-f6018-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f6018-predictor-55f969cd7b-vv7hw\" (UID: \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\") " pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:29.607478 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.607422 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr4xl\" (UniqueName: \"kubernetes.io/projected/a7a3aa06-adda-4fdb-b593-939a28b6c765-kube-api-access-wr4xl\") pod \"error-404-isvc-f6018-predictor-6c99ffbf99-pllwl\" (UID: \"a7a3aa06-adda-4fdb-b593-939a28b6c765\") " pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:54:29.607478 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:54:29.607448 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-f6018-predictor-serving-cert: secret "success-200-isvc-f6018-predictor-serving-cert" not found Apr 17 16:54:29.607478 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.607469 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-f6018-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7a3aa06-adda-4fdb-b593-939a28b6c765-error-404-isvc-f6018-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f6018-predictor-6c99ffbf99-pllwl\" (UID: \"a7a3aa06-adda-4fdb-b593-939a28b6c765\") " pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:54:29.607670 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:54:29.607514 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b7f5f6d-d978-4a83-8d99-40e024607e2b-proxy-tls podName:6b7f5f6d-d978-4a83-8d99-40e024607e2b nodeName:}" failed. No retries permitted until 2026-04-17 16:54:30.107493161 +0000 UTC m=+1403.456122688 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6b7f5f6d-d978-4a83-8d99-40e024607e2b-proxy-tls") pod "success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" (UID: "6b7f5f6d-d978-4a83-8d99-40e024607e2b") : secret "success-200-isvc-f6018-predictor-serving-cert" not found Apr 17 16:54:29.607745 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.607723 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7a3aa06-adda-4fdb-b593-939a28b6c765-proxy-tls\") pod \"error-404-isvc-f6018-predictor-6c99ffbf99-pllwl\" (UID: \"a7a3aa06-adda-4fdb-b593-939a28b6c765\") " pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:54:29.607799 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.607783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4hpr\" (UniqueName: \"kubernetes.io/projected/6b7f5f6d-d978-4a83-8d99-40e024607e2b-kube-api-access-w4hpr\") pod \"success-200-isvc-f6018-predictor-55f969cd7b-vv7hw\" (UID: \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\") " pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:29.608118 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.608095 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-f6018-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b7f5f6d-d978-4a83-8d99-40e024607e2b-success-200-isvc-f6018-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f6018-predictor-55f969cd7b-vv7hw\" (UID: \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\") " pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:29.616506 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.616482 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4hpr\" (UniqueName: \"kubernetes.io/projected/6b7f5f6d-d978-4a83-8d99-40e024607e2b-kube-api-access-w4hpr\") pod \"success-200-isvc-f6018-predictor-55f969cd7b-vv7hw\" (UID: \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\") " pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:29.708994 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.708964 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7a3aa06-adda-4fdb-b593-939a28b6c765-proxy-tls\") pod \"error-404-isvc-f6018-predictor-6c99ffbf99-pllwl\" (UID: \"a7a3aa06-adda-4fdb-b593-939a28b6c765\") " pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:54:29.709234 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.709051 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wr4xl\" (UniqueName: \"kubernetes.io/projected/a7a3aa06-adda-4fdb-b593-939a28b6c765-kube-api-access-wr4xl\") pod \"error-404-isvc-f6018-predictor-6c99ffbf99-pllwl\" (UID: \"a7a3aa06-adda-4fdb-b593-939a28b6c765\") " pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:54:29.709234 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.709083 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-f6018-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7a3aa06-adda-4fdb-b593-939a28b6c765-error-404-isvc-f6018-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f6018-predictor-6c99ffbf99-pllwl\" (UID: 
\"a7a3aa06-adda-4fdb-b593-939a28b6c765\") " pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:54:29.709714 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.709692 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-f6018-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7a3aa06-adda-4fdb-b593-939a28b6c765-error-404-isvc-f6018-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f6018-predictor-6c99ffbf99-pllwl\" (UID: \"a7a3aa06-adda-4fdb-b593-939a28b6c765\") " pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:54:29.711642 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.711619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7a3aa06-adda-4fdb-b593-939a28b6c765-proxy-tls\") pod \"error-404-isvc-f6018-predictor-6c99ffbf99-pllwl\" (UID: \"a7a3aa06-adda-4fdb-b593-939a28b6c765\") " pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:54:29.716780 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.716758 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr4xl\" (UniqueName: \"kubernetes.io/projected/a7a3aa06-adda-4fdb-b593-939a28b6c765-kube-api-access-wr4xl\") pod \"error-404-isvc-f6018-predictor-6c99ffbf99-pllwl\" (UID: \"a7a3aa06-adda-4fdb-b593-939a28b6c765\") " pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:54:29.856253 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.856161 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:54:29.977027 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:29.976995 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl"] Apr 17 16:54:29.980233 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:54:29.980207 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7a3aa06_adda_4fdb_b593_939a28b6c765.slice/crio-020a3db0e026f2f1186bc4b1ad53a4dcf96ceb927109b70535b4409dd45b7ef6 WatchSource:0}: Error finding container 020a3db0e026f2f1186bc4b1ad53a4dcf96ceb927109b70535b4409dd45b7ef6: Status 404 returned error can't find the container with id 020a3db0e026f2f1186bc4b1ad53a4dcf96ceb927109b70535b4409dd45b7ef6 Apr 17 16:54:30.112618 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:30.112560 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b7f5f6d-d978-4a83-8d99-40e024607e2b-proxy-tls\") pod \"success-200-isvc-f6018-predictor-55f969cd7b-vv7hw\" (UID: \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\") " pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:30.114691 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:30.114669 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b7f5f6d-d978-4a83-8d99-40e024607e2b-proxy-tls\") pod \"success-200-isvc-f6018-predictor-55f969cd7b-vv7hw\" (UID: \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\") " pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:30.238639 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:30.238606 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerID="69036c0542eec94e6d45ea1966c744e427a419c48ab5a7c4035f45842b8ba64f" exitCode=2 Apr 17 16:54:30.238838 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:30.238685 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" event={"ID":"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e","Type":"ContainerDied","Data":"69036c0542eec94e6d45ea1966c744e427a419c48ab5a7c4035f45842b8ba64f"} Apr 17 16:54:30.240050 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:30.240030 2572 generic.go:358] "Generic (PLEG): container finished" podID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerID="2551ac7d78a4435f9578db6bd1a942cde897c29337b4aa71c4b48e830eaca67f" exitCode=2 Apr 17 16:54:30.240142 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:30.240100 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" event={"ID":"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb","Type":"ContainerDied","Data":"2551ac7d78a4435f9578db6bd1a942cde897c29337b4aa71c4b48e830eaca67f"} Apr 17 16:54:30.241498 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:30.241475 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" event={"ID":"a7a3aa06-adda-4fdb-b593-939a28b6c765","Type":"ContainerStarted","Data":"014201ec7fa4af3b49226be0b36dd8048f0df79883e1392d0c27932941aa90ee"} Apr 17 16:54:30.241498 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:30.241500 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" event={"ID":"a7a3aa06-adda-4fdb-b593-939a28b6c765","Type":"ContainerStarted","Data":"3f0a74dcf28c988021de52825c3ed22e12b87421687ec41bb5f5de0aafe2a1cc"} Apr 17 16:54:30.241673 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:30.241510 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" event={"ID":"a7a3aa06-adda-4fdb-b593-939a28b6c765","Type":"ContainerStarted","Data":"020a3db0e026f2f1186bc4b1ad53a4dcf96ceb927109b70535b4409dd45b7ef6"} Apr 17 16:54:30.241673 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:30.241625 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:54:30.260398 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:30.260349 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" podStartSLOduration=1.260336139 podStartE2EDuration="1.260336139s" podCreationTimestamp="2026-04-17 16:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:54:30.258405907 +0000 UTC m=+1403.607035457" watchObservedRunningTime="2026-04-17 16:54:30.260336139 +0000 UTC m=+1403.608965690" Apr 17 16:54:30.351695 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:30.351636 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:30.473733 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:30.473698 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw"] Apr 17 16:54:30.476687 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:54:30.476639 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b7f5f6d_d978_4a83_8d99_40e024607e2b.slice/crio-172e7de9ce66100ac4847d08827e4e5095a9543041c93ea2f0d781859f6addac WatchSource:0}: Error finding container 172e7de9ce66100ac4847d08827e4e5095a9543041c93ea2f0d781859f6addac: Status 404 returned error can't find the container with id 172e7de9ce66100ac4847d08827e4e5095a9543041c93ea2f0d781859f6addac Apr 17 16:54:31.214565 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:31.214527 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" podUID="7c296e6d-7831-43eb-84f7-bfb00b09fee8" containerName="sequence-graph-d0d22" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:54:31.246503 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:31.246467 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" event={"ID":"6b7f5f6d-d978-4a83-8d99-40e024607e2b","Type":"ContainerStarted","Data":"b22e5542aedba72d438d5449937bbb4bcbde38be87c5596c7344d3916dc5a598"} Apr 17 16:54:31.246728 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:31.246510 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" event={"ID":"6b7f5f6d-d978-4a83-8d99-40e024607e2b","Type":"ContainerStarted","Data":"a839286cea67278538528b3b212f454a4efee41839f67dac04048077ba8a679d"} Apr 17 16:54:31.246728 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:31.246532 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" event={"ID":"6b7f5f6d-d978-4a83-8d99-40e024607e2b","Type":"ContainerStarted","Data":"172e7de9ce66100ac4847d08827e4e5095a9543041c93ea2f0d781859f6addac"} Apr 17 16:54:31.246728 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:31.246674 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:31.246728 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:31.246707 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:54:31.248048 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:31.248020 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 17 16:54:31.264448 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:31.264397 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" podStartSLOduration=2.264378562 podStartE2EDuration="2.264378562s" podCreationTimestamp="2026-04-17 16:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:54:31.262547323 +0000 UTC m=+1404.611176872" watchObservedRunningTime="2026-04-17 16:54:31.264378562 +0000 UTC m=+1404.613008111" Apr 17 16:54:31.968967 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:31.968919 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused" Apr 17 16:54:32.134033 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:32.133991 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 17 16:54:32.134424 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:32.134400 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 17 16:54:32.250441 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:32.250336 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:32.250441 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:32.250386 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 17 16:54:32.251719 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:32.251684 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 16:54:32.924510 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:32.924479 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:54:33.038545 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.038451 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-proxy-tls\") pod \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\" (UID: \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\") " Apr 17 16:54:33.038545 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.038546 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvxtj\" (UniqueName: \"kubernetes.io/projected/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-kube-api-access-wvxtj\") pod \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\" (UID: \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\") " Apr 17 16:54:33.039071 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.038585 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-d0d22-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-error-404-isvc-d0d22-kube-rbac-proxy-sar-config\") pod \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\" (UID: \"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e\") " Apr 17 16:54:33.039071 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.038912 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-error-404-isvc-d0d22-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-d0d22-kube-rbac-proxy-sar-config") pod "e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" (UID: "e0f6cda1-d31a-47bf-bca2-34508ccf7d7e"). InnerVolumeSpecName "error-404-isvc-d0d22-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:54:33.040573 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.040553 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" (UID: "e0f6cda1-d31a-47bf-bca2-34508ccf7d7e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:54:33.040573 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.040560 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-kube-api-access-wvxtj" (OuterVolumeSpecName: "kube-api-access-wvxtj") pod "e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" (UID: "e0f6cda1-d31a-47bf-bca2-34508ccf7d7e"). InnerVolumeSpecName "kube-api-access-wvxtj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:54:33.139225 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.139193 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:54:33.139225 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.139224 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wvxtj\" (UniqueName: \"kubernetes.io/projected/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-kube-api-access-wvxtj\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:54:33.139376 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.139240 2572 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-d0d22-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e-error-404-isvc-d0d22-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:54:33.255213 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.255181 2572 generic.go:358] "Generic (PLEG): container finished" podID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerID="5a2b0fe91be070865f9fe021bda7927cd22460f70a55f91062d28bfbe507c686" exitCode=0 Apr 17 16:54:33.255377 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.255316 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" event={"ID":"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb","Type":"ContainerDied","Data":"5a2b0fe91be070865f9fe021bda7927cd22460f70a55f91062d28bfbe507c686"} Apr 17 16:54:33.256700 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.256674 2572 generic.go:358] "Generic (PLEG): container finished" podID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerID="b3d78d1801b9c32ec29ae00067862253a0b8e7f5089af6d58bd91290a52825bf" exitCode=0 Apr 17 16:54:33.256833 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.256757 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" event={"ID":"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e","Type":"ContainerDied","Data":"b3d78d1801b9c32ec29ae00067862253a0b8e7f5089af6d58bd91290a52825bf"} Apr 17 16:54:33.256833 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.256785 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" Apr 17 16:54:33.256833 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.256798 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck" event={"ID":"e0f6cda1-d31a-47bf-bca2-34508ccf7d7e","Type":"ContainerDied","Data":"1fc88a91bce45a6f8b59d1a56c1aa909828e8ec3e5672719b46eb019f2d31029"} Apr 17 16:54:33.256833 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.256818 2572 scope.go:117] "RemoveContainer" containerID="69036c0542eec94e6d45ea1966c744e427a419c48ab5a7c4035f45842b8ba64f" Apr 17 16:54:33.257382 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.257358 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 16:54:33.264635 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.264613 2572 scope.go:117] "RemoveContainer" containerID="b3d78d1801b9c32ec29ae00067862253a0b8e7f5089af6d58bd91290a52825bf" Apr 17 16:54:33.271882 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.271853 2572 scope.go:117] "RemoveContainer" containerID="69036c0542eec94e6d45ea1966c744e427a419c48ab5a7c4035f45842b8ba64f" Apr 17 16:54:33.272151 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:54:33.272130 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69036c0542eec94e6d45ea1966c744e427a419c48ab5a7c4035f45842b8ba64f\": container with ID starting with 69036c0542eec94e6d45ea1966c744e427a419c48ab5a7c4035f45842b8ba64f not found: ID does not exist" containerID="69036c0542eec94e6d45ea1966c744e427a419c48ab5a7c4035f45842b8ba64f" Apr 17 16:54:33.272223 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.272160 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69036c0542eec94e6d45ea1966c744e427a419c48ab5a7c4035f45842b8ba64f"} err="failed to get container status \"69036c0542eec94e6d45ea1966c744e427a419c48ab5a7c4035f45842b8ba64f\": rpc error: code = NotFound desc = could not find container \"69036c0542eec94e6d45ea1966c744e427a419c48ab5a7c4035f45842b8ba64f\": container with ID starting with 69036c0542eec94e6d45ea1966c744e427a419c48ab5a7c4035f45842b8ba64f not found: ID does not exist" Apr 17 16:54:33.272223 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.272179 2572 scope.go:117] "RemoveContainer" containerID="b3d78d1801b9c32ec29ae00067862253a0b8e7f5089af6d58bd91290a52825bf" Apr 17 16:54:33.272414 ip-10-0-138-137 kubenswrapper[2572]: E0417 16:54:33.272398 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3d78d1801b9c32ec29ae00067862253a0b8e7f5089af6d58bd91290a52825bf\": container with ID starting with b3d78d1801b9c32ec29ae00067862253a0b8e7f5089af6d58bd91290a52825bf not found: ID does not exist" containerID="b3d78d1801b9c32ec29ae00067862253a0b8e7f5089af6d58bd91290a52825bf" Apr 17 16:54:33.272461 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.272418 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d78d1801b9c32ec29ae00067862253a0b8e7f5089af6d58bd91290a52825bf"} err="failed to get container status 
\"b3d78d1801b9c32ec29ae00067862253a0b8e7f5089af6d58bd91290a52825bf\": rpc error: code = NotFound desc = could not find container \"b3d78d1801b9c32ec29ae00067862253a0b8e7f5089af6d58bd91290a52825bf\": container with ID starting with b3d78d1801b9c32ec29ae00067862253a0b8e7f5089af6d58bd91290a52825bf not found: ID does not exist" Apr 17 16:54:33.275008 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.274984 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck"] Apr 17 16:54:33.278794 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.278774 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck"] Apr 17 16:54:33.338058 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.338036 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:54:33.441337 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.441301 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d2wx\" (UniqueName: \"kubernetes.io/projected/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-kube-api-access-8d2wx\") pod \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\" (UID: \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\") " Apr 17 16:54:33.441505 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.441374 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-proxy-tls\") pod \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\" (UID: \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\") " Apr 17 16:54:33.441505 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.441397 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-d0d22-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-success-200-isvc-d0d22-kube-rbac-proxy-sar-config\") pod \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\" (UID: \"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb\") " Apr 17 16:54:33.441782 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.441754 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-success-200-isvc-d0d22-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-d0d22-kube-rbac-proxy-sar-config") pod "a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" (UID: "a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb"). InnerVolumeSpecName "success-200-isvc-d0d22-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:54:33.443350 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.443330 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-kube-api-access-8d2wx" (OuterVolumeSpecName: "kube-api-access-8d2wx") pod "a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" (UID: "a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb"). InnerVolumeSpecName "kube-api-access-8d2wx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:54:33.443431 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.443359 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" (UID: "a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:54:33.542208 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.542167 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:54:33.542208 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.542202 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-d0d22-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-success-200-isvc-d0d22-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:54:33.542208 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:33.542214 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8d2wx\" (UniqueName: \"kubernetes.io/projected/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb-kube-api-access-8d2wx\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:54:34.261248 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:34.261219 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" Apr 17 16:54:34.261717 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:34.261218 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr" event={"ID":"a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb","Type":"ContainerDied","Data":"a628befeb1a8519cf28afcb784a8818b99e82c1a9b39dcd1501dd3cdd9beb645"} Apr 17 16:54:34.261717 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:34.261351 2572 scope.go:117] "RemoveContainer" containerID="2551ac7d78a4435f9578db6bd1a942cde897c29337b4aa71c4b48e830eaca67f" Apr 17 16:54:34.270142 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:34.270125 2572 scope.go:117] "RemoveContainer" containerID="5a2b0fe91be070865f9fe021bda7927cd22460f70a55f91062d28bfbe507c686" Apr 17 16:54:34.283059 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:34.283037 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr"] Apr 17 16:54:34.286962 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:34.286944 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr"] Apr 17 16:54:35.228433 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:35.228381 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" path="/var/lib/kubelet/pods/a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb/volumes" Apr 17 16:54:35.228938 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:35.228920 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" path="/var/lib/kubelet/pods/e0f6cda1-d31a-47bf-bca2-34508ccf7d7e/volumes" Apr 17 16:54:36.214234 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:36.214195 2572 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" podUID="7c296e6d-7831-43eb-84f7-bfb00b09fee8" containerName="sequence-graph-d0d22" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:54:37.254946 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:37.254921 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:54:37.255507 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:37.255480 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 17 16:54:38.261797 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:38.261771 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:54:38.262403 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:38.262379 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 16:54:41.214094 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:41.214058 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" podUID="7c296e6d-7831-43eb-84f7-bfb00b09fee8" containerName="sequence-graph-d0d22" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:54:41.214552 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:41.214164 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" Apr 17 16:54:42.134385 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:42.134341 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 17 16:54:42.134790 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:42.134770 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" Apr 17 16:54:46.214492 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:46.214453 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" podUID="7c296e6d-7831-43eb-84f7-bfb00b09fee8" containerName="sequence-graph-d0d22" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:54:47.255687 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:47.255630 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 17 16:54:48.262558 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:48.262518 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 16:54:51.214710 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:51.214673 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" podUID="7c296e6d-7831-43eb-84f7-bfb00b09fee8" containerName="sequence-graph-d0d22" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:54:52.135493 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:52.135463 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" Apr 17 16:54:56.215325 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:56.215235 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" podUID="7c296e6d-7831-43eb-84f7-bfb00b09fee8" containerName="sequence-graph-d0d22" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:54:57.256204 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:57.256169 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 17 16:54:58.262725 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:58.262686 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 16:54:59.336458 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:59.336420 2572 generic.go:358] "Generic (PLEG): container finished" podID="7c296e6d-7831-43eb-84f7-bfb00b09fee8" containerID="8b45e5c5bb4c584219cc9129bf0f334974000c6ef68c28ea9ccac4077277046e" exitCode=0 Apr 17 16:54:59.336836 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:59.336486 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" event={"ID":"7c296e6d-7831-43eb-84f7-bfb00b09fee8","Type":"ContainerDied","Data":"8b45e5c5bb4c584219cc9129bf0f334974000c6ef68c28ea9ccac4077277046e"} Apr 17 16:54:59.423740 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:59.423718 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" Apr 17 16:54:59.461917 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:59.461890 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c296e6d-7831-43eb-84f7-bfb00b09fee8-openshift-service-ca-bundle\") pod \"7c296e6d-7831-43eb-84f7-bfb00b09fee8\" (UID: \"7c296e6d-7831-43eb-84f7-bfb00b09fee8\") " Apr 17 16:54:59.462079 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:59.461958 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c296e6d-7831-43eb-84f7-bfb00b09fee8-proxy-tls\") pod \"7c296e6d-7831-43eb-84f7-bfb00b09fee8\" (UID: \"7c296e6d-7831-43eb-84f7-bfb00b09fee8\") " Apr 17 16:54:59.462291 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:59.462264 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c296e6d-7831-43eb-84f7-bfb00b09fee8-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "7c296e6d-7831-43eb-84f7-bfb00b09fee8" (UID: "7c296e6d-7831-43eb-84f7-bfb00b09fee8"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:54:59.463887 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:59.463867 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c296e6d-7831-43eb-84f7-bfb00b09fee8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7c296e6d-7831-43eb-84f7-bfb00b09fee8" (UID: "7c296e6d-7831-43eb-84f7-bfb00b09fee8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:54:59.562585 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:59.562510 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c296e6d-7831-43eb-84f7-bfb00b09fee8-openshift-service-ca-bundle\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:54:59.562585 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:54:59.562540 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c296e6d-7831-43eb-84f7-bfb00b09fee8-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 16:55:00.340336 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:00.340302 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" event={"ID":"7c296e6d-7831-43eb-84f7-bfb00b09fee8","Type":"ContainerDied","Data":"eda71c8c9a762cf84331337f7ed73a38dd4e34595658205f0e5edb2800111d07"} Apr 17 16:55:00.340336 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:00.340327 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b" Apr 17 16:55:00.340336 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:00.340344 2572 scope.go:117] "RemoveContainer" containerID="8b45e5c5bb4c584219cc9129bf0f334974000c6ef68c28ea9ccac4077277046e" Apr 17 16:55:00.360831 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:00.360802 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b"] Apr 17 16:55:00.363589 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:00.363569 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b"] Apr 17 16:55:01.227830 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:01.227791 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c296e6d-7831-43eb-84f7-bfb00b09fee8" path="/var/lib/kubelet/pods/7c296e6d-7831-43eb-84f7-bfb00b09fee8/volumes" Apr 17 16:55:03.817446 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.817412 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96"] Apr 17 16:55:03.817932 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.817915 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kube-rbac-proxy" Apr 17 16:55:03.817978 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.817935 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kube-rbac-proxy" Apr 17 16:55:03.817978 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.817962 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kserve-container" Apr 17 16:55:03.817978 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.817971 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kserve-container" Apr 17 16:55:03.818071 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.817982 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kube-rbac-proxy" Apr 17 16:55:03.818071 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.817991 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kube-rbac-proxy" Apr 17 16:55:03.818071 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.817999 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kserve-container" Apr 17 16:55:03.818071 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.818007 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kserve-container" Apr 17 16:55:03.818071 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.818020 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c296e6d-7831-43eb-84f7-bfb00b09fee8" containerName="sequence-graph-d0d22" Apr 17 16:55:03.818071 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.818028 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c296e6d-7831-43eb-84f7-bfb00b09fee8" containerName="sequence-graph-d0d22" Apr 17 16:55:03.818331 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.818097 2572 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kube-rbac-proxy" Apr 17 16:55:03.818331 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.818110 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6f8ebf7-b46f-4015-b8d2-49f38ee1dceb" containerName="kserve-container" Apr 17 16:55:03.818331 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.818118 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kube-rbac-proxy" Apr 17 16:55:03.818331 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.818128 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c296e6d-7831-43eb-84f7-bfb00b09fee8" containerName="sequence-graph-d0d22" Apr 17 16:55:03.818331 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.818138 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0f6cda1-d31a-47bf-bca2-34508ccf7d7e" containerName="kserve-container" Apr 17 16:55:03.822813 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.822796 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" Apr 17 16:55:03.825150 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.825125 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-c68dd-kube-rbac-proxy-sar-config\"" Apr 17 16:55:03.825391 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.825368 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-c68dd-serving-cert\"" Apr 17 16:55:03.828429 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.828409 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96"] Apr 17 16:55:03.894115 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.894090 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88b45c30-55a8-411a-9555-1cfce612409c-openshift-service-ca-bundle\") pod \"ensemble-graph-c68dd-c6797d99d-xsj96\" (UID: \"88b45c30-55a8-411a-9555-1cfce612409c\") " pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" Apr 17 16:55:03.894240 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.894125 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88b45c30-55a8-411a-9555-1cfce612409c-proxy-tls\") pod \"ensemble-graph-c68dd-c6797d99d-xsj96\" (UID: \"88b45c30-55a8-411a-9555-1cfce612409c\") " pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" Apr 17 16:55:03.994923 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.994886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88b45c30-55a8-411a-9555-1cfce612409c-openshift-service-ca-bundle\") pod \"ensemble-graph-c68dd-c6797d99d-xsj96\" (UID: \"88b45c30-55a8-411a-9555-1cfce612409c\") " pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" Apr 17 16:55:03.994923 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.994929 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88b45c30-55a8-411a-9555-1cfce612409c-proxy-tls\") pod \"ensemble-graph-c68dd-c6797d99d-xsj96\" (UID: 
\"88b45c30-55a8-411a-9555-1cfce612409c\") " pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" Apr 17 16:55:03.995595 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.995574 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88b45c30-55a8-411a-9555-1cfce612409c-openshift-service-ca-bundle\") pod \"ensemble-graph-c68dd-c6797d99d-xsj96\" (UID: \"88b45c30-55a8-411a-9555-1cfce612409c\") " pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" Apr 17 16:55:03.997376 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:03.997351 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88b45c30-55a8-411a-9555-1cfce612409c-proxy-tls\") pod \"ensemble-graph-c68dd-c6797d99d-xsj96\" (UID: \"88b45c30-55a8-411a-9555-1cfce612409c\") " pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" Apr 17 16:55:04.135258 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:04.135171 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" Apr 17 16:55:04.272931 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:04.272899 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96"] Apr 17 16:55:04.275938 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:55:04.275903 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88b45c30_55a8_411a_9555_1cfce612409c.slice/crio-ec4789d98900a954de4199f32f412a8a5845a1d842ed3c4165e99c64a8033564 WatchSource:0}: Error finding container ec4789d98900a954de4199f32f412a8a5845a1d842ed3c4165e99c64a8033564: Status 404 returned error can't find the container with id ec4789d98900a954de4199f32f412a8a5845a1d842ed3c4165e99c64a8033564 Apr 17 16:55:04.355800 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:04.355763 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" event={"ID":"88b45c30-55a8-411a-9555-1cfce612409c","Type":"ContainerStarted","Data":"c4fdf06b410647520fcf8ded3a7d4d6b0812aab207fac40eedee91a987383355"} Apr 17 16:55:04.355800 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:04.355804 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" event={"ID":"88b45c30-55a8-411a-9555-1cfce612409c","Type":"ContainerStarted","Data":"ec4789d98900a954de4199f32f412a8a5845a1d842ed3c4165e99c64a8033564"} Apr 17 16:55:04.355983 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:04.355832 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" Apr 17 16:55:04.372759 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:04.372713 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" podStartSLOduration=1.372696935 podStartE2EDuration="1.372696935s" podCreationTimestamp="2026-04-17 16:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:55:04.371227427 +0000 UTC m=+1437.719856977" watchObservedRunningTime="2026-04-17 16:55:04.372696935 +0000 UTC m=+1437.721326490" Apr 17 16:55:07.255878 ip-10-0-138-137 kubenswrapper[2572]: I0417 
16:55:07.255842 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 17 16:55:08.262453 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:08.262409 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 16:55:10.364232 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:10.364200 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" Apr 17 16:55:17.256779 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:17.256747 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 16:55:18.263255 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:18.263227 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 16:55:29.495278 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:29.495236 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg"] Apr 17 16:55:29.499904 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:29.499880 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" Apr 17 16:55:29.502486 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:29.502460 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-f6018-serving-cert\"" Apr 17 16:55:29.502578 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:29.502460 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-f6018-kube-rbac-proxy-sar-config\"" Apr 17 16:55:29.508445 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:29.508415 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg"] Apr 17 16:55:29.589593 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:29.589553 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0c2def3-ae1a-4e26-9457-c3814b0f6898-openshift-service-ca-bundle\") pod \"sequence-graph-f6018-777885f489-bxxmg\" (UID: \"d0c2def3-ae1a-4e26-9457-c3814b0f6898\") " pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" Apr 17 16:55:29.589791 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:29.589618 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0c2def3-ae1a-4e26-9457-c3814b0f6898-proxy-tls\") pod \"sequence-graph-f6018-777885f489-bxxmg\" (UID: \"d0c2def3-ae1a-4e26-9457-c3814b0f6898\") " pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" Apr 17 16:55:29.690870 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:29.690834 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0c2def3-ae1a-4e26-9457-c3814b0f6898-openshift-service-ca-bundle\") pod \"sequence-graph-f6018-777885f489-bxxmg\" (UID: \"d0c2def3-ae1a-4e26-9457-c3814b0f6898\") " pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" Apr 17 16:55:29.691024 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:29.690897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0c2def3-ae1a-4e26-9457-c3814b0f6898-proxy-tls\") pod \"sequence-graph-f6018-777885f489-bxxmg\" (UID: \"d0c2def3-ae1a-4e26-9457-c3814b0f6898\") " pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" Apr 17 16:55:29.691464 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:29.691436 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0c2def3-ae1a-4e26-9457-c3814b0f6898-openshift-service-ca-bundle\") pod \"sequence-graph-f6018-777885f489-bxxmg\" (UID: \"d0c2def3-ae1a-4e26-9457-c3814b0f6898\") " pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" Apr 17 16:55:29.693140 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:29.693119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0c2def3-ae1a-4e26-9457-c3814b0f6898-proxy-tls\") pod \"sequence-graph-f6018-777885f489-bxxmg\" (UID: \"d0c2def3-ae1a-4e26-9457-c3814b0f6898\") " pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" Apr 17 16:55:29.811496 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:29.811404 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" Apr 17 16:55:29.933266 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:29.933233 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg"] Apr 17 16:55:29.936297 ip-10-0-138-137 kubenswrapper[2572]: W0417 16:55:29.936267 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0c2def3_ae1a_4e26_9457_c3814b0f6898.slice/crio-d9a7463b128f6cb702dd3e440b7eba4f9e6ccd38ef0851631451d71f40dfcdc4 WatchSource:0}: Error finding container d9a7463b128f6cb702dd3e440b7eba4f9e6ccd38ef0851631451d71f40dfcdc4: Status 404 returned error can't find the container with id d9a7463b128f6cb702dd3e440b7eba4f9e6ccd38ef0851631451d71f40dfcdc4 Apr 17 16:55:30.433886 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:30.433849 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" event={"ID":"d0c2def3-ae1a-4e26-9457-c3814b0f6898","Type":"ContainerStarted","Data":"146ef7983643dbe8e244c06b5d293d9c4b899fae403f5a18d5de8981a6e18407"} Apr 17 16:55:30.433886 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:30.433885 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" event={"ID":"d0c2def3-ae1a-4e26-9457-c3814b0f6898","Type":"ContainerStarted","Data":"d9a7463b128f6cb702dd3e440b7eba4f9e6ccd38ef0851631451d71f40dfcdc4"} Apr 17 16:55:30.434096 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:30.433976 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" Apr 17 16:55:30.449299 ip-10-0-138-137 kubenswrapper[2572]: 
I0417 16:55:30.449249 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" podStartSLOduration=1.449235168 podStartE2EDuration="1.449235168s" podCreationTimestamp="2026-04-17 16:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:55:30.448087788 +0000 UTC m=+1463.796717337" watchObservedRunningTime="2026-04-17 16:55:30.449235168 +0000 UTC m=+1463.797864717" Apr 17 16:55:36.442100 ip-10-0-138-137 kubenswrapper[2572]: I0417 16:55:36.442069 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" Apr 17 17:03:18.637353 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.637318 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96"] Apr 17 17:03:18.639710 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.637560 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" podUID="88b45c30-55a8-411a-9555-1cfce612409c" containerName="ensemble-graph-c68dd" containerID="cri-o://c4fdf06b410647520fcf8ded3a7d4d6b0812aab207fac40eedee91a987383355" gracePeriod=30 Apr 17 17:03:18.757876 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.757842 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6"] Apr 17 17:03:18.758155 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.758112 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kserve-container" containerID="cri-o://0ce2802119bdfaef0c1f8a8614f61a13c2a94405f9a786e7d99a032de4e33f6e" gracePeriod=30 Apr 17 17:03:18.758155 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.758137 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kube-rbac-proxy" containerID="cri-o://27800c5e3088762427db1057d8a7c11536d7f11863973cce20e402ffac6e24ff" gracePeriod=30 Apr 17 17:03:18.843549 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.843518 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp"] Apr 17 17:03:18.846970 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.846948 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" Apr 17 17:03:18.849401 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.849377 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-41c4d-predictor-serving-cert\"" Apr 17 17:03:18.849563 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.849525 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-41c4d-kube-rbac-proxy-sar-config\"" Apr 17 17:03:18.859928 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.859904 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq"] Apr 17 17:03:18.860247 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.860222 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kserve-container" containerID="cri-o://1b5cfed0c30b3425e49c23e3336a67241f99b2f06a521c289b58aeaf0091a43e" gracePeriod=30 Apr 17 17:03:18.860329 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.860295 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kube-rbac-proxy" containerID="cri-o://e84628da16b08f9f65fb787a7dd44bf1ce516c73d3cd7416064fbb97c6c254a7" gracePeriod=30 Apr 17 17:03:18.869712 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.868546 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp"] Apr 17 17:03:18.895422 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.895364 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcxnd\" (UniqueName: \"kubernetes.io/projected/286f4108-4887-46df-87ae-8c5cdb92a6e6-kube-api-access-kcxnd\") pod \"success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp\" (UID: \"286f4108-4887-46df-87ae-8c5cdb92a6e6\") " pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" Apr 17 17:03:18.895534 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.895431 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-41c4d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/286f4108-4887-46df-87ae-8c5cdb92a6e6-success-200-isvc-41c4d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp\" (UID: \"286f4108-4887-46df-87ae-8c5cdb92a6e6\") " pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" Apr 17 17:03:18.895534 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.895508 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/286f4108-4887-46df-87ae-8c5cdb92a6e6-proxy-tls\") pod \"success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp\" (UID: \"286f4108-4887-46df-87ae-8c5cdb92a6e6\") " pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" Apr 17 17:03:18.916405 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.916380 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb"] Apr 17 17:03:18.919713 
ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.919696 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" Apr 17 17:03:18.922122 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.922100 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-41c4d-predictor-serving-cert\"" Apr 17 17:03:18.922204 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.922150 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-41c4d-kube-rbac-proxy-sar-config\"" Apr 17 17:03:18.930907 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.930885 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb"] Apr 17 17:03:18.996321 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.996240 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2btb\" (UniqueName: \"kubernetes.io/projected/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-kube-api-access-g2btb\") pod \"error-404-isvc-41c4d-predictor-ffc88656f-4gvqb\" (UID: \"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\") " pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" Apr 17 17:03:18.996321 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.996292 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-proxy-tls\") pod \"error-404-isvc-41c4d-predictor-ffc88656f-4gvqb\" (UID: \"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\") " pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" Apr 17 17:03:18.996321 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.996319 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-41c4d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-error-404-isvc-41c4d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-41c4d-predictor-ffc88656f-4gvqb\" (UID: \"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\") " pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" Apr 17 17:03:18.996573 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.996350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcxnd\" (UniqueName: \"kubernetes.io/projected/286f4108-4887-46df-87ae-8c5cdb92a6e6-kube-api-access-kcxnd\") pod \"success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp\" (UID: \"286f4108-4887-46df-87ae-8c5cdb92a6e6\") " pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" Apr 17 17:03:18.996573 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.996455 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-41c4d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/286f4108-4887-46df-87ae-8c5cdb92a6e6-success-200-isvc-41c4d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp\" (UID: \"286f4108-4887-46df-87ae-8c5cdb92a6e6\") " pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" Apr 17 17:03:18.996573 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.996508 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/286f4108-4887-46df-87ae-8c5cdb92a6e6-proxy-tls\") pod \"success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp\" (UID: \"286f4108-4887-46df-87ae-8c5cdb92a6e6\") " pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" Apr 17 17:03:18.997227 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.997204 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-41c4d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/286f4108-4887-46df-87ae-8c5cdb92a6e6-success-200-isvc-41c4d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp\" (UID: \"286f4108-4887-46df-87ae-8c5cdb92a6e6\") " pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" Apr 17 17:03:18.998713 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:18.998694 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/286f4108-4887-46df-87ae-8c5cdb92a6e6-proxy-tls\") pod \"success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp\" (UID: \"286f4108-4887-46df-87ae-8c5cdb92a6e6\") " pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" Apr 17 17:03:19.009418 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.009397 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcxnd\" (UniqueName: \"kubernetes.io/projected/286f4108-4887-46df-87ae-8c5cdb92a6e6-kube-api-access-kcxnd\") pod \"success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp\" (UID: \"286f4108-4887-46df-87ae-8c5cdb92a6e6\") " pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" Apr 17 17:03:19.097922 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.097887 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2btb\" (UniqueName: \"kubernetes.io/projected/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-kube-api-access-g2btb\") pod \"error-404-isvc-41c4d-predictor-ffc88656f-4gvqb\" (UID: \"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\") " pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" Apr 17 17:03:19.098093 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.097928 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-proxy-tls\") pod \"error-404-isvc-41c4d-predictor-ffc88656f-4gvqb\" (UID: \"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\") " pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" Apr 17 17:03:19.098093 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.097952 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-41c4d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-error-404-isvc-41c4d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-41c4d-predictor-ffc88656f-4gvqb\" (UID: \"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\") " pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" Apr 17 17:03:19.098744 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.098719 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-41c4d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-error-404-isvc-41c4d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-41c4d-predictor-ffc88656f-4gvqb\" (UID: 
\"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\") " pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" Apr 17 17:03:19.100344 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.100320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-proxy-tls\") pod \"error-404-isvc-41c4d-predictor-ffc88656f-4gvqb\" (UID: \"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\") " pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" Apr 17 17:03:19.107526 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.107504 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2btb\" (UniqueName: \"kubernetes.io/projected/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-kube-api-access-g2btb\") pod \"error-404-isvc-41c4d-predictor-ffc88656f-4gvqb\" (UID: \"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\") " pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" Apr 17 17:03:19.159666 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.159558 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" Apr 17 17:03:19.232587 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.232554 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" Apr 17 17:03:19.280551 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.280518 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp"] Apr 17 17:03:19.284493 ip-10-0-138-137 kubenswrapper[2572]: W0417 17:03:19.284431 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod286f4108_4887_46df_87ae_8c5cdb92a6e6.slice/crio-33cd43eaca7f73cadb60c51c525f0c48583c96eb3850d128e1028501eac194e7 WatchSource:0}: Error finding container 33cd43eaca7f73cadb60c51c525f0c48583c96eb3850d128e1028501eac194e7: Status 404 returned error can't find the container with id 33cd43eaca7f73cadb60c51c525f0c48583c96eb3850d128e1028501eac194e7 Apr 17 17:03:19.286547 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.286521 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:03:19.358630 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.358610 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb"] Apr 17 17:03:19.360933 ip-10-0-138-137 kubenswrapper[2572]: W0417 17:03:19.360907 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1ecf353_5c1a_48eb_8bd5_66b1a725f4bb.slice/crio-910f64eea9f4cb5a7b84dd6689410f20645f07d3422adfaba60a24d2496a8160 WatchSource:0}: Error finding container 910f64eea9f4cb5a7b84dd6689410f20645f07d3422adfaba60a24d2496a8160: Status 404 returned error can't find the container with id 910f64eea9f4cb5a7b84dd6689410f20645f07d3422adfaba60a24d2496a8160 Apr 17 17:03:19.762344 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.762299 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" event={"ID":"286f4108-4887-46df-87ae-8c5cdb92a6e6","Type":"ContainerStarted","Data":"c65a723acc82a71085d4727ab22ae5963a034328d9b64087f6d4eab56b49dcda"} 
Apr 17 17:03:19.762848 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.762348 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" event={"ID":"286f4108-4887-46df-87ae-8c5cdb92a6e6","Type":"ContainerStarted","Data":"2dbc759f67c0cef691897693a8bd8b6fb3d3748b6177ab25dc3273494df34727"}
Apr 17 17:03:19.762848 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.762370 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" event={"ID":"286f4108-4887-46df-87ae-8c5cdb92a6e6","Type":"ContainerStarted","Data":"33cd43eaca7f73cadb60c51c525f0c48583c96eb3850d128e1028501eac194e7"}
Apr 17 17:03:19.762848 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.762440 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp"
Apr 17 17:03:19.764005 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.763978 2572 generic.go:358] "Generic (PLEG): container finished" podID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerID="27800c5e3088762427db1057d8a7c11536d7f11863973cce20e402ffac6e24ff" exitCode=2
Apr 17 17:03:19.764133 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.764035 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" event={"ID":"b10bfd64-2fd9-43bd-9507-303d487de5a1","Type":"ContainerDied","Data":"27800c5e3088762427db1057d8a7c11536d7f11863973cce20e402ffac6e24ff"}
Apr 17 17:03:19.765537 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.765512 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" event={"ID":"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb","Type":"ContainerStarted","Data":"a53a6e33a3f61fb2825bddeddcb6fe57603b6e9304b4519869aa6f1dd1da8e8a"}
Apr 17 17:03:19.765754 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.765543 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" event={"ID":"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb","Type":"ContainerStarted","Data":"7699ca43b73cb155734938667d29f1b0ee60132c72fa4fd3e09a35e704cb7636"}
Apr 17 17:03:19.765754 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.765556 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" event={"ID":"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb","Type":"ContainerStarted","Data":"910f64eea9f4cb5a7b84dd6689410f20645f07d3422adfaba60a24d2496a8160"}
Apr 17 17:03:19.765754 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.765581 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb"
Apr 17 17:03:19.767080 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.767060 2572 generic.go:358] "Generic (PLEG): container finished" podID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerID="e84628da16b08f9f65fb787a7dd44bf1ce516c73d3cd7416064fbb97c6c254a7" exitCode=2
Apr 17 17:03:19.767206 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.767107 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" event={"ID":"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1","Type":"ContainerDied","Data":"e84628da16b08f9f65fb787a7dd44bf1ce516c73d3cd7416064fbb97c6c254a7"}
Apr 17 17:03:19.782166 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.782122 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" podStartSLOduration=1.7821102290000002 podStartE2EDuration="1.782110229s" podCreationTimestamp="2026-04-17 17:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:03:19.780622504 +0000 UTC m=+1933.129252054" watchObservedRunningTime="2026-04-17 17:03:19.782110229 +0000 UTC m=+1933.130739777"
Apr 17 17:03:19.799186 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:19.799140 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" podStartSLOduration=1.799126338 podStartE2EDuration="1.799126338s" podCreationTimestamp="2026-04-17 17:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:03:19.798784578 +0000 UTC m=+1933.147414127" watchObservedRunningTime="2026-04-17 17:03:19.799126338 +0000 UTC m=+1933.147755887"
Apr 17 17:03:20.363017 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:20.362979 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" podUID="88b45c30-55a8-411a-9555-1cfce612409c" containerName="ensemble-graph-c68dd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:03:20.770609 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:20.770578 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp"
Apr 17 17:03:20.770998 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:20.770619 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb"
Apr 17 17:03:20.771679 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:20.771632 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 17 17:03:20.771818 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:20.771721 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 17 17:03:21.774103 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:21.774056 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 17 17:03:21.774532 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:21.774130 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 17 17:03:22.129856 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.129482 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused"
Apr 17 17:03:22.134799 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.134761 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 17 17:03:22.205681 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.205638 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6"
Apr 17 17:03:22.298139 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.298119 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq"
Apr 17 17:03:22.326785 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.326757 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b10bfd64-2fd9-43bd-9507-303d487de5a1-proxy-tls\") pod \"b10bfd64-2fd9-43bd-9507-303d487de5a1\" (UID: \"b10bfd64-2fd9-43bd-9507-303d487de5a1\") "
Apr 17 17:03:22.326933 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.326804 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbqw2\" (UniqueName: \"kubernetes.io/projected/b10bfd64-2fd9-43bd-9507-303d487de5a1-kube-api-access-wbqw2\") pod \"b10bfd64-2fd9-43bd-9507-303d487de5a1\" (UID: \"b10bfd64-2fd9-43bd-9507-303d487de5a1\") "
Apr 17 17:03:22.326933 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.326877 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-c68dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b10bfd64-2fd9-43bd-9507-303d487de5a1-success-200-isvc-c68dd-kube-rbac-proxy-sar-config\") pod \"b10bfd64-2fd9-43bd-9507-303d487de5a1\" (UID: \"b10bfd64-2fd9-43bd-9507-303d487de5a1\") "
Apr 17 17:03:22.328201 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.327282 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b10bfd64-2fd9-43bd-9507-303d487de5a1-success-200-isvc-c68dd-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-c68dd-kube-rbac-proxy-sar-config") pod "b10bfd64-2fd9-43bd-9507-303d487de5a1" (UID: "b10bfd64-2fd9-43bd-9507-303d487de5a1"). InnerVolumeSpecName "success-200-isvc-c68dd-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:03:22.331677 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.329222 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-c68dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b10bfd64-2fd9-43bd-9507-303d487de5a1-success-200-isvc-c68dd-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 17:03:22.334455 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.334340 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b10bfd64-2fd9-43bd-9507-303d487de5a1-kube-api-access-wbqw2" (OuterVolumeSpecName: "kube-api-access-wbqw2") pod "b10bfd64-2fd9-43bd-9507-303d487de5a1" (UID: "b10bfd64-2fd9-43bd-9507-303d487de5a1"). InnerVolumeSpecName "kube-api-access-wbqw2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:03:22.335082 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.335057 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b10bfd64-2fd9-43bd-9507-303d487de5a1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b10bfd64-2fd9-43bd-9507-303d487de5a1" (UID: "b10bfd64-2fd9-43bd-9507-303d487de5a1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:03:22.430116 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.430079 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-c68dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-error-404-isvc-c68dd-kube-rbac-proxy-sar-config\") pod \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\" (UID: \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\") "
Apr 17 17:03:22.430259 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.430165 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7scb\" (UniqueName: \"kubernetes.io/projected/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-kube-api-access-w7scb\") pod \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\" (UID: \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\") "
Apr 17 17:03:22.430259 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.430198 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-proxy-tls\") pod \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\" (UID: \"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1\") "
Apr 17 17:03:22.430405 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.430386 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b10bfd64-2fd9-43bd-9507-303d487de5a1-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 17:03:22.430460 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.430411 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wbqw2\" (UniqueName: \"kubernetes.io/projected/b10bfd64-2fd9-43bd-9507-303d487de5a1-kube-api-access-wbqw2\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 17:03:22.430501 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.430459 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-error-404-isvc-c68dd-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-c68dd-kube-rbac-proxy-sar-config") pod "bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" (UID: "bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1"). InnerVolumeSpecName "error-404-isvc-c68dd-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:03:22.432155 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.432126 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-kube-api-access-w7scb" (OuterVolumeSpecName: "kube-api-access-w7scb") pod "bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" (UID: "bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1"). InnerVolumeSpecName "kube-api-access-w7scb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:03:22.432239 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.432187 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" (UID: "bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:03:22.531572 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.531537 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w7scb\" (UniqueName: \"kubernetes.io/projected/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-kube-api-access-w7scb\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 17:03:22.531572 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.531564 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 17:03:22.531572 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.531574 2572 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-c68dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1-error-404-isvc-c68dd-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 17:03:22.777411 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.777373 2572 generic.go:358] "Generic (PLEG): container finished" podID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerID="0ce2802119bdfaef0c1f8a8614f61a13c2a94405f9a786e7d99a032de4e33f6e" exitCode=0
Apr 17 17:03:22.777868 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.777463 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" event={"ID":"b10bfd64-2fd9-43bd-9507-303d487de5a1","Type":"ContainerDied","Data":"0ce2802119bdfaef0c1f8a8614f61a13c2a94405f9a786e7d99a032de4e33f6e"}
Apr 17 17:03:22.777868 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.777473 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6"
Apr 17 17:03:22.777868 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.777498 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" event={"ID":"b10bfd64-2fd9-43bd-9507-303d487de5a1","Type":"ContainerDied","Data":"ba2126324612064d940a558216f12eb2d011cdbf38d67d71fe16839aec9f59b5"}
Apr 17 17:03:22.777868 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.777517 2572 scope.go:117] "RemoveContainer" containerID="27800c5e3088762427db1057d8a7c11536d7f11863973cce20e402ffac6e24ff"
Apr 17 17:03:22.778961 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.778940 2572 generic.go:358] "Generic (PLEG): container finished" podID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerID="1b5cfed0c30b3425e49c23e3336a67241f99b2f06a521c289b58aeaf0091a43e" exitCode=0
Apr 17 17:03:22.779071 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.779036 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" event={"ID":"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1","Type":"ContainerDied","Data":"1b5cfed0c30b3425e49c23e3336a67241f99b2f06a521c289b58aeaf0091a43e"}
Apr 17 17:03:22.779071 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.779057 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq"
Apr 17 17:03:22.779162 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.779071 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq" event={"ID":"bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1","Type":"ContainerDied","Data":"f496bf844e1ebc30d9b761f90373f776d6d8e62951efac19c72dfc77f5c21031"}
Apr 17 17:03:22.786155 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.786129 2572 scope.go:117] "RemoveContainer" containerID="0ce2802119bdfaef0c1f8a8614f61a13c2a94405f9a786e7d99a032de4e33f6e"
Apr 17 17:03:22.793621 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.793602 2572 scope.go:117] "RemoveContainer" containerID="27800c5e3088762427db1057d8a7c11536d7f11863973cce20e402ffac6e24ff"
Apr 17 17:03:22.793901 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:03:22.793884 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27800c5e3088762427db1057d8a7c11536d7f11863973cce20e402ffac6e24ff\": container with ID starting with 27800c5e3088762427db1057d8a7c11536d7f11863973cce20e402ffac6e24ff not found: ID does not exist" containerID="27800c5e3088762427db1057d8a7c11536d7f11863973cce20e402ffac6e24ff"
Apr 17 17:03:22.793972 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.793909 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27800c5e3088762427db1057d8a7c11536d7f11863973cce20e402ffac6e24ff"} err="failed to get container status \"27800c5e3088762427db1057d8a7c11536d7f11863973cce20e402ffac6e24ff\": rpc error: code = NotFound desc = could not find container \"27800c5e3088762427db1057d8a7c11536d7f11863973cce20e402ffac6e24ff\": container with ID starting with 27800c5e3088762427db1057d8a7c11536d7f11863973cce20e402ffac6e24ff not found: ID does not exist"
Apr 17 17:03:22.793972 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.793926 2572 scope.go:117] "RemoveContainer" containerID="0ce2802119bdfaef0c1f8a8614f61a13c2a94405f9a786e7d99a032de4e33f6e"
Apr 17 17:03:22.794156 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:03:22.794137 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce2802119bdfaef0c1f8a8614f61a13c2a94405f9a786e7d99a032de4e33f6e\": container with ID starting with 0ce2802119bdfaef0c1f8a8614f61a13c2a94405f9a786e7d99a032de4e33f6e not found: ID does not exist" containerID="0ce2802119bdfaef0c1f8a8614f61a13c2a94405f9a786e7d99a032de4e33f6e"
Apr 17 17:03:22.794201 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.794162 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce2802119bdfaef0c1f8a8614f61a13c2a94405f9a786e7d99a032de4e33f6e"} err="failed to get container status \"0ce2802119bdfaef0c1f8a8614f61a13c2a94405f9a786e7d99a032de4e33f6e\": rpc error: code = NotFound desc = could not find container \"0ce2802119bdfaef0c1f8a8614f61a13c2a94405f9a786e7d99a032de4e33f6e\": container with ID starting with 0ce2802119bdfaef0c1f8a8614f61a13c2a94405f9a786e7d99a032de4e33f6e not found: ID does not exist"
Apr 17 17:03:22.794201 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.794177 2572 scope.go:117] "RemoveContainer" containerID="e84628da16b08f9f65fb787a7dd44bf1ce516c73d3cd7416064fbb97c6c254a7"
Apr 17 17:03:22.801697 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.801683 2572 scope.go:117] "RemoveContainer" containerID="1b5cfed0c30b3425e49c23e3336a67241f99b2f06a521c289b58aeaf0091a43e"
Apr 17 17:03:22.804602 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.804572 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq"]
Apr 17 17:03:22.808887 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.808872 2572 scope.go:117] "RemoveContainer" containerID="e84628da16b08f9f65fb787a7dd44bf1ce516c73d3cd7416064fbb97c6c254a7"
Apr 17 17:03:22.809189 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:03:22.809164 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e84628da16b08f9f65fb787a7dd44bf1ce516c73d3cd7416064fbb97c6c254a7\": container with ID starting with e84628da16b08f9f65fb787a7dd44bf1ce516c73d3cd7416064fbb97c6c254a7 not found: ID does not exist" containerID="e84628da16b08f9f65fb787a7dd44bf1ce516c73d3cd7416064fbb97c6c254a7"
Apr 17 17:03:22.809268 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.809195 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84628da16b08f9f65fb787a7dd44bf1ce516c73d3cd7416064fbb97c6c254a7"} err="failed to get container status \"e84628da16b08f9f65fb787a7dd44bf1ce516c73d3cd7416064fbb97c6c254a7\": rpc error: code = NotFound desc = could not find container \"e84628da16b08f9f65fb787a7dd44bf1ce516c73d3cd7416064fbb97c6c254a7\": container with ID starting with e84628da16b08f9f65fb787a7dd44bf1ce516c73d3cd7416064fbb97c6c254a7 not found: ID does not exist"
Apr 17 17:03:22.809268 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.809212 2572 scope.go:117] "RemoveContainer" containerID="1b5cfed0c30b3425e49c23e3336a67241f99b2f06a521c289b58aeaf0091a43e"
Apr 17 17:03:22.809512 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:03:22.809490 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b5cfed0c30b3425e49c23e3336a67241f99b2f06a521c289b58aeaf0091a43e\": container with ID starting with 1b5cfed0c30b3425e49c23e3336a67241f99b2f06a521c289b58aeaf0091a43e not found: ID does not exist" containerID="1b5cfed0c30b3425e49c23e3336a67241f99b2f06a521c289b58aeaf0091a43e"
Apr 17 17:03:22.809585 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.809521 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b5cfed0c30b3425e49c23e3336a67241f99b2f06a521c289b58aeaf0091a43e"} err="failed to get container status \"1b5cfed0c30b3425e49c23e3336a67241f99b2f06a521c289b58aeaf0091a43e\": rpc error: code = NotFound desc = could not find container \"1b5cfed0c30b3425e49c23e3336a67241f99b2f06a521c289b58aeaf0091a43e\": container with ID starting with 1b5cfed0c30b3425e49c23e3336a67241f99b2f06a521c289b58aeaf0091a43e not found: ID does not exist"
Apr 17 17:03:22.810483 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.810466 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq"]
Apr 17 17:03:22.822551 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.822528 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6"]
Apr 17 17:03:22.828747 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:22.828729 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6"]
Apr 17 17:03:23.129313 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:23.129271 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.36:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Apr 17 17:03:23.135449 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:23.135419 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: i/o timeout"
Apr 17 17:03:23.226166 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:23.226133 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" path="/var/lib/kubelet/pods/b10bfd64-2fd9-43bd-9507-303d487de5a1/volumes"
Apr 17 17:03:23.226547 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:23.226534 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" path="/var/lib/kubelet/pods/bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1/volumes"
Apr 17 17:03:25.362539 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:25.362506 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" podUID="88b45c30-55a8-411a-9555-1cfce612409c" containerName="ensemble-graph-c68dd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:03:26.778716 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:26.778682 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp"
Apr 17 17:03:26.779231 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:26.778888 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb"
Apr 17 17:03:26.779231 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:26.779089 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 17 17:03:26.779428 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:26.779408 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 17 17:03:30.363262 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:30.363225 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" podUID="88b45c30-55a8-411a-9555-1cfce612409c" containerName="ensemble-graph-c68dd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:03:30.363757 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:30.363346 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96"
Apr 17 17:03:35.362511 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:35.362475 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" podUID="88b45c30-55a8-411a-9555-1cfce612409c" containerName="ensemble-graph-c68dd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:03:36.779691 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:36.779641 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 17 17:03:36.780112 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:36.779638 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 17 17:03:40.363378 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:40.363329 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" podUID="88b45c30-55a8-411a-9555-1cfce612409c" containerName="ensemble-graph-c68dd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:03:44.193059 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.193023 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg"]
Apr 17 17:03:44.193411 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.193261 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" podUID="d0c2def3-ae1a-4e26-9457-c3814b0f6898" containerName="sequence-graph-f6018" containerID="cri-o://146ef7983643dbe8e244c06b5d293d9c4b899fae403f5a18d5de8981a6e18407" gracePeriod=30
Apr 17 17:03:44.266693 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.266643 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl"]
Apr 17 17:03:44.266982 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.266955 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kserve-container" containerID="cri-o://3f0a74dcf28c988021de52825c3ed22e12b87421687ec41bb5f5de0aafe2a1cc" gracePeriod=30
Apr 17 17:03:44.267075 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.267014 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kube-rbac-proxy" containerID="cri-o://014201ec7fa4af3b49226be0b36dd8048f0df79883e1392d0c27932941aa90ee" gracePeriod=30
Apr 17 17:03:44.318894 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.318859 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw"]
Apr 17 17:03:44.319230 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.319203 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerName="kserve-container" containerID="cri-o://a839286cea67278538528b3b212f454a4efee41839f67dac04048077ba8a679d" gracePeriod=30
Apr 17 17:03:44.319333 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.319264 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerName="kube-rbac-proxy" containerID="cri-o://b22e5542aedba72d438d5449937bbb4bcbde38be87c5596c7344d3916dc5a598" gracePeriod=30
Apr 17 17:03:44.353493 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.353461 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz"]
Apr 17 17:03:44.353820 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.353806 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kserve-container"
Apr 17 17:03:44.353820 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.353820 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kserve-container"
Apr 17 17:03:44.353951 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.353831 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kube-rbac-proxy"
Apr 17 17:03:44.353951 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.353837 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kube-rbac-proxy"
Apr 17 17:03:44.353951 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.353855 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kserve-container"
Apr 17 17:03:44.353951 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.353860 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kserve-container"
Apr 17 17:03:44.353951 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.353867 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kube-rbac-proxy"
Apr 17 17:03:44.353951 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.353874 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kube-rbac-proxy"
Apr 17 17:03:44.353951 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.353920 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kserve-container"
Apr 17 17:03:44.353951 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.353929 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kserve-container"
Apr 17 17:03:44.353951 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.353939 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b10bfd64-2fd9-43bd-9507-303d487de5a1" containerName="kube-rbac-proxy"
Apr 17 17:03:44.353951 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.353945 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bfd7ff5e-fc98-4023-8ffc-c410bedc4ae1" containerName="kube-rbac-proxy"
Apr 17 17:03:44.357059 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.357035 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz"
Apr 17 17:03:44.359474 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.359445 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-6941b-predictor-serving-cert\""
Apr 17 17:03:44.359576 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.359479 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-6941b-kube-rbac-proxy-sar-config\""
Apr 17 17:03:44.369342 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.369324 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz"]
Apr 17 17:03:44.459791 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.459761 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424"]
Apr 17 17:03:44.463085 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.463068 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424"
Apr 17 17:03:44.465533 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.465513 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-6941b-predictor-serving-cert\""
Apr 17 17:03:44.465678 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.465589 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-6941b-kube-rbac-proxy-sar-config\""
Apr 17 17:03:44.474929 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.474905 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424"]
Apr 17 17:03:44.504546 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.504521 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b227b01e-aa15-4737-8813-9e1a7021a599-proxy-tls\") pod \"success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz\" (UID: \"b227b01e-aa15-4737-8813-9e1a7021a599\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz"
Apr 17 17:03:44.504709 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.504553 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b227b01e-aa15-4737-8813-9e1a7021a599-success-200-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz\" (UID: \"b227b01e-aa15-4737-8813-9e1a7021a599\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz"
Apr 17 17:03:44.504709 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.504578 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl9g5\" (UniqueName: \"kubernetes.io/projected/b227b01e-aa15-4737-8813-9e1a7021a599-kube-api-access-nl9g5\") pod \"success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz\" (UID: \"b227b01e-aa15-4737-8813-9e1a7021a599\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz"
Apr 17 17:03:44.606125 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.606084 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qtjz\" (UniqueName: \"kubernetes.io/projected/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-kube-api-access-7qtjz\") pod \"error-404-isvc-6941b-predictor-79dfb7969c-rn424\" (UID: \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424"
Apr 17 17:03:44.606269 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.606136 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-proxy-tls\") pod \"error-404-isvc-6941b-predictor-79dfb7969c-rn424\" (UID: \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424"
Apr 17 17:03:44.606269 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.606179 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b227b01e-aa15-4737-8813-9e1a7021a599-proxy-tls\") pod \"success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz\"
(UID: \"b227b01e-aa15-4737-8813-9e1a7021a599\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" Apr 17 17:03:44.606269 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.606208 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b227b01e-aa15-4737-8813-9e1a7021a599-success-200-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz\" (UID: \"b227b01e-aa15-4737-8813-9e1a7021a599\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" Apr 17 17:03:44.606269 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.606247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nl9g5\" (UniqueName: \"kubernetes.io/projected/b227b01e-aa15-4737-8813-9e1a7021a599-kube-api-access-nl9g5\") pod \"success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz\" (UID: \"b227b01e-aa15-4737-8813-9e1a7021a599\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" Apr 17 17:03:44.606480 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.606291 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-error-404-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6941b-predictor-79dfb7969c-rn424\" (UID: \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" Apr 17 17:03:44.606480 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:03:44.606313 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-6941b-predictor-serving-cert: secret "success-200-isvc-6941b-predictor-serving-cert" not found Apr 17 17:03:44.606480 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:03:44.606397 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b227b01e-aa15-4737-8813-9e1a7021a599-proxy-tls podName:b227b01e-aa15-4737-8813-9e1a7021a599 nodeName:}" failed. No retries permitted until 2026-04-17 17:03:45.106373348 +0000 UTC m=+1958.455002877 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b227b01e-aa15-4737-8813-9e1a7021a599-proxy-tls") pod "success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" (UID: "b227b01e-aa15-4737-8813-9e1a7021a599") : secret "success-200-isvc-6941b-predictor-serving-cert" not found Apr 17 17:03:44.606913 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.606894 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b227b01e-aa15-4737-8813-9e1a7021a599-success-200-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz\" (UID: \"b227b01e-aa15-4737-8813-9e1a7021a599\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" Apr 17 17:03:44.614175 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.614155 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl9g5\" (UniqueName: \"kubernetes.io/projected/b227b01e-aa15-4737-8813-9e1a7021a599-kube-api-access-nl9g5\") pod \"success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz\" (UID: \"b227b01e-aa15-4737-8813-9e1a7021a599\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" Apr 17 17:03:44.707027 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.706990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qtjz\" (UniqueName: \"kubernetes.io/projected/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-kube-api-access-7qtjz\") pod \"error-404-isvc-6941b-predictor-79dfb7969c-rn424\" (UID: \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" Apr 17 17:03:44.707212 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.707031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-proxy-tls\") pod \"error-404-isvc-6941b-predictor-79dfb7969c-rn424\" (UID: \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" Apr 17 17:03:44.707212 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.707115 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-error-404-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6941b-predictor-79dfb7969c-rn424\" (UID: \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" Apr 17 17:03:44.707212 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:03:44.707137 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-6941b-predictor-serving-cert: secret "error-404-isvc-6941b-predictor-serving-cert" not found Apr 17 17:03:44.707212 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:03:44.707194 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-proxy-tls podName:b3e64c5f-1499-4d3c-b9b1-6a09295446b4 nodeName:}" failed. No retries permitted until 2026-04-17 17:03:45.207175502 +0000 UTC m=+1958.555805030 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-proxy-tls") pod "error-404-isvc-6941b-predictor-79dfb7969c-rn424" (UID: "b3e64c5f-1499-4d3c-b9b1-6a09295446b4") : secret "error-404-isvc-6941b-predictor-serving-cert" not found Apr 17 17:03:44.707884 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.707861 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-error-404-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6941b-predictor-79dfb7969c-rn424\" (UID: \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" Apr 17 17:03:44.717356 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.717299 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qtjz\" (UniqueName: \"kubernetes.io/projected/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-kube-api-access-7qtjz\") pod \"error-404-isvc-6941b-predictor-79dfb7969c-rn424\" (UID: \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" Apr 17 17:03:44.850295 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.850263 2572 generic.go:358] "Generic (PLEG): container finished" podID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerID="014201ec7fa4af3b49226be0b36dd8048f0df79883e1392d0c27932941aa90ee" exitCode=2 Apr 17 17:03:44.850477 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.850326 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" event={"ID":"a7a3aa06-adda-4fdb-b593-939a28b6c765","Type":"ContainerDied","Data":"014201ec7fa4af3b49226be0b36dd8048f0df79883e1392d0c27932941aa90ee"} Apr 17 17:03:44.851757 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.851727 2572 generic.go:358] "Generic (PLEG): container finished" podID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerID="b22e5542aedba72d438d5449937bbb4bcbde38be87c5596c7344d3916dc5a598" exitCode=2 Apr 17 17:03:44.851852 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:44.851804 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" event={"ID":"6b7f5f6d-d978-4a83-8d99-40e024607e2b","Type":"ContainerDied","Data":"b22e5542aedba72d438d5449937bbb4bcbde38be87c5596c7344d3916dc5a598"} Apr 17 17:03:45.110969 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.110881 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b227b01e-aa15-4737-8813-9e1a7021a599-proxy-tls\") pod \"success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz\" (UID: \"b227b01e-aa15-4737-8813-9e1a7021a599\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" Apr 17 17:03:45.113371 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.113344 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b227b01e-aa15-4737-8813-9e1a7021a599-proxy-tls\") pod \"success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz\" (UID: \"b227b01e-aa15-4737-8813-9e1a7021a599\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" Apr 17 17:03:45.211363 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.211330 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-proxy-tls\") pod \"error-404-isvc-6941b-predictor-79dfb7969c-rn424\" (UID: \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" Apr 17 17:03:45.213622 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.213603 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-proxy-tls\") pod \"error-404-isvc-6941b-predictor-79dfb7969c-rn424\" (UID: \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" Apr 17 17:03:45.267324 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.267290 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" Apr 17 17:03:45.363273 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.363184 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" podUID="88b45c30-55a8-411a-9555-1cfce612409c" containerName="ensemble-graph-c68dd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:03:45.375151 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.375034 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" Apr 17 17:03:45.394832 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.394808 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz"] Apr 17 17:03:45.397488 ip-10-0-138-137 kubenswrapper[2572]: W0417 17:03:45.397456 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb227b01e_aa15_4737_8813_9e1a7021a599.slice/crio-aa117f82ce16d580599a3daa08753e2534ccc162d1037c26a077950fd933aa61 WatchSource:0}: Error finding container aa117f82ce16d580599a3daa08753e2534ccc162d1037c26a077950fd933aa61: Status 404 returned error can't find the container with id aa117f82ce16d580599a3daa08753e2534ccc162d1037c26a077950fd933aa61 Apr 17 17:03:45.511265 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.511243 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424"] Apr 17 17:03:45.523058 ip-10-0-138-137 kubenswrapper[2572]: W0417 17:03:45.521869 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3e64c5f_1499_4d3c_b9b1_6a09295446b4.slice/crio-996baae66be7edb90a421bbc743c72ca69dc3b368dc81f409a29bca7ce3d1cd0 WatchSource:0}: Error finding container 996baae66be7edb90a421bbc743c72ca69dc3b368dc81f409a29bca7ce3d1cd0: Status 404 returned error can't find the container with id 996baae66be7edb90a421bbc743c72ca69dc3b368dc81f409a29bca7ce3d1cd0 Apr 17 17:03:45.856722 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.856689 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" event={"ID":"b227b01e-aa15-4737-8813-9e1a7021a599","Type":"ContainerStarted","Data":"9ff03e4315f6e74266e423d7b4fd7079905ca8ac8f98d6535f7e2ad8bb051b04"} Apr 17 17:03:45.856901 ip-10-0-138-137 kubenswrapper[2572]: I0417 
17:03:45.856730 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" event={"ID":"b227b01e-aa15-4737-8813-9e1a7021a599","Type":"ContainerStarted","Data":"f48f16279b3927f812e41930083957d7809d2a0385cb5fc16576a9507a1e9081"} Apr 17 17:03:45.856901 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.856745 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" event={"ID":"b227b01e-aa15-4737-8813-9e1a7021a599","Type":"ContainerStarted","Data":"aa117f82ce16d580599a3daa08753e2534ccc162d1037c26a077950fd933aa61"} Apr 17 17:03:45.856901 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.856847 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" Apr 17 17:03:45.857060 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.856996 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" Apr 17 17:03:45.858384 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.858358 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" event={"ID":"b3e64c5f-1499-4d3c-b9b1-6a09295446b4","Type":"ContainerStarted","Data":"5279e05e3a9f4bb29483cc8522cff1bc6baeb05904ae490a13691bc8611f01b4"} Apr 17 17:03:45.858561 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.858391 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" event={"ID":"b3e64c5f-1499-4d3c-b9b1-6a09295446b4","Type":"ContainerStarted","Data":"7a2f206425c2e16462b405062e023b2e5641643672d73764e06d667bb7da6626"} Apr 17 17:03:45.858561 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.858407 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" event={"ID":"b3e64c5f-1499-4d3c-b9b1-6a09295446b4","Type":"ContainerStarted","Data":"996baae66be7edb90a421bbc743c72ca69dc3b368dc81f409a29bca7ce3d1cd0"} Apr 17 17:03:45.858561 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.858420 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 17 17:03:45.858742 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.858602 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" Apr 17 17:03:45.858742 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.858613 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" Apr 17 17:03:45.859588 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.859566 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 17 17:03:45.874444 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.874395 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" podStartSLOduration=1.874383243 podStartE2EDuration="1.874383243s" podCreationTimestamp="2026-04-17 17:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:03:45.872695198 +0000 UTC m=+1959.221324743" watchObservedRunningTime="2026-04-17 17:03:45.874383243 +0000 UTC m=+1959.223012793" Apr 17 17:03:45.889134 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:45.889047 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" podStartSLOduration=1.8890349990000002 podStartE2EDuration="1.889034999s" podCreationTimestamp="2026-04-17 17:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:03:45.887381409 +0000 UTC m=+1959.236010982" watchObservedRunningTime="2026-04-17 17:03:45.889034999 +0000 UTC m=+1959.237664548" Apr 17 17:03:46.441216 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:46.441178 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" podUID="d0c2def3-ae1a-4e26-9457-c3814b0f6898" containerName="sequence-graph-f6018" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:03:46.779583 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:46.779547 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 17 17:03:46.779778 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:46.779546 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 17 17:03:46.861479 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:46.861437 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 17 17:03:46.861666 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:46.861605 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 17 17:03:47.251380 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:47.251338 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 17 17:03:47.256212 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:47.256188 2572 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 17 17:03:48.100862 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.100839 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 17:03:48.216509 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.216488 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 17:03:48.237613 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.237586 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b7f5f6d-d978-4a83-8d99-40e024607e2b-proxy-tls\") pod \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\" (UID: \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\") " Apr 17 17:03:48.237778 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.237704 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-f6018-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b7f5f6d-d978-4a83-8d99-40e024607e2b-success-200-isvc-f6018-kube-rbac-proxy-sar-config\") pod \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\" (UID: \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\") " Apr 17 17:03:48.237778 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.237735 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4hpr\" (UniqueName: \"kubernetes.io/projected/6b7f5f6d-d978-4a83-8d99-40e024607e2b-kube-api-access-w4hpr\") pod \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\" (UID: \"6b7f5f6d-d978-4a83-8d99-40e024607e2b\") " Apr 17 17:03:48.238071 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.238044 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b7f5f6d-d978-4a83-8d99-40e024607e2b-success-200-isvc-f6018-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-f6018-kube-rbac-proxy-sar-config") pod "6b7f5f6d-d978-4a83-8d99-40e024607e2b" (UID: "6b7f5f6d-d978-4a83-8d99-40e024607e2b"). InnerVolumeSpecName "success-200-isvc-f6018-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:03:48.239708 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.239685 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b7f5f6d-d978-4a83-8d99-40e024607e2b-kube-api-access-w4hpr" (OuterVolumeSpecName: "kube-api-access-w4hpr") pod "6b7f5f6d-d978-4a83-8d99-40e024607e2b" (UID: "6b7f5f6d-d978-4a83-8d99-40e024607e2b"). InnerVolumeSpecName "kube-api-access-w4hpr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:03:48.239822 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.239714 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b7f5f6d-d978-4a83-8d99-40e024607e2b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6b7f5f6d-d978-4a83-8d99-40e024607e2b" (UID: "6b7f5f6d-d978-4a83-8d99-40e024607e2b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:03:48.338322 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.338291 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-f6018-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7a3aa06-adda-4fdb-b593-939a28b6c765-error-404-isvc-f6018-kube-rbac-proxy-sar-config\") pod \"a7a3aa06-adda-4fdb-b593-939a28b6c765\" (UID: \"a7a3aa06-adda-4fdb-b593-939a28b6c765\") " Apr 17 17:03:48.338487 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.338350 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7a3aa06-adda-4fdb-b593-939a28b6c765-proxy-tls\") pod \"a7a3aa06-adda-4fdb-b593-939a28b6c765\" (UID: \"a7a3aa06-adda-4fdb-b593-939a28b6c765\") " Apr 17 17:03:48.338487 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.338378 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr4xl\" (UniqueName: \"kubernetes.io/projected/a7a3aa06-adda-4fdb-b593-939a28b6c765-kube-api-access-wr4xl\") pod \"a7a3aa06-adda-4fdb-b593-939a28b6c765\" (UID: \"a7a3aa06-adda-4fdb-b593-939a28b6c765\") " Apr 17 17:03:48.338564 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.338507 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b7f5f6d-d978-4a83-8d99-40e024607e2b-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:03:48.338564 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.338523 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-f6018-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b7f5f6d-d978-4a83-8d99-40e024607e2b-success-200-isvc-f6018-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:03:48.338564 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.338537 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w4hpr\" (UniqueName: \"kubernetes.io/projected/6b7f5f6d-d978-4a83-8d99-40e024607e2b-kube-api-access-w4hpr\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:03:48.338702 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.338674 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7a3aa06-adda-4fdb-b593-939a28b6c765-error-404-isvc-f6018-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-f6018-kube-rbac-proxy-sar-config") pod "a7a3aa06-adda-4fdb-b593-939a28b6c765" (UID: "a7a3aa06-adda-4fdb-b593-939a28b6c765"). InnerVolumeSpecName "error-404-isvc-f6018-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:03:48.340238 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.340206 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a3aa06-adda-4fdb-b593-939a28b6c765-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a7a3aa06-adda-4fdb-b593-939a28b6c765" (UID: "a7a3aa06-adda-4fdb-b593-939a28b6c765"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:03:48.340339 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.340297 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a3aa06-adda-4fdb-b593-939a28b6c765-kube-api-access-wr4xl" (OuterVolumeSpecName: "kube-api-access-wr4xl") pod "a7a3aa06-adda-4fdb-b593-939a28b6c765" (UID: "a7a3aa06-adda-4fdb-b593-939a28b6c765"). InnerVolumeSpecName "kube-api-access-wr4xl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:03:48.439920 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.439886 2572 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-f6018-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7a3aa06-adda-4fdb-b593-939a28b6c765-error-404-isvc-f6018-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:03:48.439920 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.439916 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7a3aa06-adda-4fdb-b593-939a28b6c765-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:03:48.440096 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.439929 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wr4xl\" (UniqueName: \"kubernetes.io/projected/a7a3aa06-adda-4fdb-b593-939a28b6c765-kube-api-access-wr4xl\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:03:48.772927 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.772895 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" Apr 17 17:03:48.870699 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.870660 2572 generic.go:358] "Generic (PLEG): container finished" podID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerID="3f0a74dcf28c988021de52825c3ed22e12b87421687ec41bb5f5de0aafe2a1cc" exitCode=0 Apr 17 17:03:48.870856 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.870741 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" Apr 17 17:03:48.870856 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.870741 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" event={"ID":"a7a3aa06-adda-4fdb-b593-939a28b6c765","Type":"ContainerDied","Data":"3f0a74dcf28c988021de52825c3ed22e12b87421687ec41bb5f5de0aafe2a1cc"} Apr 17 17:03:48.870856 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.870785 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl" event={"ID":"a7a3aa06-adda-4fdb-b593-939a28b6c765","Type":"ContainerDied","Data":"020a3db0e026f2f1186bc4b1ad53a4dcf96ceb927109b70535b4409dd45b7ef6"} Apr 17 17:03:48.870856 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.870809 2572 scope.go:117] "RemoveContainer" containerID="014201ec7fa4af3b49226be0b36dd8048f0df79883e1392d0c27932941aa90ee" Apr 17 17:03:48.872216 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.872181 2572 generic.go:358] "Generic (PLEG): container finished" podID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerID="a839286cea67278538528b3b212f454a4efee41839f67dac04048077ba8a679d" exitCode=0 Apr 17 17:03:48.872333 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.872221 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" event={"ID":"6b7f5f6d-d978-4a83-8d99-40e024607e2b","Type":"ContainerDied","Data":"a839286cea67278538528b3b212f454a4efee41839f67dac04048077ba8a679d"} Apr 17 17:03:48.872333 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.872254 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" event={"ID":"6b7f5f6d-d978-4a83-8d99-40e024607e2b","Type":"ContainerDied","Data":"172e7de9ce66100ac4847d08827e4e5095a9543041c93ea2f0d781859f6addac"} Apr 17 17:03:48.872333 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.872253 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw" Apr 17 17:03:48.873447 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.873424 2572 generic.go:358] "Generic (PLEG): container finished" podID="88b45c30-55a8-411a-9555-1cfce612409c" containerID="c4fdf06b410647520fcf8ded3a7d4d6b0812aab207fac40eedee91a987383355" exitCode=0 Apr 17 17:03:48.873563 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.873466 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" Apr 17 17:03:48.873563 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.873502 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" event={"ID":"88b45c30-55a8-411a-9555-1cfce612409c","Type":"ContainerDied","Data":"c4fdf06b410647520fcf8ded3a7d4d6b0812aab207fac40eedee91a987383355"} Apr 17 17:03:48.873563 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.873532 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96" event={"ID":"88b45c30-55a8-411a-9555-1cfce612409c","Type":"ContainerDied","Data":"ec4789d98900a954de4199f32f412a8a5845a1d842ed3c4165e99c64a8033564"} Apr 17 17:03:48.879185 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.879169 2572 scope.go:117] "RemoveContainer" containerID="3f0a74dcf28c988021de52825c3ed22e12b87421687ec41bb5f5de0aafe2a1cc" Apr 17 17:03:48.886536 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.886520 2572 scope.go:117] "RemoveContainer" containerID="014201ec7fa4af3b49226be0b36dd8048f0df79883e1392d0c27932941aa90ee" Apr 17 17:03:48.886800 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:03:48.886781 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"014201ec7fa4af3b49226be0b36dd8048f0df79883e1392d0c27932941aa90ee\": container with ID starting with 014201ec7fa4af3b49226be0b36dd8048f0df79883e1392d0c27932941aa90ee not found: ID does not exist" containerID="014201ec7fa4af3b49226be0b36dd8048f0df79883e1392d0c27932941aa90ee" Apr 17 17:03:48.886878 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.886808 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"014201ec7fa4af3b49226be0b36dd8048f0df79883e1392d0c27932941aa90ee"} err="failed to get container status \"014201ec7fa4af3b49226be0b36dd8048f0df79883e1392d0c27932941aa90ee\": rpc error: code = NotFound desc = could not find container \"014201ec7fa4af3b49226be0b36dd8048f0df79883e1392d0c27932941aa90ee\": container with ID starting with 014201ec7fa4af3b49226be0b36dd8048f0df79883e1392d0c27932941aa90ee not found: ID does not exist" Apr 17 17:03:48.886878 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.886824 2572 scope.go:117] "RemoveContainer" containerID="3f0a74dcf28c988021de52825c3ed22e12b87421687ec41bb5f5de0aafe2a1cc" Apr 17 17:03:48.887035 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:03:48.887021 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f0a74dcf28c988021de52825c3ed22e12b87421687ec41bb5f5de0aafe2a1cc\": container with ID starting with 3f0a74dcf28c988021de52825c3ed22e12b87421687ec41bb5f5de0aafe2a1cc not found: ID does not exist" containerID="3f0a74dcf28c988021de52825c3ed22e12b87421687ec41bb5f5de0aafe2a1cc" Apr 17 17:03:48.887098 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.887038 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f0a74dcf28c988021de52825c3ed22e12b87421687ec41bb5f5de0aafe2a1cc"} err="failed to get container status \"3f0a74dcf28c988021de52825c3ed22e12b87421687ec41bb5f5de0aafe2a1cc\": rpc error: code = NotFound desc = could not find container \"3f0a74dcf28c988021de52825c3ed22e12b87421687ec41bb5f5de0aafe2a1cc\": container with ID starting with 3f0a74dcf28c988021de52825c3ed22e12b87421687ec41bb5f5de0aafe2a1cc not found: ID 
does not exist" Apr 17 17:03:48.887098 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.887049 2572 scope.go:117] "RemoveContainer" containerID="b22e5542aedba72d438d5449937bbb4bcbde38be87c5596c7344d3916dc5a598" Apr 17 17:03:48.893461 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.893443 2572 scope.go:117] "RemoveContainer" containerID="a839286cea67278538528b3b212f454a4efee41839f67dac04048077ba8a679d" Apr 17 17:03:48.897391 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.897373 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl"] Apr 17 17:03:48.900999 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.900977 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl"] Apr 17 17:03:48.901477 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.901443 2572 scope.go:117] "RemoveContainer" containerID="b22e5542aedba72d438d5449937bbb4bcbde38be87c5596c7344d3916dc5a598" Apr 17 17:03:48.901733 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:03:48.901714 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b22e5542aedba72d438d5449937bbb4bcbde38be87c5596c7344d3916dc5a598\": container with ID starting with b22e5542aedba72d438d5449937bbb4bcbde38be87c5596c7344d3916dc5a598 not found: ID does not exist" containerID="b22e5542aedba72d438d5449937bbb4bcbde38be87c5596c7344d3916dc5a598" Apr 17 17:03:48.901780 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.901740 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22e5542aedba72d438d5449937bbb4bcbde38be87c5596c7344d3916dc5a598"} err="failed to get container status \"b22e5542aedba72d438d5449937bbb4bcbde38be87c5596c7344d3916dc5a598\": rpc error: code = NotFound desc = could not find container \"b22e5542aedba72d438d5449937bbb4bcbde38be87c5596c7344d3916dc5a598\": container with ID starting with b22e5542aedba72d438d5449937bbb4bcbde38be87c5596c7344d3916dc5a598 not found: ID does not exist" Apr 17 17:03:48.901780 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.901762 2572 scope.go:117] "RemoveContainer" containerID="a839286cea67278538528b3b212f454a4efee41839f67dac04048077ba8a679d" Apr 17 17:03:48.902028 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:03:48.902010 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a839286cea67278538528b3b212f454a4efee41839f67dac04048077ba8a679d\": container with ID starting with a839286cea67278538528b3b212f454a4efee41839f67dac04048077ba8a679d not found: ID does not exist" containerID="a839286cea67278538528b3b212f454a4efee41839f67dac04048077ba8a679d" Apr 17 17:03:48.902077 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.902033 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a839286cea67278538528b3b212f454a4efee41839f67dac04048077ba8a679d"} err="failed to get container status \"a839286cea67278538528b3b212f454a4efee41839f67dac04048077ba8a679d\": rpc error: code = NotFound desc = could not find container \"a839286cea67278538528b3b212f454a4efee41839f67dac04048077ba8a679d\": container with ID starting with a839286cea67278538528b3b212f454a4efee41839f67dac04048077ba8a679d not found: ID does not exist" Apr 17 17:03:48.902077 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.902050 2572 scope.go:117] "RemoveContainer" 
containerID="c4fdf06b410647520fcf8ded3a7d4d6b0812aab207fac40eedee91a987383355" Apr 17 17:03:48.908406 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.908390 2572 scope.go:117] "RemoveContainer" containerID="c4fdf06b410647520fcf8ded3a7d4d6b0812aab207fac40eedee91a987383355" Apr 17 17:03:48.908619 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:03:48.908604 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4fdf06b410647520fcf8ded3a7d4d6b0812aab207fac40eedee91a987383355\": container with ID starting with c4fdf06b410647520fcf8ded3a7d4d6b0812aab207fac40eedee91a987383355 not found: ID does not exist" containerID="c4fdf06b410647520fcf8ded3a7d4d6b0812aab207fac40eedee91a987383355" Apr 17 17:03:48.908702 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.908634 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4fdf06b410647520fcf8ded3a7d4d6b0812aab207fac40eedee91a987383355"} err="failed to get container status \"c4fdf06b410647520fcf8ded3a7d4d6b0812aab207fac40eedee91a987383355\": rpc error: code = NotFound desc = could not find container \"c4fdf06b410647520fcf8ded3a7d4d6b0812aab207fac40eedee91a987383355\": container with ID starting with c4fdf06b410647520fcf8ded3a7d4d6b0812aab207fac40eedee91a987383355 not found: ID does not exist" Apr 17 17:03:48.911353 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.911333 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw"] Apr 17 17:03:48.915180 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.915163 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw"] Apr 17 17:03:48.943908 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.943887 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88b45c30-55a8-411a-9555-1cfce612409c-openshift-service-ca-bundle\") pod \"88b45c30-55a8-411a-9555-1cfce612409c\" (UID: \"88b45c30-55a8-411a-9555-1cfce612409c\") " Apr 17 17:03:48.944005 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.943967 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88b45c30-55a8-411a-9555-1cfce612409c-proxy-tls\") pod \"88b45c30-55a8-411a-9555-1cfce612409c\" (UID: \"88b45c30-55a8-411a-9555-1cfce612409c\") " Apr 17 17:03:48.944262 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.944240 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b45c30-55a8-411a-9555-1cfce612409c-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "88b45c30-55a8-411a-9555-1cfce612409c" (UID: "88b45c30-55a8-411a-9555-1cfce612409c"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:03:48.945968 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:48.945946 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b45c30-55a8-411a-9555-1cfce612409c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "88b45c30-55a8-411a-9555-1cfce612409c" (UID: "88b45c30-55a8-411a-9555-1cfce612409c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:03:49.044791 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:49.044758 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88b45c30-55a8-411a-9555-1cfce612409c-openshift-service-ca-bundle\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:03:49.044791 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:49.044789 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88b45c30-55a8-411a-9555-1cfce612409c-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:03:49.194092 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:49.194061 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96"] Apr 17 17:03:49.197838 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:49.197815 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96"] Apr 17 17:03:49.227254 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:49.227211 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" path="/var/lib/kubelet/pods/6b7f5f6d-d978-4a83-8d99-40e024607e2b/volumes" Apr 17 17:03:49.227822 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:49.227796 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88b45c30-55a8-411a-9555-1cfce612409c" path="/var/lib/kubelet/pods/88b45c30-55a8-411a-9555-1cfce612409c/volumes" Apr 17 17:03:49.228317 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:49.228294 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" path="/var/lib/kubelet/pods/a7a3aa06-adda-4fdb-b593-939a28b6c765/volumes" Apr 17 17:03:51.440972 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:51.440930 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" podUID="d0c2def3-ae1a-4e26-9457-c3814b0f6898" containerName="sequence-graph-f6018" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:03:51.866980 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:51.866951 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" Apr 17 17:03:51.867184 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:51.867153 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" Apr 17 17:03:51.867548 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:51.867519 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 17 17:03:51.867548 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:51.867529 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 17 17:03:56.441435 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:56.441343 2572 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" podUID="d0c2def3-ae1a-4e26-9457-c3814b0f6898" containerName="sequence-graph-f6018" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:03:56.441870 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:56.441491 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" Apr 17 17:03:56.779154 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:56.779117 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 17 17:03:56.779393 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:03:56.779369 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 17 17:04:01.441569 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:01.441535 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" podUID="d0c2def3-ae1a-4e26-9457-c3814b0f6898" containerName="sequence-graph-f6018" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:04:01.868267 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:01.868227 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 17 17:04:01.868464 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:01.868224 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 17 17:04:06.441584 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:06.441547 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" podUID="d0c2def3-ae1a-4e26-9457-c3814b0f6898" containerName="sequence-graph-f6018" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:04:06.780598 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:06.780565 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" Apr 17 17:04:06.780805 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:06.780625 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" Apr 17 17:04:11.441158 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:11.441117 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" podUID="d0c2def3-ae1a-4e26-9457-c3814b0f6898" containerName="sequence-graph-f6018" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:04:11.867714 
ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:11.867643 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 17 17:04:11.867906 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:11.867643 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 17 17:04:14.334245 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.334220 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" Apr 17 17:04:14.356375 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.356350 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0c2def3-ae1a-4e26-9457-c3814b0f6898-openshift-service-ca-bundle\") pod \"d0c2def3-ae1a-4e26-9457-c3814b0f6898\" (UID: \"d0c2def3-ae1a-4e26-9457-c3814b0f6898\") " Apr 17 17:04:14.356515 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.356433 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0c2def3-ae1a-4e26-9457-c3814b0f6898-proxy-tls\") pod \"d0c2def3-ae1a-4e26-9457-c3814b0f6898\" (UID: \"d0c2def3-ae1a-4e26-9457-c3814b0f6898\") " Apr 17 17:04:14.356706 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.356683 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0c2def3-ae1a-4e26-9457-c3814b0f6898-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d0c2def3-ae1a-4e26-9457-c3814b0f6898" (UID: "d0c2def3-ae1a-4e26-9457-c3814b0f6898"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:04:14.358358 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.358332 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c2def3-ae1a-4e26-9457-c3814b0f6898-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d0c2def3-ae1a-4e26-9457-c3814b0f6898" (UID: "d0c2def3-ae1a-4e26-9457-c3814b0f6898"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:04:14.457206 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.457127 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0c2def3-ae1a-4e26-9457-c3814b0f6898-openshift-service-ca-bundle\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:04:14.457206 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.457158 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0c2def3-ae1a-4e26-9457-c3814b0f6898-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:04:14.959387 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.959353 2572 generic.go:358] "Generic (PLEG): container finished" podID="d0c2def3-ae1a-4e26-9457-c3814b0f6898" containerID="146ef7983643dbe8e244c06b5d293d9c4b899fae403f5a18d5de8981a6e18407" exitCode=0 Apr 17 17:04:14.959588 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.959405 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" event={"ID":"d0c2def3-ae1a-4e26-9457-c3814b0f6898","Type":"ContainerDied","Data":"146ef7983643dbe8e244c06b5d293d9c4b899fae403f5a18d5de8981a6e18407"} Apr 17 17:04:14.959588 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.959414 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" Apr 17 17:04:14.959588 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.959441 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg" event={"ID":"d0c2def3-ae1a-4e26-9457-c3814b0f6898","Type":"ContainerDied","Data":"d9a7463b128f6cb702dd3e440b7eba4f9e6ccd38ef0851631451d71f40dfcdc4"} Apr 17 17:04:14.959588 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.959460 2572 scope.go:117] "RemoveContainer" containerID="146ef7983643dbe8e244c06b5d293d9c4b899fae403f5a18d5de8981a6e18407" Apr 17 17:04:14.967261 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.967240 2572 scope.go:117] "RemoveContainer" containerID="146ef7983643dbe8e244c06b5d293d9c4b899fae403f5a18d5de8981a6e18407" Apr 17 17:04:14.967628 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:04:14.967491 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"146ef7983643dbe8e244c06b5d293d9c4b899fae403f5a18d5de8981a6e18407\": container with ID starting with 146ef7983643dbe8e244c06b5d293d9c4b899fae403f5a18d5de8981a6e18407 not found: ID does not exist" containerID="146ef7983643dbe8e244c06b5d293d9c4b899fae403f5a18d5de8981a6e18407" Apr 17 17:04:14.967628 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.967526 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"146ef7983643dbe8e244c06b5d293d9c4b899fae403f5a18d5de8981a6e18407"} err="failed to get container status \"146ef7983643dbe8e244c06b5d293d9c4b899fae403f5a18d5de8981a6e18407\": rpc error: code = NotFound desc = could not find container \"146ef7983643dbe8e244c06b5d293d9c4b899fae403f5a18d5de8981a6e18407\": container with ID starting with 146ef7983643dbe8e244c06b5d293d9c4b899fae403f5a18d5de8981a6e18407 not found: ID does not exist" Apr 17 17:04:14.978804 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.978784 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg"] Apr 17 17:04:14.984446 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:14.984428 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg"] Apr 17 17:04:15.227429 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:15.227331 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0c2def3-ae1a-4e26-9457-c3814b0f6898" path="/var/lib/kubelet/pods/d0c2def3-ae1a-4e26-9457-c3814b0f6898/volumes" Apr 17 17:04:18.839922 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.839891 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5"] Apr 17 17:04:18.840328 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840233 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerName="kserve-container" Apr 17 17:04:18.840328 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840244 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerName="kserve-container" Apr 17 17:04:18.840328 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840256 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerName="kube-rbac-proxy" Apr 17 17:04:18.840328 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840262 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerName="kube-rbac-proxy" Apr 17 17:04:18.840328 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840268 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0c2def3-ae1a-4e26-9457-c3814b0f6898" containerName="sequence-graph-f6018" Apr 17 17:04:18.840328 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840274 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c2def3-ae1a-4e26-9457-c3814b0f6898" containerName="sequence-graph-f6018" Apr 17 17:04:18.840328 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840281 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kube-rbac-proxy" Apr 17 17:04:18.840328 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840286 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kube-rbac-proxy" Apr 17 17:04:18.840328 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840297 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kserve-container" Apr 17 17:04:18.840328 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840303 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kserve-container" Apr 17 17:04:18.840328 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840311 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88b45c30-55a8-411a-9555-1cfce612409c" containerName="ensemble-graph-c68dd" Apr 17 17:04:18.840328 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840316 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b45c30-55a8-411a-9555-1cfce612409c" containerName="ensemble-graph-c68dd" Apr 17 17:04:18.840798 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840361 2572 
memory_manager.go:356] "RemoveStaleState removing state" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kserve-container" Apr 17 17:04:18.840798 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840368 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerName="kserve-container" Apr 17 17:04:18.840798 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840375 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0c2def3-ae1a-4e26-9457-c3814b0f6898" containerName="sequence-graph-f6018" Apr 17 17:04:18.840798 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840381 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b7f5f6d-d978-4a83-8d99-40e024607e2b" containerName="kube-rbac-proxy" Apr 17 17:04:18.840798 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840388 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7a3aa06-adda-4fdb-b593-939a28b6c765" containerName="kube-rbac-proxy" Apr 17 17:04:18.840798 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.840393 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="88b45c30-55a8-411a-9555-1cfce612409c" containerName="ensemble-graph-c68dd" Apr 17 17:04:18.844401 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.844385 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" Apr 17 17:04:18.846840 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.846782 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-41c4d-kube-rbac-proxy-sar-config\"" Apr 17 17:04:18.846840 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.846834 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-41c4d-serving-cert\"" Apr 17 17:04:18.851693 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.851671 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5"] Apr 17 17:04:18.891971 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.891940 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-openshift-service-ca-bundle\") pod \"splitter-graph-41c4d-654b49775c-n9pd5\" (UID: \"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d\") " pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" Apr 17 17:04:18.892137 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.892046 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-proxy-tls\") pod \"splitter-graph-41c4d-654b49775c-n9pd5\" (UID: \"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d\") " pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" Apr 17 17:04:18.993361 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.993331 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-proxy-tls\") pod \"splitter-graph-41c4d-654b49775c-n9pd5\" (UID: \"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d\") " pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" Apr 17 17:04:18.993520 ip-10-0-138-137 kubenswrapper[2572]: I0417 
17:04:18.993375 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-openshift-service-ca-bundle\") pod \"splitter-graph-41c4d-654b49775c-n9pd5\" (UID: \"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d\") " pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" Apr 17 17:04:18.993520 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:04:18.993471 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-41c4d-serving-cert: secret "splitter-graph-41c4d-serving-cert" not found Apr 17 17:04:18.993624 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:04:18.993533 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-proxy-tls podName:87459e4c-4eb7-49f2-b1c7-9c3653c7d39d nodeName:}" failed. No retries permitted until 2026-04-17 17:04:19.49351611 +0000 UTC m=+1992.842145637 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-proxy-tls") pod "splitter-graph-41c4d-654b49775c-n9pd5" (UID: "87459e4c-4eb7-49f2-b1c7-9c3653c7d39d") : secret "splitter-graph-41c4d-serving-cert" not found Apr 17 17:04:18.993951 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:18.993933 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-openshift-service-ca-bundle\") pod \"splitter-graph-41c4d-654b49775c-n9pd5\" (UID: \"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d\") " pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" Apr 17 17:04:19.497946 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:19.497911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-proxy-tls\") pod \"splitter-graph-41c4d-654b49775c-n9pd5\" (UID: \"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d\") " pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" Apr 17 17:04:19.500411 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:19.500379 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-proxy-tls\") pod \"splitter-graph-41c4d-654b49775c-n9pd5\" (UID: \"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d\") " pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" Apr 17 17:04:19.755014 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:19.754926 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" Apr 17 17:04:19.875217 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:19.875078 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5"] Apr 17 17:04:19.877945 ip-10-0-138-137 kubenswrapper[2572]: W0417 17:04:19.877911 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87459e4c_4eb7_49f2_b1c7_9c3653c7d39d.slice/crio-8dd09b666def5e13801e95fd40194573cb0763a7b7dc810b811981b912eaa33a WatchSource:0}: Error finding container 8dd09b666def5e13801e95fd40194573cb0763a7b7dc810b811981b912eaa33a: Status 404 returned error can't find the container with id 8dd09b666def5e13801e95fd40194573cb0763a7b7dc810b811981b912eaa33a Apr 17 17:04:19.976824 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:19.976791 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" event={"ID":"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d","Type":"ContainerStarted","Data":"7dc49050815806058a3580b4286b04349ee4bd3b67850635e5cf4d1ff7f42d71"} Apr 17 17:04:19.976974 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:19.976831 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" event={"ID":"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d","Type":"ContainerStarted","Data":"8dd09b666def5e13801e95fd40194573cb0763a7b7dc810b811981b912eaa33a"} Apr 17 17:04:19.976974 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:19.976863 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" Apr 17 17:04:19.992772 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:19.992726 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" podStartSLOduration=1.992712708 podStartE2EDuration="1.992712708s" podCreationTimestamp="2026-04-17 17:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:04:19.991426582 +0000 UTC m=+1993.340056130" watchObservedRunningTime="2026-04-17 17:04:19.992712708 +0000 UTC m=+1993.341342288" Apr 17 17:04:21.867584 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:21.867548 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 17 17:04:21.868017 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:21.867560 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 17 17:04:25.985727 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:25.985698 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" Apr 17 17:04:28.917364 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:28.917287 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5"] Apr 17 17:04:28.917778 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:28.917478 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" podUID="87459e4c-4eb7-49f2-b1c7-9c3653c7d39d" containerName="splitter-graph-41c4d" containerID="cri-o://7dc49050815806058a3580b4286b04349ee4bd3b67850635e5cf4d1ff7f42d71" gracePeriod=30 Apr 17 17:04:29.018552 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.018521 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp"] Apr 17 17:04:29.018910 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.018880 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kserve-container" containerID="cri-o://2dbc759f67c0cef691897693a8bd8b6fb3d3748b6177ab25dc3273494df34727" gracePeriod=30 Apr 17 17:04:29.019153 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.018937 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kube-rbac-proxy" containerID="cri-o://c65a723acc82a71085d4727ab22ae5963a034328d9b64087f6d4eab56b49dcda" gracePeriod=30 Apr 17 17:04:29.066001 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.065972 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5"] Apr 17 17:04:29.070675 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.070638 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:04:29.074170 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.074140 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-7afa6-predictor-serving-cert\"" Apr 17 17:04:29.074735 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.074718 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-7afa6-kube-rbac-proxy-sar-config\"" Apr 17 17:04:29.082612 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.082583 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5"] Apr 17 17:04:29.084128 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.084106 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb"] Apr 17 17:04:29.084354 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.084335 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kserve-container" containerID="cri-o://7699ca43b73cb155734938667d29f1b0ee60132c72fa4fd3e09a35e704cb7636" gracePeriod=30 Apr 17 17:04:29.084450 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.084423 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kube-rbac-proxy" containerID="cri-o://a53a6e33a3f61fb2825bddeddcb6fe57603b6e9304b4519869aa6f1dd1da8e8a" gracePeriod=30 Apr 17 17:04:29.176732 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.176635 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-proxy-tls\") pod \"success-200-isvc-7afa6-predictor-7549c68964-lvtm5\" (UID: \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\") " pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:04:29.176887 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.176736 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-7afa6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-success-200-isvc-7afa6-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7afa6-predictor-7549c68964-lvtm5\" (UID: \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\") " pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:04:29.176887 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.176838 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmxtp\" (UniqueName: \"kubernetes.io/projected/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-kube-api-access-nmxtp\") pod \"success-200-isvc-7afa6-predictor-7549c68964-lvtm5\" (UID: \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\") " pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:04:29.181592 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.181563 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82"] Apr 17 17:04:29.184853 
ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.184835 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:04:29.187768 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.187524 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-7afa6-predictor-serving-cert\"" Apr 17 17:04:29.187768 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.187571 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-7afa6-kube-rbac-proxy-sar-config\"" Apr 17 17:04:29.198268 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.198242 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82"] Apr 17 17:04:29.278104 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.278069 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-proxy-tls\") pod \"success-200-isvc-7afa6-predictor-7549c68964-lvtm5\" (UID: \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\") " pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:04:29.278294 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.278121 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjlzq\" (UniqueName: \"kubernetes.io/projected/16af9bfe-cb61-44e9-bb37-385d3eefa146-kube-api-access-sjlzq\") pod \"error-404-isvc-7afa6-predictor-5f65ffc855-lhn82\" (UID: \"16af9bfe-cb61-44e9-bb37-385d3eefa146\") " pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:04:29.278294 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.278150 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-7afa6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-success-200-isvc-7afa6-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7afa6-predictor-7549c68964-lvtm5\" (UID: \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\") " pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:04:29.278294 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.278192 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-7afa6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16af9bfe-cb61-44e9-bb37-385d3eefa146-error-404-isvc-7afa6-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7afa6-predictor-5f65ffc855-lhn82\" (UID: \"16af9bfe-cb61-44e9-bb37-385d3eefa146\") " pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:04:29.278294 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:04:29.278227 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-serving-cert: secret "success-200-isvc-7afa6-predictor-serving-cert" not found Apr 17 17:04:29.278294 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.278265 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16af9bfe-cb61-44e9-bb37-385d3eefa146-proxy-tls\") pod \"error-404-isvc-7afa6-predictor-5f65ffc855-lhn82\" (UID: \"16af9bfe-cb61-44e9-bb37-385d3eefa146\") " 
pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:04:29.278294 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:04:29.278293 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-proxy-tls podName:0086f8ec-cd25-4666-a36c-d27f5ee1ca4f nodeName:}" failed. No retries permitted until 2026-04-17 17:04:29.778273387 +0000 UTC m=+2003.126902920 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-proxy-tls") pod "success-200-isvc-7afa6-predictor-7549c68964-lvtm5" (UID: "0086f8ec-cd25-4666-a36c-d27f5ee1ca4f") : secret "success-200-isvc-7afa6-predictor-serving-cert" not found Apr 17 17:04:29.278573 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.278358 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmxtp\" (UniqueName: \"kubernetes.io/projected/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-kube-api-access-nmxtp\") pod \"success-200-isvc-7afa6-predictor-7549c68964-lvtm5\" (UID: \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\") " pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:04:29.278907 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.278887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-7afa6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-success-200-isvc-7afa6-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7afa6-predictor-7549c68964-lvtm5\" (UID: \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\") " pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:04:29.288623 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.288599 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmxtp\" (UniqueName: \"kubernetes.io/projected/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-kube-api-access-nmxtp\") pod \"success-200-isvc-7afa6-predictor-7549c68964-lvtm5\" (UID: \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\") " pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:04:29.381929 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.380228 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjlzq\" (UniqueName: \"kubernetes.io/projected/16af9bfe-cb61-44e9-bb37-385d3eefa146-kube-api-access-sjlzq\") pod \"error-404-isvc-7afa6-predictor-5f65ffc855-lhn82\" (UID: \"16af9bfe-cb61-44e9-bb37-385d3eefa146\") " pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:04:29.381929 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.380744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-7afa6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16af9bfe-cb61-44e9-bb37-385d3eefa146-error-404-isvc-7afa6-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7afa6-predictor-5f65ffc855-lhn82\" (UID: \"16af9bfe-cb61-44e9-bb37-385d3eefa146\") " pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:04:29.381929 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.380798 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16af9bfe-cb61-44e9-bb37-385d3eefa146-proxy-tls\") pod \"error-404-isvc-7afa6-predictor-5f65ffc855-lhn82\" (UID: 
\"16af9bfe-cb61-44e9-bb37-385d3eefa146\") " pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:04:29.381929 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:04:29.380986 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-serving-cert: secret "error-404-isvc-7afa6-predictor-serving-cert" not found Apr 17 17:04:29.381929 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:04:29.381064 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16af9bfe-cb61-44e9-bb37-385d3eefa146-proxy-tls podName:16af9bfe-cb61-44e9-bb37-385d3eefa146 nodeName:}" failed. No retries permitted until 2026-04-17 17:04:29.881042482 +0000 UTC m=+2003.229672036 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/16af9bfe-cb61-44e9-bb37-385d3eefa146-proxy-tls") pod "error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" (UID: "16af9bfe-cb61-44e9-bb37-385d3eefa146") : secret "error-404-isvc-7afa6-predictor-serving-cert" not found Apr 17 17:04:29.382289 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.382068 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-7afa6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16af9bfe-cb61-44e9-bb37-385d3eefa146-error-404-isvc-7afa6-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7afa6-predictor-5f65ffc855-lhn82\" (UID: \"16af9bfe-cb61-44e9-bb37-385d3eefa146\") " pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:04:29.388458 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.388433 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjlzq\" (UniqueName: \"kubernetes.io/projected/16af9bfe-cb61-44e9-bb37-385d3eefa146-kube-api-access-sjlzq\") pod \"error-404-isvc-7afa6-predictor-5f65ffc855-lhn82\" (UID: \"16af9bfe-cb61-44e9-bb37-385d3eefa146\") " pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:04:29.784367 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.784331 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-proxy-tls\") pod \"success-200-isvc-7afa6-predictor-7549c68964-lvtm5\" (UID: \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\") " pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:04:29.786557 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.786529 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-proxy-tls\") pod \"success-200-isvc-7afa6-predictor-7549c68964-lvtm5\" (UID: \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\") " pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:04:29.885623 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.885593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16af9bfe-cb61-44e9-bb37-385d3eefa146-proxy-tls\") pod \"error-404-isvc-7afa6-predictor-5f65ffc855-lhn82\" (UID: \"16af9bfe-cb61-44e9-bb37-385d3eefa146\") " pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:04:29.887965 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.887941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/16af9bfe-cb61-44e9-bb37-385d3eefa146-proxy-tls\") pod \"error-404-isvc-7afa6-predictor-5f65ffc855-lhn82\" (UID: \"16af9bfe-cb61-44e9-bb37-385d3eefa146\") " pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:04:29.980602 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:29.980558 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:04:30.005197 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:30.005165 2572 generic.go:358] "Generic (PLEG): container finished" podID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerID="c65a723acc82a71085d4727ab22ae5963a034328d9b64087f6d4eab56b49dcda" exitCode=2 Apr 17 17:04:30.005344 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:30.005229 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" event={"ID":"286f4108-4887-46df-87ae-8c5cdb92a6e6","Type":"ContainerDied","Data":"c65a723acc82a71085d4727ab22ae5963a034328d9b64087f6d4eab56b49dcda"} Apr 17 17:04:30.006811 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:30.006786 2572 generic.go:358] "Generic (PLEG): container finished" podID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerID="a53a6e33a3f61fb2825bddeddcb6fe57603b6e9304b4519869aa6f1dd1da8e8a" exitCode=2 Apr 17 17:04:30.006944 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:30.006821 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" event={"ID":"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb","Type":"ContainerDied","Data":"a53a6e33a3f61fb2825bddeddcb6fe57603b6e9304b4519869aa6f1dd1da8e8a"} Apr 17 17:04:30.098161 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:30.098128 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:04:30.101911 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:30.101849 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5"] Apr 17 17:04:30.104901 ip-10-0-138-137 kubenswrapper[2572]: W0417 17:04:30.104876 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0086f8ec_cd25_4666_a36c_d27f5ee1ca4f.slice/crio-0eeb4e1eb401ac590003cd1682c8fe9b6453ff75d72256c18b117fe428a5de4a WatchSource:0}: Error finding container 0eeb4e1eb401ac590003cd1682c8fe9b6453ff75d72256c18b117fe428a5de4a: Status 404 returned error can't find the container with id 0eeb4e1eb401ac590003cd1682c8fe9b6453ff75d72256c18b117fe428a5de4a Apr 17 17:04:30.225584 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:30.225551 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82"] Apr 17 17:04:30.229973 ip-10-0-138-137 kubenswrapper[2572]: W0417 17:04:30.229948 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16af9bfe_cb61_44e9_bb37_385d3eefa146.slice/crio-50468f72ecdd551c3d95ff9d075dcecca69341656fa66d09d4df87b80b66acfa WatchSource:0}: Error finding container 50468f72ecdd551c3d95ff9d075dcecca69341656fa66d09d4df87b80b66acfa: Status 404 returned error can't find the container with id 50468f72ecdd551c3d95ff9d075dcecca69341656fa66d09d4df87b80b66acfa Apr 17 17:04:30.983734 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:30.983697 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" podUID="87459e4c-4eb7-49f2-b1c7-9c3653c7d39d" containerName="splitter-graph-41c4d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:04:31.011484 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.011440 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" event={"ID":"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f","Type":"ContainerStarted","Data":"67e6a5ab8d355003d9f0733b418260408352ca30445add11b843e58efd808a93"} Apr 17 17:04:31.011694 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.011622 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:04:31.011847 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.011829 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:04:31.011938 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.011855 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" event={"ID":"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f","Type":"ContainerStarted","Data":"ff79004acc7db6c3c3a5fa7da8ed5904b2674567f1a2cd9794a7046123d02c13"} Apr 17 17:04:31.011938 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.011871 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" event={"ID":"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f","Type":"ContainerStarted","Data":"0eeb4e1eb401ac590003cd1682c8fe9b6453ff75d72256c18b117fe428a5de4a"} 
Apr 17 17:04:31.013275 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.013247 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" event={"ID":"16af9bfe-cb61-44e9-bb37-385d3eefa146","Type":"ContainerStarted","Data":"b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28"}
Apr 17 17:04:31.013379 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.013280 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" event={"ID":"16af9bfe-cb61-44e9-bb37-385d3eefa146","Type":"ContainerStarted","Data":"2a49be564c8721579c57f534e410037d3dd1eb76db319b2d34ce4643b2f6396c"}
Apr 17 17:04:31.013379 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.013295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" event={"ID":"16af9bfe-cb61-44e9-bb37-385d3eefa146","Type":"ContainerStarted","Data":"50468f72ecdd551c3d95ff9d075dcecca69341656fa66d09d4df87b80b66acfa"}
Apr 17 17:04:31.013509 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.013392 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82"
Apr 17 17:04:31.013509 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.013417 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 17 17:04:31.033445 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.033393 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" podStartSLOduration=2.033377558 podStartE2EDuration="2.033377558s" podCreationTimestamp="2026-04-17 17:04:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:04:31.029980698 +0000 UTC m=+2004.378610248" watchObservedRunningTime="2026-04-17 17:04:31.033377558 +0000 UTC m=+2004.382007101"
Apr 17 17:04:31.048363 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.048307 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" podStartSLOduration=2.048288055 podStartE2EDuration="2.048288055s" podCreationTimestamp="2026-04-17 17:04:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:04:31.047814221 +0000 UTC m=+2004.396453324" watchObservedRunningTime="2026-04-17 17:04:31.048288055 +0000 UTC m=+2004.396917618"
Apr 17 17:04:31.774410 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.774367 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.43:8643/healthz\": dial tcp 10.132.0.43:8643: connect: connection refused"
Apr 17 17:04:31.774588 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.774370 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.44:8643/healthz\": dial tcp 10.132.0.44:8643: connect: connection refused"
Apr 17 17:04:31.868011 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.867976 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 17 17:04:31.868176 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:31.867981 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 17 17:04:32.016980 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:32.016941 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82"
Apr 17 17:04:32.017436 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:32.016999 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 17 17:04:32.018157 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:32.018132 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 17:04:32.642618 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:32.642596 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb"
Apr 17 17:04:32.707045 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:32.707022 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-proxy-tls\") pod \"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\" (UID: \"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\") "
Apr 17 17:04:32.707170 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:32.707107 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-41c4d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-error-404-isvc-41c4d-kube-rbac-proxy-sar-config\") pod \"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\" (UID: \"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\") "
Apr 17 17:04:32.707170 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:32.707155 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2btb\" (UniqueName: \"kubernetes.io/projected/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-kube-api-access-g2btb\") pod \"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\" (UID: \"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb\") "
Apr 17 17:04:32.707500 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:32.707472 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-error-404-isvc-41c4d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-41c4d-kube-rbac-proxy-sar-config") pod "d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" (UID: "d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb"). InnerVolumeSpecName "error-404-isvc-41c4d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:04:32.709222 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:32.709199 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" (UID: "d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:04:32.709293 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:32.709272 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-kube-api-access-g2btb" (OuterVolumeSpecName: "kube-api-access-g2btb") pod "d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" (UID: "d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb"). InnerVolumeSpecName "kube-api-access-g2btb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:04:32.808078 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:32.808044 2572 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-41c4d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-error-404-isvc-41c4d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 17:04:32.808078 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:32.808075 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g2btb\" (UniqueName: \"kubernetes.io/projected/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-kube-api-access-g2btb\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 17:04:32.808078 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:32.808087 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 17:04:33.021584 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.021488 2572 generic.go:358] "Generic (PLEG): container finished" podID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerID="7699ca43b73cb155734938667d29f1b0ee60132c72fa4fd3e09a35e704cb7636" exitCode=0
Apr 17 17:04:33.022053 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.021588 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb"
Apr 17 17:04:33.022053 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.021605 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" event={"ID":"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb","Type":"ContainerDied","Data":"7699ca43b73cb155734938667d29f1b0ee60132c72fa4fd3e09a35e704cb7636"}
Apr 17 17:04:33.022053 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.021677 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb" event={"ID":"d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb","Type":"ContainerDied","Data":"910f64eea9f4cb5a7b84dd6689410f20645f07d3422adfaba60a24d2496a8160"}
Apr 17 17:04:33.022053 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.021704 2572 scope.go:117] "RemoveContainer" containerID="a53a6e33a3f61fb2825bddeddcb6fe57603b6e9304b4519869aa6f1dd1da8e8a"
Apr 17 17:04:33.022366 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.022341 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 17:04:33.032506 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.032482 2572 scope.go:117] "RemoveContainer" containerID="7699ca43b73cb155734938667d29f1b0ee60132c72fa4fd3e09a35e704cb7636"
Apr 17 17:04:33.039414 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.039374 2572 scope.go:117] "RemoveContainer" containerID="a53a6e33a3f61fb2825bddeddcb6fe57603b6e9304b4519869aa6f1dd1da8e8a"
Apr 17 17:04:33.039626 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:04:33.039596 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a53a6e33a3f61fb2825bddeddcb6fe57603b6e9304b4519869aa6f1dd1da8e8a\": container with ID starting with a53a6e33a3f61fb2825bddeddcb6fe57603b6e9304b4519869aa6f1dd1da8e8a not found: ID does not exist" containerID="a53a6e33a3f61fb2825bddeddcb6fe57603b6e9304b4519869aa6f1dd1da8e8a"
Apr 17 17:04:33.039688 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.039617 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53a6e33a3f61fb2825bddeddcb6fe57603b6e9304b4519869aa6f1dd1da8e8a"} err="failed to get container status \"a53a6e33a3f61fb2825bddeddcb6fe57603b6e9304b4519869aa6f1dd1da8e8a\": rpc error: code = NotFound desc = could not find container \"a53a6e33a3f61fb2825bddeddcb6fe57603b6e9304b4519869aa6f1dd1da8e8a\": container with ID starting with a53a6e33a3f61fb2825bddeddcb6fe57603b6e9304b4519869aa6f1dd1da8e8a not found: ID does not exist"
Apr 17 17:04:33.039688 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.039634 2572 scope.go:117] "RemoveContainer" containerID="7699ca43b73cb155734938667d29f1b0ee60132c72fa4fd3e09a35e704cb7636"
Apr 17 17:04:33.039901 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:04:33.039883 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7699ca43b73cb155734938667d29f1b0ee60132c72fa4fd3e09a35e704cb7636\": container with ID starting with 7699ca43b73cb155734938667d29f1b0ee60132c72fa4fd3e09a35e704cb7636 not found: ID does not exist" containerID="7699ca43b73cb155734938667d29f1b0ee60132c72fa4fd3e09a35e704cb7636"
Apr 17 17:04:33.039943 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.039909 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7699ca43b73cb155734938667d29f1b0ee60132c72fa4fd3e09a35e704cb7636"} err="failed to get container status \"7699ca43b73cb155734938667d29f1b0ee60132c72fa4fd3e09a35e704cb7636\": rpc error: code = NotFound desc = could not find container \"7699ca43b73cb155734938667d29f1b0ee60132c72fa4fd3e09a35e704cb7636\": container with ID starting with 7699ca43b73cb155734938667d29f1b0ee60132c72fa4fd3e09a35e704cb7636 not found: ID does not exist"
Apr 17 17:04:33.046914 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.046892 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb"]
Apr 17 17:04:33.050579 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.050556 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb"]
Apr 17 17:04:33.157004 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.156977 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp"
Apr 17 17:04:33.211927 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.211896 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-41c4d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/286f4108-4887-46df-87ae-8c5cdb92a6e6-success-200-isvc-41c4d-kube-rbac-proxy-sar-config\") pod \"286f4108-4887-46df-87ae-8c5cdb92a6e6\" (UID: \"286f4108-4887-46df-87ae-8c5cdb92a6e6\") "
Apr 17 17:04:33.212087 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.211951 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/286f4108-4887-46df-87ae-8c5cdb92a6e6-proxy-tls\") pod \"286f4108-4887-46df-87ae-8c5cdb92a6e6\" (UID: \"286f4108-4887-46df-87ae-8c5cdb92a6e6\") "
Apr 17 17:04:33.212087 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.211997 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcxnd\" (UniqueName: \"kubernetes.io/projected/286f4108-4887-46df-87ae-8c5cdb92a6e6-kube-api-access-kcxnd\") pod \"286f4108-4887-46df-87ae-8c5cdb92a6e6\" (UID: \"286f4108-4887-46df-87ae-8c5cdb92a6e6\") "
Apr 17 17:04:33.212283 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.212259 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/286f4108-4887-46df-87ae-8c5cdb92a6e6-success-200-isvc-41c4d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-41c4d-kube-rbac-proxy-sar-config") pod "286f4108-4887-46df-87ae-8c5cdb92a6e6" (UID: "286f4108-4887-46df-87ae-8c5cdb92a6e6"). InnerVolumeSpecName "success-200-isvc-41c4d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:04:33.214051 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.214028 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/286f4108-4887-46df-87ae-8c5cdb92a6e6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "286f4108-4887-46df-87ae-8c5cdb92a6e6" (UID: "286f4108-4887-46df-87ae-8c5cdb92a6e6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:04:33.214120 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.214105 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/286f4108-4887-46df-87ae-8c5cdb92a6e6-kube-api-access-kcxnd" (OuterVolumeSpecName: "kube-api-access-kcxnd") pod "286f4108-4887-46df-87ae-8c5cdb92a6e6" (UID: "286f4108-4887-46df-87ae-8c5cdb92a6e6"). InnerVolumeSpecName "kube-api-access-kcxnd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:04:33.228955 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.228924 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" path="/var/lib/kubelet/pods/d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb/volumes"
Apr 17 17:04:33.313027 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.312963 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/286f4108-4887-46df-87ae-8c5cdb92a6e6-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 17:04:33.313027 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.312989 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kcxnd\" (UniqueName: \"kubernetes.io/projected/286f4108-4887-46df-87ae-8c5cdb92a6e6-kube-api-access-kcxnd\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 17:04:33.313027 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:33.313007 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-41c4d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/286f4108-4887-46df-87ae-8c5cdb92a6e6-success-200-isvc-41c4d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\""
Apr 17 17:04:34.033522 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:34.033488 2572 generic.go:358] "Generic (PLEG): container finished" podID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerID="2dbc759f67c0cef691897693a8bd8b6fb3d3748b6177ab25dc3273494df34727" exitCode=0
Apr 17 17:04:34.033980 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:34.033533 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" event={"ID":"286f4108-4887-46df-87ae-8c5cdb92a6e6","Type":"ContainerDied","Data":"2dbc759f67c0cef691897693a8bd8b6fb3d3748b6177ab25dc3273494df34727"}
Apr 17 17:04:34.033980 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:34.033559 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp" event={"ID":"286f4108-4887-46df-87ae-8c5cdb92a6e6","Type":"ContainerDied","Data":"33cd43eaca7f73cadb60c51c525f0c48583c96eb3850d128e1028501eac194e7"}
Apr 17 17:04:34.033980 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:34.033574 2572 scope.go:117] "RemoveContainer" containerID="c65a723acc82a71085d4727ab22ae5963a034328d9b64087f6d4eab56b49dcda"
Apr 17 17:04:34.033980 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:34.033576 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp"
Apr 17 17:04:34.043175 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:34.043156 2572 scope.go:117] "RemoveContainer" containerID="2dbc759f67c0cef691897693a8bd8b6fb3d3748b6177ab25dc3273494df34727"
Apr 17 17:04:34.050353 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:34.050334 2572 scope.go:117] "RemoveContainer" containerID="c65a723acc82a71085d4727ab22ae5963a034328d9b64087f6d4eab56b49dcda"
Apr 17 17:04:34.050614 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:04:34.050592 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c65a723acc82a71085d4727ab22ae5963a034328d9b64087f6d4eab56b49dcda\": container with ID starting with c65a723acc82a71085d4727ab22ae5963a034328d9b64087f6d4eab56b49dcda not found: ID does not exist" containerID="c65a723acc82a71085d4727ab22ae5963a034328d9b64087f6d4eab56b49dcda"
Apr 17 17:04:34.050694 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:34.050622 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65a723acc82a71085d4727ab22ae5963a034328d9b64087f6d4eab56b49dcda"} err="failed to get container status \"c65a723acc82a71085d4727ab22ae5963a034328d9b64087f6d4eab56b49dcda\": rpc error: code = NotFound desc = could not find container \"c65a723acc82a71085d4727ab22ae5963a034328d9b64087f6d4eab56b49dcda\": container with ID starting with c65a723acc82a71085d4727ab22ae5963a034328d9b64087f6d4eab56b49dcda not found: ID does not exist"
Apr 17 17:04:34.050694 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:34.050639 2572 scope.go:117] "RemoveContainer" containerID="2dbc759f67c0cef691897693a8bd8b6fb3d3748b6177ab25dc3273494df34727"
Apr 17 17:04:34.050916 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:04:34.050900 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dbc759f67c0cef691897693a8bd8b6fb3d3748b6177ab25dc3273494df34727\": container with ID starting with 2dbc759f67c0cef691897693a8bd8b6fb3d3748b6177ab25dc3273494df34727 not found: ID does not exist" containerID="2dbc759f67c0cef691897693a8bd8b6fb3d3748b6177ab25dc3273494df34727"
Apr 17 17:04:34.050960 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:34.050921 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbc759f67c0cef691897693a8bd8b6fb3d3748b6177ab25dc3273494df34727"} err="failed to get container status \"2dbc759f67c0cef691897693a8bd8b6fb3d3748b6177ab25dc3273494df34727\": rpc error: code = NotFound desc = could not find container \"2dbc759f67c0cef691897693a8bd8b6fb3d3748b6177ab25dc3273494df34727\": container with ID starting with 2dbc759f67c0cef691897693a8bd8b6fb3d3748b6177ab25dc3273494df34727 not found: ID does not exist"
Apr 17 17:04:34.051416 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:34.051396 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp"]
Apr 17 17:04:34.053300 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:34.053281 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp"]
Apr 17 17:04:35.227799 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:35.227758 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" path="/var/lib/kubelet/pods/286f4108-4887-46df-87ae-8c5cdb92a6e6/volumes"
Apr 17 17:04:35.983844 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:35.983810 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" podUID="87459e4c-4eb7-49f2-b1c7-9c3653c7d39d" containerName="splitter-graph-41c4d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:04:37.021891 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:37.021865 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5"
Apr 17 17:04:37.022369 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:37.022338 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 17 17:04:38.027342 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:38.027309 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82"
Apr 17 17:04:38.027947 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:38.027917 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 17:04:40.983571 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:40.983527 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" podUID="87459e4c-4eb7-49f2-b1c7-9c3653c7d39d" containerName="splitter-graph-41c4d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:04:40.984098 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:40.983692 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5"
Apr 17 17:04:41.868336 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:41.868302 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424"
Apr 17 17:04:41.868555 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:41.868362 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz"
Apr 17 17:04:45.983994 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:45.983955 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" podUID="87459e4c-4eb7-49f2-b1c7-9c3653c7d39d" containerName="splitter-graph-41c4d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:04:47.022660 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:47.022618 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 17 17:04:48.028802 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:48.028764 2572 prober.go:120] "Probe failed"
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 17 17:04:50.983466 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:50.983427 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" podUID="87459e4c-4eb7-49f2-b1c7-9c3653c7d39d" containerName="splitter-graph-41c4d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:04:54.416835 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.416799 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx"] Apr 17 17:04:54.417260 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.417245 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kserve-container" Apr 17 17:04:54.417304 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.417262 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kserve-container" Apr 17 17:04:54.417304 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.417284 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kube-rbac-proxy" Apr 17 17:04:54.417304 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.417292 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kube-rbac-proxy" Apr 17 17:04:54.417401 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.417315 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kube-rbac-proxy" Apr 17 17:04:54.417401 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.417324 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kube-rbac-proxy" Apr 17 17:04:54.417401 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.417336 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kserve-container" Apr 17 17:04:54.417401 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.417344 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kserve-container" Apr 17 17:04:54.417525 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.417410 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kube-rbac-proxy" Apr 17 17:04:54.417525 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.417422 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kube-rbac-proxy" Apr 17 17:04:54.417525 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.417436 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1ecf353-5c1a-48eb-8bd5-66b1a725f4bb" containerName="kserve-container" Apr 17 17:04:54.417525 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.417448 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="286f4108-4887-46df-87ae-8c5cdb92a6e6" containerName="kserve-container" Apr 17 17:04:54.430142 ip-10-0-138-137 
kubenswrapper[2572]: I0417 17:04:54.430110 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx"] Apr 17 17:04:54.430281 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.430196 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" Apr 17 17:04:54.432571 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.432542 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-6941b-kube-rbac-proxy-sar-config\"" Apr 17 17:04:54.432738 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.432584 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-6941b-serving-cert\"" Apr 17 17:04:54.590021 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.589993 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9efa573-7e51-407e-9a19-731c1efdda17-openshift-service-ca-bundle\") pod \"switch-graph-6941b-869c44b684-24nwx\" (UID: \"c9efa573-7e51-407e-9a19-731c1efdda17\") " pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" Apr 17 17:04:54.590021 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.590026 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9efa573-7e51-407e-9a19-731c1efdda17-proxy-tls\") pod \"switch-graph-6941b-869c44b684-24nwx\" (UID: \"c9efa573-7e51-407e-9a19-731c1efdda17\") " pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" Apr 17 17:04:54.691143 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.691057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9efa573-7e51-407e-9a19-731c1efdda17-openshift-service-ca-bundle\") pod \"switch-graph-6941b-869c44b684-24nwx\" (UID: \"c9efa573-7e51-407e-9a19-731c1efdda17\") " pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" Apr 17 17:04:54.691143 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.691096 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9efa573-7e51-407e-9a19-731c1efdda17-proxy-tls\") pod \"switch-graph-6941b-869c44b684-24nwx\" (UID: \"c9efa573-7e51-407e-9a19-731c1efdda17\") " pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" Apr 17 17:04:54.691708 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.691691 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9efa573-7e51-407e-9a19-731c1efdda17-openshift-service-ca-bundle\") pod \"switch-graph-6941b-869c44b684-24nwx\" (UID: \"c9efa573-7e51-407e-9a19-731c1efdda17\") " pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" Apr 17 17:04:54.693446 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.693429 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9efa573-7e51-407e-9a19-731c1efdda17-proxy-tls\") pod \"switch-graph-6941b-869c44b684-24nwx\" (UID: \"c9efa573-7e51-407e-9a19-731c1efdda17\") " pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" Apr 17 17:04:54.741512 ip-10-0-138-137 kubenswrapper[2572]: I0417 
17:04:54.741485 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" Apr 17 17:04:54.862756 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:54.862726 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx"] Apr 17 17:04:54.865318 ip-10-0-138-137 kubenswrapper[2572]: W0417 17:04:54.865275 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9efa573_7e51_407e_9a19_731c1efdda17.slice/crio-e7951b3b1d85fd96896b6300ca0de50be066e1d9f03173c0a16c7757f479f664 WatchSource:0}: Error finding container e7951b3b1d85fd96896b6300ca0de50be066e1d9f03173c0a16c7757f479f664: Status 404 returned error can't find the container with id e7951b3b1d85fd96896b6300ca0de50be066e1d9f03173c0a16c7757f479f664 Apr 17 17:04:55.099108 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:55.099074 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" event={"ID":"c9efa573-7e51-407e-9a19-731c1efdda17","Type":"ContainerStarted","Data":"f30afc9378bc74e10fb8d7c8531f4b59b5b8e6c9cdabdf01b8c5d26da650ff33"} Apr 17 17:04:55.099108 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:55.099111 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" event={"ID":"c9efa573-7e51-407e-9a19-731c1efdda17","Type":"ContainerStarted","Data":"e7951b3b1d85fd96896b6300ca0de50be066e1d9f03173c0a16c7757f479f664"} Apr 17 17:04:55.099315 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:55.099141 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" Apr 17 17:04:55.115353 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:55.115301 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" podStartSLOduration=1.115283972 podStartE2EDuration="1.115283972s" podCreationTimestamp="2026-04-17 17:04:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:04:55.114016149 +0000 UTC m=+2028.462645697" watchObservedRunningTime="2026-04-17 17:04:55.115283972 +0000 UTC m=+2028.463913525" Apr 17 17:04:55.984021 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:55.983980 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" podUID="87459e4c-4eb7-49f2-b1c7-9c3653c7d39d" containerName="splitter-graph-41c4d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:04:57.022503 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:57.022464 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 17 17:04:58.028891 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:58.028851 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 17 
17:04:59.057104 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.057077 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" Apr 17 17:04:59.110459 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.110424 2572 generic.go:358] "Generic (PLEG): container finished" podID="87459e4c-4eb7-49f2-b1c7-9c3653c7d39d" containerID="7dc49050815806058a3580b4286b04349ee4bd3b67850635e5cf4d1ff7f42d71" exitCode=0 Apr 17 17:04:59.110595 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.110485 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" Apr 17 17:04:59.110595 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.110514 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" event={"ID":"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d","Type":"ContainerDied","Data":"7dc49050815806058a3580b4286b04349ee4bd3b67850635e5cf4d1ff7f42d71"} Apr 17 17:04:59.110595 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.110546 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5" event={"ID":"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d","Type":"ContainerDied","Data":"8dd09b666def5e13801e95fd40194573cb0763a7b7dc810b811981b912eaa33a"} Apr 17 17:04:59.110595 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.110561 2572 scope.go:117] "RemoveContainer" containerID="7dc49050815806058a3580b4286b04349ee4bd3b67850635e5cf4d1ff7f42d71" Apr 17 17:04:59.117982 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.117962 2572 scope.go:117] "RemoveContainer" containerID="7dc49050815806058a3580b4286b04349ee4bd3b67850635e5cf4d1ff7f42d71" Apr 17 17:04:59.118227 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:04:59.118208 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dc49050815806058a3580b4286b04349ee4bd3b67850635e5cf4d1ff7f42d71\": container with ID starting with 7dc49050815806058a3580b4286b04349ee4bd3b67850635e5cf4d1ff7f42d71 not found: ID does not exist" containerID="7dc49050815806058a3580b4286b04349ee4bd3b67850635e5cf4d1ff7f42d71" Apr 17 17:04:59.118291 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.118235 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc49050815806058a3580b4286b04349ee4bd3b67850635e5cf4d1ff7f42d71"} err="failed to get container status \"7dc49050815806058a3580b4286b04349ee4bd3b67850635e5cf4d1ff7f42d71\": rpc error: code = NotFound desc = could not find container \"7dc49050815806058a3580b4286b04349ee4bd3b67850635e5cf4d1ff7f42d71\": container with ID starting with 7dc49050815806058a3580b4286b04349ee4bd3b67850635e5cf4d1ff7f42d71 not found: ID does not exist" Apr 17 17:04:59.230455 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.230433 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-proxy-tls\") pod \"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d\" (UID: \"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d\") " Apr 17 17:04:59.230629 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.230517 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-openshift-service-ca-bundle\") pod \"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d\" (UID: \"87459e4c-4eb7-49f2-b1c7-9c3653c7d39d\") " Apr 17 17:04:59.230870 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.230847 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "87459e4c-4eb7-49f2-b1c7-9c3653c7d39d" (UID: "87459e4c-4eb7-49f2-b1c7-9c3653c7d39d"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:04:59.232425 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.232399 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "87459e4c-4eb7-49f2-b1c7-9c3653c7d39d" (UID: "87459e4c-4eb7-49f2-b1c7-9c3653c7d39d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:04:59.331147 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.331118 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:04:59.331147 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.331150 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d-openshift-service-ca-bundle\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:04:59.431187 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.431158 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5"] Apr 17 17:04:59.434746 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:04:59.434722 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5"] Apr 17 17:05:01.106918 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:01.106891 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" Apr 17 17:05:01.226371 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:01.226338 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87459e4c-4eb7-49f2-b1c7-9c3653c7d39d" path="/var/lib/kubelet/pods/87459e4c-4eb7-49f2-b1c7-9c3653c7d39d/volumes" Apr 17 17:05:07.022768 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:07.022727 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 17 17:05:08.028590 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:08.028548 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 17 17:05:17.022550 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:17.022507 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 17 17:05:18.028838 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:18.028803 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:05:27.023757 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:27.023679 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:05:39.132765 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.132732 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4"] Apr 17 17:05:39.133150 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.133072 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87459e4c-4eb7-49f2-b1c7-9c3653c7d39d" containerName="splitter-graph-41c4d" Apr 17 17:05:39.133150 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.133085 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="87459e4c-4eb7-49f2-b1c7-9c3653c7d39d" containerName="splitter-graph-41c4d" Apr 17 17:05:39.133150 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.133148 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="87459e4c-4eb7-49f2-b1c7-9c3653c7d39d" containerName="splitter-graph-41c4d" Apr 17 17:05:39.135938 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.135920 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" Apr 17 17:05:39.138172 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.138150 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-7afa6-serving-cert\"" Apr 17 17:05:39.138297 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.138149 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-7afa6-kube-rbac-proxy-sar-config\"" Apr 17 17:05:39.142559 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.142535 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4"] Apr 17 17:05:39.240159 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.240125 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd6d8cec-e328-49cc-a077-8d0b71e06a76-openshift-service-ca-bundle\") pod \"splitter-graph-7afa6-7fb6d7495d-5g5l4\" (UID: \"fd6d8cec-e328-49cc-a077-8d0b71e06a76\") " pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" Apr 17 17:05:39.240330 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.240168 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd6d8cec-e328-49cc-a077-8d0b71e06a76-proxy-tls\") pod \"splitter-graph-7afa6-7fb6d7495d-5g5l4\" (UID: \"fd6d8cec-e328-49cc-a077-8d0b71e06a76\") " pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" Apr 17 17:05:39.341122 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.341092 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd6d8cec-e328-49cc-a077-8d0b71e06a76-openshift-service-ca-bundle\") pod \"splitter-graph-7afa6-7fb6d7495d-5g5l4\" (UID: \"fd6d8cec-e328-49cc-a077-8d0b71e06a76\") " pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" Apr 17 17:05:39.341298 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.341134 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd6d8cec-e328-49cc-a077-8d0b71e06a76-proxy-tls\") pod \"splitter-graph-7afa6-7fb6d7495d-5g5l4\" (UID: \"fd6d8cec-e328-49cc-a077-8d0b71e06a76\") " pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" Apr 17 17:05:39.341734 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.341711 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd6d8cec-e328-49cc-a077-8d0b71e06a76-openshift-service-ca-bundle\") pod \"splitter-graph-7afa6-7fb6d7495d-5g5l4\" (UID: \"fd6d8cec-e328-49cc-a077-8d0b71e06a76\") " pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" Apr 17 17:05:39.343542 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.343522 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd6d8cec-e328-49cc-a077-8d0b71e06a76-proxy-tls\") pod \"splitter-graph-7afa6-7fb6d7495d-5g5l4\" (UID: \"fd6d8cec-e328-49cc-a077-8d0b71e06a76\") " pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" Apr 17 17:05:39.447414 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.447389 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" Apr 17 17:05:39.563846 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:39.563814 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4"] Apr 17 17:05:39.567107 ip-10-0-138-137 kubenswrapper[2572]: W0417 17:05:39.567078 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd6d8cec_e328_49cc_a077_8d0b71e06a76.slice/crio-da10d34c12bf5cc31c22b5ebc47c5038d527578f0408f37afc51ac1e8d1a8c53 WatchSource:0}: Error finding container da10d34c12bf5cc31c22b5ebc47c5038d527578f0408f37afc51ac1e8d1a8c53: Status 404 returned error can't find the container with id da10d34c12bf5cc31c22b5ebc47c5038d527578f0408f37afc51ac1e8d1a8c53 Apr 17 17:05:40.231496 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:40.231457 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" event={"ID":"fd6d8cec-e328-49cc-a077-8d0b71e06a76","Type":"ContainerStarted","Data":"85c08eb2c9258547c97b2071bec4e673192c982ed411d2a1ce4518bcc13b8f5e"} Apr 17 17:05:40.231496 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:40.231497 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" event={"ID":"fd6d8cec-e328-49cc-a077-8d0b71e06a76","Type":"ContainerStarted","Data":"da10d34c12bf5cc31c22b5ebc47c5038d527578f0408f37afc51ac1e8d1a8c53"} Apr 17 17:05:40.232027 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:40.231557 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" Apr 17 17:05:40.247621 ip-10-0-138-137 kubenswrapper[2572]: 
I0417 17:05:40.247576 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" podStartSLOduration=1.247564584 podStartE2EDuration="1.247564584s" podCreationTimestamp="2026-04-17 17:05:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:05:40.246751578 +0000 UTC m=+2073.595381127" watchObservedRunningTime="2026-04-17 17:05:40.247564584 +0000 UTC m=+2073.596194132" Apr 17 17:05:46.239358 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:05:46.239329 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" Apr 17 17:13:53.856442 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:53.856397 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4"] Apr 17 17:13:53.858875 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:53.856731 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" podUID="fd6d8cec-e328-49cc-a077-8d0b71e06a76" containerName="splitter-graph-7afa6" containerID="cri-o://85c08eb2c9258547c97b2071bec4e673192c982ed411d2a1ce4518bcc13b8f5e" gracePeriod=30 Apr 17 17:13:53.936813 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:53.936779 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5"] Apr 17 17:13:53.937200 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:53.937170 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kserve-container" containerID="cri-o://ff79004acc7db6c3c3a5fa7da8ed5904b2674567f1a2cd9794a7046123d02c13" gracePeriod=30 Apr 17 17:13:53.937281 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:53.937194 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kube-rbac-proxy" containerID="cri-o://67e6a5ab8d355003d9f0733b418260408352ca30445add11b843e58efd808a93" gracePeriod=30 Apr 17 17:13:53.975169 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:53.975137 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82"] Apr 17 17:13:53.975525 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:53.975497 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerName="kserve-container" containerID="cri-o://2a49be564c8721579c57f534e410037d3dd1eb76db319b2d34ce4643b2f6396c" gracePeriod=30 Apr 17 17:13:53.975703 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:53.975531 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerName="kube-rbac-proxy" containerID="cri-o://b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28" gracePeriod=30 Apr 17 17:13:54.005223 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:13:54.005170 2572 
cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16af9bfe_cb61_44e9_bb37_385d3eefa146.slice/crio-conmon-b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28.scope\": RecentStats: unable to find data in memory cache]" Apr 17 17:13:54.005223 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:13:54.005187 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16af9bfe_cb61_44e9_bb37_385d3eefa146.slice/crio-conmon-b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28.scope\": RecentStats: unable to find data in memory cache]" Apr 17 17:13:54.591552 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:54.591521 2572 generic.go:358] "Generic (PLEG): container finished" podID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerID="b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28" exitCode=2 Apr 17 17:13:54.591755 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:54.591591 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" event={"ID":"16af9bfe-cb61-44e9-bb37-385d3eefa146","Type":"ContainerDied","Data":"b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28"} Apr 17 17:13:54.593058 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:54.593037 2572 generic.go:358] "Generic (PLEG): container finished" podID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerID="67e6a5ab8d355003d9f0733b418260408352ca30445add11b843e58efd808a93" exitCode=2 Apr 17 17:13:54.593107 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:54.593097 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" event={"ID":"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f","Type":"ContainerDied","Data":"67e6a5ab8d355003d9f0733b418260408352ca30445add11b843e58efd808a93"} Apr 17 17:13:56.238522 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:56.238480 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" podUID="fd6d8cec-e328-49cc-a077-8d0b71e06a76" containerName="splitter-graph-7afa6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:13:57.020538 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.020510 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:13:57.035072 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.035054 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:13:57.121338 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.121245 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-7afa6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16af9bfe-cb61-44e9-bb37-385d3eefa146-error-404-isvc-7afa6-kube-rbac-proxy-sar-config\") pod \"16af9bfe-cb61-44e9-bb37-385d3eefa146\" (UID: \"16af9bfe-cb61-44e9-bb37-385d3eefa146\") " Apr 17 17:13:57.121338 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.121292 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjlzq\" (UniqueName: \"kubernetes.io/projected/16af9bfe-cb61-44e9-bb37-385d3eefa146-kube-api-access-sjlzq\") pod \"16af9bfe-cb61-44e9-bb37-385d3eefa146\" (UID: \"16af9bfe-cb61-44e9-bb37-385d3eefa146\") " Apr 17 17:13:57.121338 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.121338 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmxtp\" (UniqueName: \"kubernetes.io/projected/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-kube-api-access-nmxtp\") pod \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\" (UID: \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\") " Apr 17 17:13:57.121623 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.121366 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-proxy-tls\") pod \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\" (UID: \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\") " Apr 17 17:13:57.121623 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.121422 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-7afa6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-success-200-isvc-7afa6-kube-rbac-proxy-sar-config\") pod \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\" (UID: \"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f\") " Apr 17 17:13:57.121623 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.121475 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16af9bfe-cb61-44e9-bb37-385d3eefa146-proxy-tls\") pod \"16af9bfe-cb61-44e9-bb37-385d3eefa146\" (UID: \"16af9bfe-cb61-44e9-bb37-385d3eefa146\") " Apr 17 17:13:57.121835 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.121669 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16af9bfe-cb61-44e9-bb37-385d3eefa146-error-404-isvc-7afa6-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-7afa6-kube-rbac-proxy-sar-config") pod "16af9bfe-cb61-44e9-bb37-385d3eefa146" (UID: "16af9bfe-cb61-44e9-bb37-385d3eefa146"). InnerVolumeSpecName "error-404-isvc-7afa6-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:13:57.121930 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.121899 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-success-200-isvc-7afa6-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-7afa6-kube-rbac-proxy-sar-config") pod "0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" (UID: "0086f8ec-cd25-4666-a36c-d27f5ee1ca4f"). InnerVolumeSpecName "success-200-isvc-7afa6-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:13:57.123541 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.123519 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16af9bfe-cb61-44e9-bb37-385d3eefa146-kube-api-access-sjlzq" (OuterVolumeSpecName: "kube-api-access-sjlzq") pod "16af9bfe-cb61-44e9-bb37-385d3eefa146" (UID: "16af9bfe-cb61-44e9-bb37-385d3eefa146"). InnerVolumeSpecName "kube-api-access-sjlzq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:13:57.123541 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.123522 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16af9bfe-cb61-44e9-bb37-385d3eefa146-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "16af9bfe-cb61-44e9-bb37-385d3eefa146" (UID: "16af9bfe-cb61-44e9-bb37-385d3eefa146"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:13:57.123895 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.123874 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-kube-api-access-nmxtp" (OuterVolumeSpecName: "kube-api-access-nmxtp") pod "0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" (UID: "0086f8ec-cd25-4666-a36c-d27f5ee1ca4f"). InnerVolumeSpecName "kube-api-access-nmxtp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:13:57.123960 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.123876 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" (UID: "0086f8ec-cd25-4666-a36c-d27f5ee1ca4f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:13:57.222586 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.222548 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sjlzq\" (UniqueName: \"kubernetes.io/projected/16af9bfe-cb61-44e9-bb37-385d3eefa146-kube-api-access-sjlzq\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:13:57.222586 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.222579 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmxtp\" (UniqueName: \"kubernetes.io/projected/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-kube-api-access-nmxtp\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:13:57.222586 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.222591 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:13:57.222846 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.222602 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-7afa6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f-success-200-isvc-7afa6-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:13:57.222846 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.222612 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16af9bfe-cb61-44e9-bb37-385d3eefa146-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:13:57.222846 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.222621 2572 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-7afa6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16af9bfe-cb61-44e9-bb37-385d3eefa146-error-404-isvc-7afa6-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:13:57.603084 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.603050 2572 generic.go:358] "Generic (PLEG): container finished" podID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerID="2a49be564c8721579c57f534e410037d3dd1eb76db319b2d34ce4643b2f6396c" exitCode=0 Apr 17 17:13:57.603547 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.603129 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" Apr 17 17:13:57.603547 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.603141 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" event={"ID":"16af9bfe-cb61-44e9-bb37-385d3eefa146","Type":"ContainerDied","Data":"2a49be564c8721579c57f534e410037d3dd1eb76db319b2d34ce4643b2f6396c"} Apr 17 17:13:57.603547 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.603190 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82" event={"ID":"16af9bfe-cb61-44e9-bb37-385d3eefa146","Type":"ContainerDied","Data":"50468f72ecdd551c3d95ff9d075dcecca69341656fa66d09d4df87b80b66acfa"} Apr 17 17:13:57.603547 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.603216 2572 scope.go:117] "RemoveContainer" containerID="b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28" Apr 17 17:13:57.604859 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.604823 2572 generic.go:358] "Generic (PLEG): container finished" podID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerID="ff79004acc7db6c3c3a5fa7da8ed5904b2674567f1a2cd9794a7046123d02c13" exitCode=0 Apr 17 17:13:57.604974 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.604885 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" event={"ID":"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f","Type":"ContainerDied","Data":"ff79004acc7db6c3c3a5fa7da8ed5904b2674567f1a2cd9794a7046123d02c13"} Apr 17 17:13:57.604974 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.604892 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" Apr 17 17:13:57.604974 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.604909 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" event={"ID":"0086f8ec-cd25-4666-a36c-d27f5ee1ca4f","Type":"ContainerDied","Data":"0eeb4e1eb401ac590003cd1682c8fe9b6453ff75d72256c18b117fe428a5de4a"} Apr 17 17:13:57.611856 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.611828 2572 scope.go:117] "RemoveContainer" containerID="2a49be564c8721579c57f534e410037d3dd1eb76db319b2d34ce4643b2f6396c" Apr 17 17:13:57.618852 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.618835 2572 scope.go:117] "RemoveContainer" containerID="b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28" Apr 17 17:13:57.619113 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:13:57.619097 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28\": container with ID starting with b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28 not found: ID does not exist" containerID="b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28" Apr 17 17:13:57.619163 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.619121 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28"} err="failed to get container status \"b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28\": rpc error: code = NotFound desc = could not find container \"b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28\": container with ID starting with b13340622ed0115046087a1ddc9ed0ad93cccf8e28ca2d8af70c22d49f8c7b28 not found: ID does not exist" Apr 17 17:13:57.619163 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.619141 2572 scope.go:117] "RemoveContainer" containerID="2a49be564c8721579c57f534e410037d3dd1eb76db319b2d34ce4643b2f6396c" Apr 17 17:13:57.619354 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:13:57.619335 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a49be564c8721579c57f534e410037d3dd1eb76db319b2d34ce4643b2f6396c\": container with ID starting with 2a49be564c8721579c57f534e410037d3dd1eb76db319b2d34ce4643b2f6396c not found: ID does not exist" containerID="2a49be564c8721579c57f534e410037d3dd1eb76db319b2d34ce4643b2f6396c" Apr 17 17:13:57.619398 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.619361 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a49be564c8721579c57f534e410037d3dd1eb76db319b2d34ce4643b2f6396c"} err="failed to get container status \"2a49be564c8721579c57f534e410037d3dd1eb76db319b2d34ce4643b2f6396c\": rpc error: code = NotFound desc = could not find container \"2a49be564c8721579c57f534e410037d3dd1eb76db319b2d34ce4643b2f6396c\": container with ID starting with 2a49be564c8721579c57f534e410037d3dd1eb76db319b2d34ce4643b2f6396c not found: ID does not exist" Apr 17 17:13:57.619398 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.619377 2572 scope.go:117] "RemoveContainer" containerID="67e6a5ab8d355003d9f0733b418260408352ca30445add11b843e58efd808a93" Apr 17 17:13:57.624395 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.624375 2572 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82"] Apr 17 17:13:57.627319 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.627299 2572 scope.go:117] "RemoveContainer" containerID="ff79004acc7db6c3c3a5fa7da8ed5904b2674567f1a2cd9794a7046123d02c13" Apr 17 17:13:57.628332 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.628316 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82"] Apr 17 17:13:57.633953 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.633940 2572 scope.go:117] "RemoveContainer" containerID="67e6a5ab8d355003d9f0733b418260408352ca30445add11b843e58efd808a93" Apr 17 17:13:57.634179 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:13:57.634161 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67e6a5ab8d355003d9f0733b418260408352ca30445add11b843e58efd808a93\": container with ID starting with 67e6a5ab8d355003d9f0733b418260408352ca30445add11b843e58efd808a93 not found: ID does not exist" containerID="67e6a5ab8d355003d9f0733b418260408352ca30445add11b843e58efd808a93" Apr 17 17:13:57.634219 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.634186 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67e6a5ab8d355003d9f0733b418260408352ca30445add11b843e58efd808a93"} err="failed to get container status \"67e6a5ab8d355003d9f0733b418260408352ca30445add11b843e58efd808a93\": rpc error: code = NotFound desc = could not find container \"67e6a5ab8d355003d9f0733b418260408352ca30445add11b843e58efd808a93\": container with ID starting with 67e6a5ab8d355003d9f0733b418260408352ca30445add11b843e58efd808a93 not found: ID does not exist" Apr 17 17:13:57.634219 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.634203 2572 scope.go:117] "RemoveContainer" containerID="ff79004acc7db6c3c3a5fa7da8ed5904b2674567f1a2cd9794a7046123d02c13" Apr 17 17:13:57.634434 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:13:57.634417 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff79004acc7db6c3c3a5fa7da8ed5904b2674567f1a2cd9794a7046123d02c13\": container with ID starting with ff79004acc7db6c3c3a5fa7da8ed5904b2674567f1a2cd9794a7046123d02c13 not found: ID does not exist" containerID="ff79004acc7db6c3c3a5fa7da8ed5904b2674567f1a2cd9794a7046123d02c13" Apr 17 17:13:57.634479 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.634440 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff79004acc7db6c3c3a5fa7da8ed5904b2674567f1a2cd9794a7046123d02c13"} err="failed to get container status \"ff79004acc7db6c3c3a5fa7da8ed5904b2674567f1a2cd9794a7046123d02c13\": rpc error: code = NotFound desc = could not find container \"ff79004acc7db6c3c3a5fa7da8ed5904b2674567f1a2cd9794a7046123d02c13\": container with ID starting with ff79004acc7db6c3c3a5fa7da8ed5904b2674567f1a2cd9794a7046123d02c13 not found: ID does not exist" Apr 17 17:13:57.637691 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.637671 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5"] Apr 17 17:13:57.641161 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:57.641141 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5"] Apr 17 17:13:58.017798 
ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:58.017746 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.48:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 17 17:13:59.226406 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:59.226372 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" path="/var/lib/kubelet/pods/0086f8ec-cd25-4666-a36c-d27f5ee1ca4f/volumes" Apr 17 17:13:59.226868 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:13:59.226789 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" path="/var/lib/kubelet/pods/16af9bfe-cb61-44e9-bb37-385d3eefa146/volumes" Apr 17 17:14:01.239147 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:01.239105 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" podUID="fd6d8cec-e328-49cc-a077-8d0b71e06a76" containerName="splitter-graph-7afa6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:14:06.238362 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:06.238322 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" podUID="fd6d8cec-e328-49cc-a077-8d0b71e06a76" containerName="splitter-graph-7afa6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:14:06.238812 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:06.238440 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" Apr 17 17:14:11.238926 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:11.238887 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" podUID="fd6d8cec-e328-49cc-a077-8d0b71e06a76" containerName="splitter-graph-7afa6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:14:16.238209 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:16.238168 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" podUID="fd6d8cec-e328-49cc-a077-8d0b71e06a76" containerName="splitter-graph-7afa6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:14:21.238564 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:21.238528 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" podUID="fd6d8cec-e328-49cc-a077-8d0b71e06a76" containerName="splitter-graph-7afa6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:14:24.011696 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.011674 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" Apr 17 17:14:24.036233 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.036208 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd6d8cec-e328-49cc-a077-8d0b71e06a76-proxy-tls\") pod \"fd6d8cec-e328-49cc-a077-8d0b71e06a76\" (UID: \"fd6d8cec-e328-49cc-a077-8d0b71e06a76\") " Apr 17 17:14:24.036374 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.036272 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd6d8cec-e328-49cc-a077-8d0b71e06a76-openshift-service-ca-bundle\") pod \"fd6d8cec-e328-49cc-a077-8d0b71e06a76\" (UID: \"fd6d8cec-e328-49cc-a077-8d0b71e06a76\") " Apr 17 17:14:24.036631 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.036605 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd6d8cec-e328-49cc-a077-8d0b71e06a76-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "fd6d8cec-e328-49cc-a077-8d0b71e06a76" (UID: "fd6d8cec-e328-49cc-a077-8d0b71e06a76"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:14:24.038303 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.038282 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd6d8cec-e328-49cc-a077-8d0b71e06a76-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fd6d8cec-e328-49cc-a077-8d0b71e06a76" (UID: "fd6d8cec-e328-49cc-a077-8d0b71e06a76"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:14:24.137172 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.137091 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd6d8cec-e328-49cc-a077-8d0b71e06a76-openshift-service-ca-bundle\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:14:24.137172 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.137121 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd6d8cec-e328-49cc-a077-8d0b71e06a76-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:14:24.683554 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.683523 2572 generic.go:358] "Generic (PLEG): container finished" podID="fd6d8cec-e328-49cc-a077-8d0b71e06a76" containerID="85c08eb2c9258547c97b2071bec4e673192c982ed411d2a1ce4518bcc13b8f5e" exitCode=0 Apr 17 17:14:24.683777 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.683588 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" Apr 17 17:14:24.683777 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.683602 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" event={"ID":"fd6d8cec-e328-49cc-a077-8d0b71e06a76","Type":"ContainerDied","Data":"85c08eb2c9258547c97b2071bec4e673192c982ed411d2a1ce4518bcc13b8f5e"} Apr 17 17:14:24.683777 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.683638 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4" event={"ID":"fd6d8cec-e328-49cc-a077-8d0b71e06a76","Type":"ContainerDied","Data":"da10d34c12bf5cc31c22b5ebc47c5038d527578f0408f37afc51ac1e8d1a8c53"} Apr 17 17:14:24.683777 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.683678 2572 scope.go:117] "RemoveContainer" containerID="85c08eb2c9258547c97b2071bec4e673192c982ed411d2a1ce4518bcc13b8f5e" Apr 17 17:14:24.691305 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.691216 2572 scope.go:117] "RemoveContainer" containerID="85c08eb2c9258547c97b2071bec4e673192c982ed411d2a1ce4518bcc13b8f5e" Apr 17 17:14:24.691518 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:14:24.691488 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85c08eb2c9258547c97b2071bec4e673192c982ed411d2a1ce4518bcc13b8f5e\": container with ID starting with 85c08eb2c9258547c97b2071bec4e673192c982ed411d2a1ce4518bcc13b8f5e not found: ID does not exist" containerID="85c08eb2c9258547c97b2071bec4e673192c982ed411d2a1ce4518bcc13b8f5e" Apr 17 17:14:24.691590 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.691527 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85c08eb2c9258547c97b2071bec4e673192c982ed411d2a1ce4518bcc13b8f5e"} err="failed to get container status \"85c08eb2c9258547c97b2071bec4e673192c982ed411d2a1ce4518bcc13b8f5e\": rpc error: code = NotFound desc = could not find container \"85c08eb2c9258547c97b2071bec4e673192c982ed411d2a1ce4518bcc13b8f5e\": container with ID starting with 85c08eb2c9258547c97b2071bec4e673192c982ed411d2a1ce4518bcc13b8f5e not found: ID does not exist" Apr 17 17:14:24.703022 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.702999 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4"] Apr 17 17:14:24.707111 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:24.707091 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4"] Apr 17 17:14:25.226454 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:14:25.226415 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd6d8cec-e328-49cc-a077-8d0b71e06a76" path="/var/lib/kubelet/pods/fd6d8cec-e328-49cc-a077-8d0b71e06a76/volumes" Apr 17 17:21:13.835966 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:13.835932 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx"] Apr 17 17:21:13.838403 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:13.836178 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" podUID="c9efa573-7e51-407e-9a19-731c1efdda17" containerName="switch-graph-6941b" containerID="cri-o://f30afc9378bc74e10fb8d7c8531f4b59b5b8e6c9cdabdf01b8c5d26da650ff33" gracePeriod=30 Apr 
17 17:21:13.958437 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:13.958401 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz"] Apr 17 17:21:13.958732 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:13.958707 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kserve-container" containerID="cri-o://f48f16279b3927f812e41930083957d7809d2a0385cb5fc16576a9507a1e9081" gracePeriod=30 Apr 17 17:21:13.958812 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:13.958786 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kube-rbac-proxy" containerID="cri-o://9ff03e4315f6e74266e423d7b4fd7079905ca8ac8f98d6535f7e2ad8bb051b04" gracePeriod=30 Apr 17 17:21:14.029558 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:14.029524 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424"] Apr 17 17:21:14.029829 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:14.029805 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kserve-container" containerID="cri-o://7a2f206425c2e16462b405062e023b2e5641643672d73764e06d667bb7da6626" gracePeriod=30 Apr 17 17:21:14.029922 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:14.029865 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kube-rbac-proxy" containerID="cri-o://5279e05e3a9f4bb29483cc8522cff1bc6baeb05904ae490a13691bc8611f01b4" gracePeriod=30 Apr 17 17:21:14.803597 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:14.803564 2572 generic.go:358] "Generic (PLEG): container finished" podID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerID="5279e05e3a9f4bb29483cc8522cff1bc6baeb05904ae490a13691bc8611f01b4" exitCode=2 Apr 17 17:21:14.803786 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:14.803629 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" event={"ID":"b3e64c5f-1499-4d3c-b9b1-6a09295446b4","Type":"ContainerDied","Data":"5279e05e3a9f4bb29483cc8522cff1bc6baeb05904ae490a13691bc8611f01b4"} Apr 17 17:21:14.804964 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:14.804943 2572 generic.go:358] "Generic (PLEG): container finished" podID="b227b01e-aa15-4737-8813-9e1a7021a599" containerID="9ff03e4315f6e74266e423d7b4fd7079905ca8ac8f98d6535f7e2ad8bb051b04" exitCode=2 Apr 17 17:21:14.805079 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:14.804986 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" event={"ID":"b227b01e-aa15-4737-8813-9e1a7021a599","Type":"ContainerDied","Data":"9ff03e4315f6e74266e423d7b4fd7079905ca8ac8f98d6535f7e2ad8bb051b04"} Apr 17 17:21:16.105430 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.105389 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" 
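[Annotation] The "Killing container with a grace period ... gracePeriod=30" entries record the termination contract: on "SyncLoop DELETE" the kubelet asks CRI-O to deliver SIGTERM, and escalates to SIGKILL only once the grace period expires. The kube-rbac-proxy sidecars above exit almost immediately with code 2; the kserve-containers exit 0 a couple of seconds later. A hedged client-go sketch of the API call that sets this grace period (kubeconfig path and error handling are illustrative; the namespace and pod name mirror the log):

package main

import (
	"context"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: a kubeconfig at the default path with rights on the namespace.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}

	grace := int64(30) // surfaces as the kubelet's gracePeriod=30 seen above
	err = cs.CoreV1().Pods("kserve-ci-e2e-test").Delete(context.Background(),
		"switch-graph-6941b-869c44b684-24nwx",
		metav1.DeleteOptions{GracePeriodSeconds: &grace})
	if err != nil {
		log.Fatal(err)
	}
	// The kubelet sees SyncLoop DELETE, sends SIGTERM to each container, and
	// escalates to SIGKILL if a process is still running after 30 seconds.
}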
podUID="c9efa573-7e51-407e-9a19-731c1efdda17" containerName="switch-graph-6941b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:21:16.698734 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.698703 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" Apr 17 17:21:16.812879 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.812799 2572 generic.go:358] "Generic (PLEG): container finished" podID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerID="7a2f206425c2e16462b405062e023b2e5641643672d73764e06d667bb7da6626" exitCode=0 Apr 17 17:21:16.813002 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.812879 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" event={"ID":"b3e64c5f-1499-4d3c-b9b1-6a09295446b4","Type":"ContainerDied","Data":"7a2f206425c2e16462b405062e023b2e5641643672d73764e06d667bb7da6626"} Apr 17 17:21:16.814096 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.814072 2572 generic.go:358] "Generic (PLEG): container finished" podID="b227b01e-aa15-4737-8813-9e1a7021a599" containerID="f48f16279b3927f812e41930083957d7809d2a0385cb5fc16576a9507a1e9081" exitCode=0 Apr 17 17:21:16.814204 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.814109 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" event={"ID":"b227b01e-aa15-4737-8813-9e1a7021a599","Type":"ContainerDied","Data":"f48f16279b3927f812e41930083957d7809d2a0385cb5fc16576a9507a1e9081"} Apr 17 17:21:16.814204 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.814132 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" event={"ID":"b227b01e-aa15-4737-8813-9e1a7021a599","Type":"ContainerDied","Data":"aa117f82ce16d580599a3daa08753e2534ccc162d1037c26a077950fd933aa61"} Apr 17 17:21:16.814204 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.814154 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz" Apr 17 17:21:16.814325 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.814155 2572 scope.go:117] "RemoveContainer" containerID="9ff03e4315f6e74266e423d7b4fd7079905ca8ac8f98d6535f7e2ad8bb051b04" Apr 17 17:21:16.817570 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.817546 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b227b01e-aa15-4737-8813-9e1a7021a599-success-200-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"b227b01e-aa15-4737-8813-9e1a7021a599\" (UID: \"b227b01e-aa15-4737-8813-9e1a7021a599\") " Apr 17 17:21:16.817689 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.817667 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b227b01e-aa15-4737-8813-9e1a7021a599-proxy-tls\") pod \"b227b01e-aa15-4737-8813-9e1a7021a599\" (UID: \"b227b01e-aa15-4737-8813-9e1a7021a599\") " Apr 17 17:21:16.817772 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.817703 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl9g5\" (UniqueName: \"kubernetes.io/projected/b227b01e-aa15-4737-8813-9e1a7021a599-kube-api-access-nl9g5\") pod \"b227b01e-aa15-4737-8813-9e1a7021a599\" (UID: \"b227b01e-aa15-4737-8813-9e1a7021a599\") " Apr 17 17:21:16.817923 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.817900 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b227b01e-aa15-4737-8813-9e1a7021a599-success-200-isvc-6941b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-6941b-kube-rbac-proxy-sar-config") pod "b227b01e-aa15-4737-8813-9e1a7021a599" (UID: "b227b01e-aa15-4737-8813-9e1a7021a599"). InnerVolumeSpecName "success-200-isvc-6941b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:21:16.819790 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.819763 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b227b01e-aa15-4737-8813-9e1a7021a599-kube-api-access-nl9g5" (OuterVolumeSpecName: "kube-api-access-nl9g5") pod "b227b01e-aa15-4737-8813-9e1a7021a599" (UID: "b227b01e-aa15-4737-8813-9e1a7021a599"). InnerVolumeSpecName "kube-api-access-nl9g5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:21:16.819790 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.819766 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b227b01e-aa15-4737-8813-9e1a7021a599-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b227b01e-aa15-4737-8813-9e1a7021a599" (UID: "b227b01e-aa15-4737-8813-9e1a7021a599"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:21:16.822200 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.822184 2572 scope.go:117] "RemoveContainer" containerID="f48f16279b3927f812e41930083957d7809d2a0385cb5fc16576a9507a1e9081" Apr 17 17:21:16.829348 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.829334 2572 scope.go:117] "RemoveContainer" containerID="9ff03e4315f6e74266e423d7b4fd7079905ca8ac8f98d6535f7e2ad8bb051b04" Apr 17 17:21:16.829594 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:21:16.829573 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ff03e4315f6e74266e423d7b4fd7079905ca8ac8f98d6535f7e2ad8bb051b04\": container with ID starting with 9ff03e4315f6e74266e423d7b4fd7079905ca8ac8f98d6535f7e2ad8bb051b04 not found: ID does not exist" containerID="9ff03e4315f6e74266e423d7b4fd7079905ca8ac8f98d6535f7e2ad8bb051b04" Apr 17 17:21:16.829638 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.829607 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ff03e4315f6e74266e423d7b4fd7079905ca8ac8f98d6535f7e2ad8bb051b04"} err="failed to get container status \"9ff03e4315f6e74266e423d7b4fd7079905ca8ac8f98d6535f7e2ad8bb051b04\": rpc error: code = NotFound desc = could not find container \"9ff03e4315f6e74266e423d7b4fd7079905ca8ac8f98d6535f7e2ad8bb051b04\": container with ID starting with 9ff03e4315f6e74266e423d7b4fd7079905ca8ac8f98d6535f7e2ad8bb051b04 not found: ID does not exist" Apr 17 17:21:16.829638 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.829624 2572 scope.go:117] "RemoveContainer" containerID="f48f16279b3927f812e41930083957d7809d2a0385cb5fc16576a9507a1e9081" Apr 17 17:21:16.829885 ip-10-0-138-137 kubenswrapper[2572]: E0417 17:21:16.829868 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f48f16279b3927f812e41930083957d7809d2a0385cb5fc16576a9507a1e9081\": container with ID starting with f48f16279b3927f812e41930083957d7809d2a0385cb5fc16576a9507a1e9081 not found: ID does not exist" containerID="f48f16279b3927f812e41930083957d7809d2a0385cb5fc16576a9507a1e9081" Apr 17 17:21:16.829958 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.829890 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f48f16279b3927f812e41930083957d7809d2a0385cb5fc16576a9507a1e9081"} err="failed to get container status \"f48f16279b3927f812e41930083957d7809d2a0385cb5fc16576a9507a1e9081\": rpc error: code = NotFound desc = could not find container \"f48f16279b3927f812e41930083957d7809d2a0385cb5fc16576a9507a1e9081\": container with ID starting with f48f16279b3927f812e41930083957d7809d2a0385cb5fc16576a9507a1e9081 not found: ID does not exist" Apr 17 17:21:16.862364 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.862334 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused" Apr 17 17:21:16.918894 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.918867 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b227b01e-aa15-4737-8813-9e1a7021a599-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath 
\"\"" Apr 17 17:21:16.918894 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.918886 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nl9g5\" (UniqueName: \"kubernetes.io/projected/b227b01e-aa15-4737-8813-9e1a7021a599-kube-api-access-nl9g5\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:21:16.919028 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:16.918899 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b227b01e-aa15-4737-8813-9e1a7021a599-success-200-isvc-6941b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:21:17.135990 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.135966 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz"] Apr 17 17:21:17.142117 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.142088 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz"] Apr 17 17:21:17.227015 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.226987 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" path="/var/lib/kubelet/pods/b227b01e-aa15-4737-8813-9e1a7021a599/volumes" Apr 17 17:21:17.258663 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.258631 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" Apr 17 17:21:17.321629 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.321601 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-error-404-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\" (UID: \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\") " Apr 17 17:21:17.321786 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.321687 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-proxy-tls\") pod \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\" (UID: \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\") " Apr 17 17:21:17.321786 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.321723 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qtjz\" (UniqueName: \"kubernetes.io/projected/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-kube-api-access-7qtjz\") pod \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\" (UID: \"b3e64c5f-1499-4d3c-b9b1-6a09295446b4\") " Apr 17 17:21:17.321973 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.321947 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-error-404-isvc-6941b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-6941b-kube-rbac-proxy-sar-config") pod "b3e64c5f-1499-4d3c-b9b1-6a09295446b4" (UID: "b3e64c5f-1499-4d3c-b9b1-6a09295446b4"). InnerVolumeSpecName "error-404-isvc-6941b-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:21:17.323681 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.323661 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-kube-api-access-7qtjz" (OuterVolumeSpecName: "kube-api-access-7qtjz") pod "b3e64c5f-1499-4d3c-b9b1-6a09295446b4" (UID: "b3e64c5f-1499-4d3c-b9b1-6a09295446b4"). InnerVolumeSpecName "kube-api-access-7qtjz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:21:17.323752 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.323708 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b3e64c5f-1499-4d3c-b9b1-6a09295446b4" (UID: "b3e64c5f-1499-4d3c-b9b1-6a09295446b4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:21:17.422749 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.422686 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7qtjz\" (UniqueName: \"kubernetes.io/projected/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-kube-api-access-7qtjz\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:21:17.422749 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.422710 2572 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-error-404-isvc-6941b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:21:17.422749 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.422721 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3e64c5f-1499-4d3c-b9b1-6a09295446b4-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:21:17.817592 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.817565 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" Apr 17 17:21:17.817802 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.817560 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424" event={"ID":"b3e64c5f-1499-4d3c-b9b1-6a09295446b4","Type":"ContainerDied","Data":"996baae66be7edb90a421bbc743c72ca69dc3b368dc81f409a29bca7ce3d1cd0"} Apr 17 17:21:17.817802 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.817704 2572 scope.go:117] "RemoveContainer" containerID="5279e05e3a9f4bb29483cc8522cff1bc6baeb05904ae490a13691bc8611f01b4" Apr 17 17:21:17.828313 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.828280 2572 scope.go:117] "RemoveContainer" containerID="7a2f206425c2e16462b405062e023b2e5641643672d73764e06d667bb7da6626" Apr 17 17:21:17.839330 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.839309 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424"] Apr 17 17:21:17.843568 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:17.843548 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424"] Apr 17 17:21:19.226130 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:19.226095 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" path="/var/lib/kubelet/pods/b3e64c5f-1499-4d3c-b9b1-6a09295446b4/volumes" Apr 17 17:21:21.106319 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:21.106280 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" podUID="c9efa573-7e51-407e-9a19-731c1efdda17" containerName="switch-graph-6941b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:21:26.106468 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:26.106429 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" podUID="c9efa573-7e51-407e-9a19-731c1efdda17" containerName="switch-graph-6941b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:21:26.106945 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:26.106526 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" Apr 17 17:21:28.394154 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:28.394125 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-6941b-869c44b684-24nwx_c9efa573-7e51-407e-9a19-731c1efdda17/switch-graph-6941b/0.log" Apr 17 17:21:29.227761 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:29.227731 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-6941b-869c44b684-24nwx_c9efa573-7e51-407e-9a19-731c1efdda17/switch-graph-6941b/0.log" Apr 17 17:21:30.062626 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:30.062590 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-6941b-869c44b684-24nwx_c9efa573-7e51-407e-9a19-731c1efdda17/switch-graph-6941b/0.log" Apr 17 17:21:30.874663 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:30.874615 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-6941b-869c44b684-24nwx_c9efa573-7e51-407e-9a19-731c1efdda17/switch-graph-6941b/0.log" Apr 17 17:21:31.105951 
Apr 17 17:21:31.105951 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:31.105914 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" podUID="c9efa573-7e51-407e-9a19-731c1efdda17" containerName="switch-graph-6941b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:21:31.691423 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:31.691386 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-6941b-869c44b684-24nwx_c9efa573-7e51-407e-9a19-731c1efdda17/switch-graph-6941b/0.log"
Apr 17 17:21:32.523809 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:32.523758 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-6941b-869c44b684-24nwx_c9efa573-7e51-407e-9a19-731c1efdda17/switch-graph-6941b/0.log"
Apr 17 17:21:33.376014 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:33.375984 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-6941b-869c44b684-24nwx_c9efa573-7e51-407e-9a19-731c1efdda17/switch-graph-6941b/0.log"
Apr 17 17:21:34.201952 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:34.201893 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-6941b-869c44b684-24nwx_c9efa573-7e51-407e-9a19-731c1efdda17/switch-graph-6941b/0.log"
Apr 17 17:21:35.014829 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:35.014770 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-6941b-869c44b684-24nwx_c9efa573-7e51-407e-9a19-731c1efdda17/switch-graph-6941b/0.log"
Apr 17 17:21:35.817495 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:35.817467 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-6941b-869c44b684-24nwx_c9efa573-7e51-407e-9a19-731c1efdda17/switch-graph-6941b/0.log"
Apr 17 17:21:36.106442 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:36.106351 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" podUID="c9efa573-7e51-407e-9a19-731c1efdda17" containerName="switch-graph-6941b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:21:36.629868 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:36.629839 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-6941b-869c44b684-24nwx_c9efa573-7e51-407e-9a19-731c1efdda17/switch-graph-6941b/0.log"
Apr 17 17:21:37.474565 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:37.474535 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-6941b-869c44b684-24nwx_c9efa573-7e51-407e-9a19-731c1efdda17/switch-graph-6941b/0.log"
Apr 17 17:21:39.799831 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.799800 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tmwvb/must-gather-89kqn"]
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800065 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kserve-container"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800075 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kserve-container"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800085 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kube-rbac-proxy"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800090 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kube-rbac-proxy"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800099 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerName="kube-rbac-proxy"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800105 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerName="kube-rbac-proxy"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800114 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd6d8cec-e328-49cc-a077-8d0b71e06a76" containerName="splitter-graph-7afa6"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800120 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6d8cec-e328-49cc-a077-8d0b71e06a76" containerName="splitter-graph-7afa6"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800127 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerName="kserve-container"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800132 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerName="kserve-container"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800139 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kserve-container"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800143 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kserve-container"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800150 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kube-rbac-proxy"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800155 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kube-rbac-proxy"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800161 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kube-rbac-proxy"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800165 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kube-rbac-proxy"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800174 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kserve-container"
Apr 17 17:21:39.800189 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800181 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kserve-container"
Apr 17 17:21:39.800800 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800223 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kube-rbac-proxy"
Apr 17 17:21:39.800800 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800230 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kserve-container"
Apr 17 17:21:39.800800 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800236 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerName="kserve-container"
Apr 17 17:21:39.800800 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800244 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b227b01e-aa15-4737-8813-9e1a7021a599" containerName="kube-rbac-proxy"
Apr 17 17:21:39.800800 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800251 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd6d8cec-e328-49cc-a077-8d0b71e06a76" containerName="splitter-graph-7afa6"
Apr 17 17:21:39.800800 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800258 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kserve-container"
Apr 17 17:21:39.800800 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800264 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3e64c5f-1499-4d3c-b9b1-6a09295446b4" containerName="kube-rbac-proxy"
Apr 17 17:21:39.800800 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800272 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0086f8ec-cd25-4666-a36c-d27f5ee1ca4f" containerName="kserve-container"
Apr 17 17:21:39.800800 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.800277 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="16af9bfe-cb61-44e9-bb37-385d3eefa146" containerName="kube-rbac-proxy"
Apr 17 17:21:39.804564 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.804543 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmwvb/must-gather-89kqn"
Apr 17 17:21:39.807257 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.807238 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmwvb\"/\"openshift-service-ca.crt\""
Apr 17 17:21:39.807376 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.807358 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmwvb\"/\"kube-root-ca.crt\""
Apr 17 17:21:39.807754 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.807742 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tmwvb\"/\"default-dockercfg-dkhbl\""
Apr 17 17:21:39.814824 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.814806 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmwvb/must-gather-89kqn"]
Apr 17 17:21:39.888984 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.888958 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tltg6\" (UniqueName: \"kubernetes.io/projected/052e1009-de21-4bc2-b873-5b614c9b56e1-kube-api-access-tltg6\") pod \"must-gather-89kqn\" (UID: \"052e1009-de21-4bc2-b873-5b614c9b56e1\") " pod="openshift-must-gather-tmwvb/must-gather-89kqn"
Apr 17 17:21:39.889133 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.888994 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/052e1009-de21-4bc2-b873-5b614c9b56e1-must-gather-output\") pod \"must-gather-89kqn\" (UID: \"052e1009-de21-4bc2-b873-5b614c9b56e1\") " pod="openshift-must-gather-tmwvb/must-gather-89kqn"
Apr 17 17:21:39.990073 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.990040 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tltg6\" (UniqueName: \"kubernetes.io/projected/052e1009-de21-4bc2-b873-5b614c9b56e1-kube-api-access-tltg6\") pod \"must-gather-89kqn\" (UID: \"052e1009-de21-4bc2-b873-5b614c9b56e1\") " pod="openshift-must-gather-tmwvb/must-gather-89kqn"
Apr 17 17:21:39.990073 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.990074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/052e1009-de21-4bc2-b873-5b614c9b56e1-must-gather-output\") pod \"must-gather-89kqn\" (UID: \"052e1009-de21-4bc2-b873-5b614c9b56e1\") " pod="openshift-must-gather-tmwvb/must-gather-89kqn"
Apr 17 17:21:39.990373 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.990357 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/052e1009-de21-4bc2-b873-5b614c9b56e1-must-gather-output\") pod \"must-gather-89kqn\" (UID: \"052e1009-de21-4bc2-b873-5b614c9b56e1\") " pod="openshift-must-gather-tmwvb/must-gather-89kqn"
Apr 17 17:21:39.998039 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:39.998021 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tltg6\" (UniqueName: \"kubernetes.io/projected/052e1009-de21-4bc2-b873-5b614c9b56e1-kube-api-access-tltg6\") pod \"must-gather-89kqn\" (UID: \"052e1009-de21-4bc2-b873-5b614c9b56e1\") " pod="openshift-must-gather-tmwvb/must-gather-89kqn"
Apr 17 17:21:40.113474 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:40.113408 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmwvb/must-gather-89kqn"
Apr 17 17:21:40.229706 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:40.229682 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmwvb/must-gather-89kqn"]
Apr 17 17:21:40.232419 ip-10-0-138-137 kubenswrapper[2572]: W0417 17:21:40.232396 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod052e1009_de21_4bc2_b873_5b614c9b56e1.slice/crio-4f35b4c3e9a7ff6750e8941e1e1b22cfb700fc0468da569449a02d13fd7bd1b4 WatchSource:0}: Error finding container 4f35b4c3e9a7ff6750e8941e1e1b22cfb700fc0468da569449a02d13fd7bd1b4: Status 404 returned error can't find the container with id 4f35b4c3e9a7ff6750e8941e1e1b22cfb700fc0468da569449a02d13fd7bd1b4
Apr 17 17:21:40.234481 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:40.234460 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:21:40.880091 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:40.880053 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmwvb/must-gather-89kqn" event={"ID":"052e1009-de21-4bc2-b873-5b614c9b56e1","Type":"ContainerStarted","Data":"4f35b4c3e9a7ff6750e8941e1e1b22cfb700fc0468da569449a02d13fd7bd1b4"}
Apr 17 17:21:41.105735 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:41.105692 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" podUID="c9efa573-7e51-407e-9a19-731c1efdda17" containerName="switch-graph-6941b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:21:41.884321 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:41.884286 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmwvb/must-gather-89kqn" event={"ID":"052e1009-de21-4bc2-b873-5b614c9b56e1","Type":"ContainerStarted","Data":"26ccb371d70bb96118282977439d72cf69b449ae9e95e3067f08b35678dbdcb5"}
Apr 17 17:21:41.884321 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:41.884326 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmwvb/must-gather-89kqn" event={"ID":"052e1009-de21-4bc2-b873-5b614c9b56e1","Type":"ContainerStarted","Data":"c50014f4834fa68e3203fd1ee614b3d682c7d21e0eeff59fdfa1446e9f25810e"}
Apr 17 17:21:41.909666 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:41.909608 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tmwvb/must-gather-89kqn" podStartSLOduration=2.009708131 podStartE2EDuration="2.90959048s" podCreationTimestamp="2026-04-17 17:21:39 +0000 UTC" firstStartedPulling="2026-04-17 17:21:40.234584889 +0000 UTC m=+3033.583214419" lastFinishedPulling="2026-04-17 17:21:41.134467241 +0000 UTC m=+3034.483096768" observedRunningTime="2026-04-17 17:21:41.906926782 +0000 UTC m=+3035.255556334" watchObservedRunningTime="2026-04-17 17:21:41.90959048 +0000 UTC m=+3035.258220028"
Apr 17 17:21:42.742570 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:42.742538 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wszrw_385f5d13-97af-4215-9e30-c75e4ad792b1/global-pull-secret-syncer/0.log"
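[Annotation] The pod_startup_latency_tracker entry above is worth decoding: podStartE2EDuration (2.90959048s) is observedRunningTime minus podCreationTimestamp, and podStartSLOduration (2.009708131s) is that interval minus the image-pull window, since pull time is excluded from the startup SLO. The arithmetic checks out, as this small sketch verifies with values copied from the entry:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the pod_startup_latency_tracker entry above.
	layout := "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2026-04-17 17:21:39 +0000 UTC")
	pullStart, _ := time.Parse(layout, "2026-04-17 17:21:40.234584889 +0000 UTC")
	pullEnd, _ := time.Parse(layout, "2026-04-17 17:21:41.134467241 +0000 UTC")
	running, _ := time.Parse(layout, "2026-04-17 17:21:41.90959048 +0000 UTC")

	e2e := running.Sub(created)         // podStartE2EDuration: 2.90959048s
	slo := e2e - pullEnd.Sub(pullStart) // podStartSLOduration: ~2.009708128s
	// Matches the logged 2.009708131 to within a few nanoseconds; the kubelet
	// computes the pull window from the monotonic m=+... readings in the entry.
	fmt.Println(e2e, slo)
}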
path="/var/log/pods/kube-system_konnectivity-agent-8hctm_95c0a6c1-6228-4af3-acf9-47e2e061f7bf/konnectivity-agent/0.log" Apr 17 17:21:42.967467 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:42.967440 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-137.ec2.internal_692377e0184489441f39fe1105bfb2c4/haproxy/0.log" Apr 17 17:21:43.893133 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:43.893091 2572 generic.go:358] "Generic (PLEG): container finished" podID="c9efa573-7e51-407e-9a19-731c1efdda17" containerID="f30afc9378bc74e10fb8d7c8531f4b59b5b8e6c9cdabdf01b8c5d26da650ff33" exitCode=137 Apr 17 17:21:43.893352 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:43.893195 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" event={"ID":"c9efa573-7e51-407e-9a19-731c1efdda17","Type":"ContainerDied","Data":"f30afc9378bc74e10fb8d7c8531f4b59b5b8e6c9cdabdf01b8c5d26da650ff33"} Apr 17 17:21:44.615772 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:44.612072 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" Apr 17 17:21:44.645078 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:44.644465 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9efa573-7e51-407e-9a19-731c1efdda17-openshift-service-ca-bundle\") pod \"c9efa573-7e51-407e-9a19-731c1efdda17\" (UID: \"c9efa573-7e51-407e-9a19-731c1efdda17\") " Apr 17 17:21:44.645078 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:44.644547 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9efa573-7e51-407e-9a19-731c1efdda17-proxy-tls\") pod \"c9efa573-7e51-407e-9a19-731c1efdda17\" (UID: \"c9efa573-7e51-407e-9a19-731c1efdda17\") " Apr 17 17:21:44.645726 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:44.645677 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9efa573-7e51-407e-9a19-731c1efdda17-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c9efa573-7e51-407e-9a19-731c1efdda17" (UID: "c9efa573-7e51-407e-9a19-731c1efdda17"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:21:44.651066 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:44.651029 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9efa573-7e51-407e-9a19-731c1efdda17-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c9efa573-7e51-407e-9a19-731c1efdda17" (UID: "c9efa573-7e51-407e-9a19-731c1efdda17"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:21:44.745594 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:44.745527 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9efa573-7e51-407e-9a19-731c1efdda17-openshift-service-ca-bundle\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:21:44.745594 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:44.745565 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9efa573-7e51-407e-9a19-731c1efdda17-proxy-tls\") on node \"ip-10-0-138-137.ec2.internal\" DevicePath \"\"" Apr 17 17:21:44.909341 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:44.909244 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" event={"ID":"c9efa573-7e51-407e-9a19-731c1efdda17","Type":"ContainerDied","Data":"e7951b3b1d85fd96896b6300ca0de50be066e1d9f03173c0a16c7757f479f664"} Apr 17 17:21:44.909341 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:44.909305 2572 scope.go:117] "RemoveContainer" containerID="f30afc9378bc74e10fb8d7c8531f4b59b5b8e6c9cdabdf01b8c5d26da650ff33" Apr 17 17:21:44.909571 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:44.909461 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx" Apr 17 17:21:44.958676 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:44.954855 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx"] Apr 17 17:21:44.958676 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:44.958338 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx"] Apr 17 17:21:45.228218 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:45.228172 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9efa573-7e51-407e-9a19-731c1efdda17" path="/var/lib/kubelet/pods/c9efa573-7e51-407e-9a19-731c1efdda17/volumes" Apr 17 17:21:46.435022 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:46.434988 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-j2cfd_e3d2b560-1dd2-4c10-adba-974388d9af49/kube-state-metrics/0.log" Apr 17 17:21:46.465791 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:46.465757 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-j2cfd_e3d2b560-1dd2-4c10-adba-974388d9af49/kube-rbac-proxy-main/0.log" Apr 17 17:21:46.493368 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:46.493336 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-j2cfd_e3d2b560-1dd2-4c10-adba-974388d9af49/kube-rbac-proxy-self/0.log" Apr 17 17:21:46.523313 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:46.523278 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7b855889f-bcqdb_3f3c472e-29e8-4e7f-85de-f2381c6f9adc/metrics-server/0.log" Apr 17 17:21:46.769930 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:46.769898 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sqfzq_6a033cfb-a232-46c8-8118-206ada51e43f/node-exporter/0.log" Apr 17 17:21:46.796561 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:46.796537 2572 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_node-exporter-sqfzq_6a033cfb-a232-46c8-8118-206ada51e43f/kube-rbac-proxy/0.log" Apr 17 17:21:46.820039 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:46.820011 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sqfzq_6a033cfb-a232-46c8-8118-206ada51e43f/init-textfile/0.log" Apr 17 17:21:46.854710 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:46.854680 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-l6rvm_a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4/kube-rbac-proxy-main/0.log" Apr 17 17:21:46.879611 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:46.879574 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-l6rvm_a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4/kube-rbac-proxy-self/0.log" Apr 17 17:21:46.908527 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:46.908485 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-l6rvm_a2d8a523-9ad8-4c35-ae64-4f46dfbb6ce4/openshift-state-metrics/0.log" Apr 17 17:21:49.525162 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.525124 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb"] Apr 17 17:21:49.525603 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.525428 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9efa573-7e51-407e-9a19-731c1efdda17" containerName="switch-graph-6941b" Apr 17 17:21:49.525603 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.525440 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9efa573-7e51-407e-9a19-731c1efdda17" containerName="switch-graph-6941b" Apr 17 17:21:49.525603 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.525483 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9efa573-7e51-407e-9a19-731c1efdda17" containerName="switch-graph-6941b" Apr 17 17:21:49.528729 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.528711 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.539856 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.539832 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb"] Apr 17 17:21:49.589775 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.589739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8pzq\" (UniqueName: \"kubernetes.io/projected/f0383c23-d4b4-4e62-9acf-9a9302261bbc-kube-api-access-z8pzq\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.589962 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.589785 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0383c23-d4b4-4e62-9acf-9a9302261bbc-sys\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.589962 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.589853 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f0383c23-d4b4-4e62-9acf-9a9302261bbc-lib-modules\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.589962 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.589930 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f0383c23-d4b4-4e62-9acf-9a9302261bbc-proc\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.589962 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.589957 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f0383c23-d4b4-4e62-9acf-9a9302261bbc-podres\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.691250 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.691210 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8pzq\" (UniqueName: \"kubernetes.io/projected/f0383c23-d4b4-4e62-9acf-9a9302261bbc-kube-api-access-z8pzq\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.691250 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.691249 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0383c23-d4b4-4e62-9acf-9a9302261bbc-sys\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.691501 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.691268 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/f0383c23-d4b4-4e62-9acf-9a9302261bbc-lib-modules\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.691501 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.691300 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f0383c23-d4b4-4e62-9acf-9a9302261bbc-proc\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.691501 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.691316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f0383c23-d4b4-4e62-9acf-9a9302261bbc-podres\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.691501 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.691343 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0383c23-d4b4-4e62-9acf-9a9302261bbc-sys\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.691501 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.691413 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f0383c23-d4b4-4e62-9acf-9a9302261bbc-proc\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.691501 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.691445 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f0383c23-d4b4-4e62-9acf-9a9302261bbc-lib-modules\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.691501 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.691445 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f0383c23-d4b4-4e62-9acf-9a9302261bbc-podres\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.699852 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.699821 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8pzq\" (UniqueName: \"kubernetes.io/projected/f0383c23-d4b4-4e62-9acf-9a9302261bbc-kube-api-access-z8pzq\") pod \"perf-node-gather-daemonset-nhwnb\" (UID: \"f0383c23-d4b4-4e62-9acf-9a9302261bbc\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.838848 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.838784 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:49.983033 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:49.983004 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb"] Apr 17 17:21:49.985527 ip-10-0-138-137 kubenswrapper[2572]: W0417 17:21:49.985489 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf0383c23_d4b4_4e62_9acf_9a9302261bbc.slice/crio-9cc614fc49dec325df5719bbcda24f424071a4f1d772bfd39ccac5bb587ef005 WatchSource:0}: Error finding container 9cc614fc49dec325df5719bbcda24f424071a4f1d772bfd39ccac5bb587ef005: Status 404 returned error can't find the container with id 9cc614fc49dec325df5719bbcda24f424071a4f1d772bfd39ccac5bb587ef005 Apr 17 17:21:50.591270 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:50.591219 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lg6l4_5b139190-3fc0-4860-9067-f972c93db541/dns/0.log" Apr 17 17:21:50.619209 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:50.619177 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lg6l4_5b139190-3fc0-4860-9067-f972c93db541/kube-rbac-proxy/0.log" Apr 17 17:21:50.730509 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:50.730484 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cbmjr_9748097b-af4c-40c0-b6c6-261863bca7b4/dns-node-resolver/0.log" Apr 17 17:21:50.930556 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:50.930471 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" event={"ID":"f0383c23-d4b4-4e62-9acf-9a9302261bbc","Type":"ContainerStarted","Data":"24b648a485f36a3841eccdb3a9e535a89621d505802d702e8b26d917148d5319"} Apr 17 17:21:50.930556 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:50.930508 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" event={"ID":"f0383c23-d4b4-4e62-9acf-9a9302261bbc","Type":"ContainerStarted","Data":"9cc614fc49dec325df5719bbcda24f424071a4f1d772bfd39ccac5bb587ef005"} Apr 17 17:21:50.930769 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:50.930626 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:21:50.948186 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:50.948138 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" podStartSLOduration=1.948122418 podStartE2EDuration="1.948122418s" podCreationTimestamp="2026-04-17 17:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:21:50.94752908 +0000 UTC m=+3044.296158630" watchObservedRunningTime="2026-04-17 17:21:50.948122418 +0000 UTC m=+3044.296751973" Apr 17 17:21:51.208882 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:51.208856 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2h7tr_1b4367fb-4aa5-4cb9-a2b9-b50cbfeedf45/node-ca/0.log" Apr 17 17:21:51.986798 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:51.986748 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-66bdc45668-2hj6k_2af6331b-785c-4d64-a991-6f486304bebf/router/0.log" Apr 17 17:21:52.386325 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:52.386251 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-p5rm4_bf83b7da-8d7a-47cb-873b-aa2f7b647ff9/serve-healthcheck-canary/0.log" Apr 17 17:21:52.853929 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:52.853899 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8hx2q_52f90196-09f6-4839-b183-e5c1731d4d40/kube-rbac-proxy/0.log" Apr 17 17:21:52.879257 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:52.879228 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8hx2q_52f90196-09f6-4839-b183-e5c1731d4d40/exporter/0.log" Apr 17 17:21:52.905548 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:52.905515 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8hx2q_52f90196-09f6-4839-b183-e5c1731d4d40/extractor/0.log" Apr 17 17:21:55.207410 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:55.207380 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-pczd5_d1ed091d-6054-47d0-8770-ad78f8a1729e/server/0.log" Apr 17 17:21:56.944103 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:21:56.944077 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-nhwnb" Apr 17 17:22:01.262244 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:01.262211 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bdk96_5cd8687d-ad01-456f-b5f8-9c49b1c2488b/kube-multus-additional-cni-plugins/0.log" Apr 17 17:22:01.287603 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:01.287581 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bdk96_5cd8687d-ad01-456f-b5f8-9c49b1c2488b/egress-router-binary-copy/0.log" Apr 17 17:22:01.309979 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:01.309953 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bdk96_5cd8687d-ad01-456f-b5f8-9c49b1c2488b/cni-plugins/0.log" Apr 17 17:22:01.335809 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:01.335748 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bdk96_5cd8687d-ad01-456f-b5f8-9c49b1c2488b/bond-cni-plugin/0.log" Apr 17 17:22:01.358792 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:01.358770 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bdk96_5cd8687d-ad01-456f-b5f8-9c49b1c2488b/routeoverride-cni/0.log" Apr 17 17:22:01.381634 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:01.381611 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bdk96_5cd8687d-ad01-456f-b5f8-9c49b1c2488b/whereabouts-cni-bincopy/0.log" Apr 17 17:22:01.409005 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:01.408978 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bdk96_5cd8687d-ad01-456f-b5f8-9c49b1c2488b/whereabouts-cni/0.log" Apr 17 17:22:01.661257 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:01.661173 2572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fkm8h_c0d53844-6ca3-4f97-9404-6cec628fe368/kube-multus/0.log" Apr 17 17:22:01.847667 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:01.847626 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-w6ttr_000f5549-91dd-4651-b5a0-21769e3982f4/network-metrics-daemon/0.log" Apr 17 17:22:01.870014 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:01.869990 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-w6ttr_000f5549-91dd-4651-b5a0-21769e3982f4/kube-rbac-proxy/0.log" Apr 17 17:22:03.393926 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:03.393893 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gjfdq_e1faab54-1fd0-4e7f-8959-5e580bbd833d/ovn-controller/0.log" Apr 17 17:22:03.440015 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:03.439985 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gjfdq_e1faab54-1fd0-4e7f-8959-5e580bbd833d/ovn-acl-logging/0.log" Apr 17 17:22:03.467397 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:03.467375 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gjfdq_e1faab54-1fd0-4e7f-8959-5e580bbd833d/kube-rbac-proxy-node/0.log" Apr 17 17:22:03.493251 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:03.493222 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gjfdq_e1faab54-1fd0-4e7f-8959-5e580bbd833d/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 17:22:03.511332 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:03.511310 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gjfdq_e1faab54-1fd0-4e7f-8959-5e580bbd833d/northd/0.log" Apr 17 17:22:03.536439 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:03.536419 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gjfdq_e1faab54-1fd0-4e7f-8959-5e580bbd833d/nbdb/0.log" Apr 17 17:22:03.561808 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:03.561778 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gjfdq_e1faab54-1fd0-4e7f-8959-5e580bbd833d/sbdb/0.log" Apr 17 17:22:03.739542 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:03.739510 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gjfdq_e1faab54-1fd0-4e7f-8959-5e580bbd833d/ovnkube-controller/0.log" Apr 17 17:22:04.703842 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:04.703812 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bbt78_4ab33527-9aec-4272-9cb2-4f84af38a336/network-check-target-container/0.log" Apr 17 17:22:05.684403 ip-10-0-138-137 kubenswrapper[2572]: I0417 17:22:05.684368 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-pxs2x_3fa6420e-46d0-4ade-82cb-f5e03e235d26/iptables-alerter/0.log"