Apr 23 17:50:51.797659 ip-10-0-143-131 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 17:50:51.797674 ip-10-0-143-131 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 17:50:51.797682 ip-10-0-143-131 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 17:50:51.797983 ip-10-0-143-131 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 17:51:02.044145 ip-10-0-143-131 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 17:51:02.044167 ip-10-0-143-131 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 0fa312e6329c4b43b62658c0e976b94b --
Apr 23 17:53:32.321194 ip-10-0-143-131 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 17:53:32.821846 ip-10-0-143-131 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:53:32.821846 ip-10-0-143-131 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 17:53:32.821846 ip-10-0-143-131 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:53:32.821846 ip-10-0-143-131 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
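The 'Failed to load environment files' entries above mean systemd aborted kubelet.service before ExecStartPre ran, ending the unit with result 'resources', because a file referenced by EnvironmentFile= was absent. A minimal sketch of a drop-in that tolerates a missing file — the drop-in path and environment-file name here are illustrative, not taken from this host:

```ini
# /etc/systemd/system/kubelet.service.d/10-env.conf  (hypothetical drop-in)
[Service]
# A leading "-" tells systemd to ignore a missing environment file
# instead of failing the unit with result 'resources'.
EnvironmentFile=-/etc/kubernetes/kubelet-env
```

After adding a drop-in, `systemctl daemon-reload` followed by `systemctl restart kubelet` applies it.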
Apr 23 17:53:32.821846 ip-10-0-143-131 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:53:32.825688 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.825600 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 17:53:32.830978 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.830948 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:53:32.830978 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.830968 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:53:32.830978 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.830973 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:53:32.830978 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.830978 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:53:32.830978 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.830984 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.830989 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.830994 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.830997 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831002 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831005 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831009 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831013 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831017 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831020 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831024 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831028 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831032 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831035 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831039 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831042 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831047 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831050 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831054 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831058 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:53:32.831264 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831061 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831065 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831069 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831073 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831082 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831086 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831089 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831093 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831097 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831101 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831105 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831109 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831113 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831117 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831122 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831128 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831132 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831137 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831141 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831144 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:53:32.832250 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831148 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831153 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831157 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831161 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831166 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831170 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831174 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831178 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831182 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831187 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831191 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831195 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831199 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831203 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831209 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831216 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831220 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831224 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831228 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831232 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:53:32.832869 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831236 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831240 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831244 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831248 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831252 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831256 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831261 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831265 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831269 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831273 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831279 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831286 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831293 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831298 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831314 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831320 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831325 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831329 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831334 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:53:32.833412 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831339 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831343 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.831347 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832039 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832048 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832053 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832058 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832062 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832066 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832070 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832074 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832078 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832083 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832087 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832090 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832095 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832099 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832103 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832107 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832113 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:53:32.833873 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832117 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832121 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832125 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832130 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832134 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832138 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832142 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832155 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832161 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832166 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832170 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832174 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832178 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832183 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832187 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832190 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832194 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832198 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832202 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832206 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:53:32.834615 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832210 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832214 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832218 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832222 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832226 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832230 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832234 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832238 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832243 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832246 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832250 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832255 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832259 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832262 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832266 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832271 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832275 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832279 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832283 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832287 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:53:32.835573 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832299 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832304 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832309 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832313 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832317 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832321 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832326 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832329 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832334 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832338 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832342 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832347 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832351 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832355 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832359 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832363 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832367 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832371 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832375 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832379 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:53:32.836173 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832383 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832392 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832397 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832403 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832408 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832413 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832418 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832422 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.832427 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834106 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834122 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834139 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834145 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834160 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834165 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834172 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834179 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834184 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834189 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834194 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834199 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 17:53:32.836919 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834204 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834209 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834213 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834218 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834223 2575 flags.go:64] FLAG: --cloud-config=""
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834227 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834232 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834242 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834247 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834252 2575 flags.go:64] FLAG: --config-dir=""
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834257 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834263 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834269 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834273 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834278 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834283 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834288 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834292 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834297 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834302 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834307 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834313 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834318 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834322 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834327 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 17:53:32.837442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834340 2575 flags.go:64] FLAG: --enable-server="true"
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834345 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834355 2575 flags.go:64] FLAG: --event-burst="100"
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834360 2575 flags.go:64] FLAG: --event-qps="50"
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834364 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834369 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834374 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834380 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834384 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834389 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834395 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834399 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834404 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834408 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834413 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834428 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834433 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834438 2575 flags.go:64] FLAG: --feature-gates=""
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834444 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834449 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834454 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834459 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834464 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834468 2575 flags.go:64] FLAG: --help="false" Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834473 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-143-131.ec2.internal" Apr 23 17:53:32.838148 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834478 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834483 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834487 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834493 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834499 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834504 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834508 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834513 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834525 
2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834530 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834535 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834539 2575 flags.go:64] FLAG: --kube-reserved="" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834544 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834548 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834553 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834557 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834562 2575 flags.go:64] FLAG: --lock-file="" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834566 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834571 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834575 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834584 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834589 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 17:53:32.838785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834594 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 17:53:32.838785 
ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834599 2575 flags.go:64] FLAG: --logging-format="text" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834603 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834609 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834613 2575 flags.go:64] FLAG: --manifest-url="" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834617 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834623 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834627 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834633 2575 flags.go:64] FLAG: --max-pods="110" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834638 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834643 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834648 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834652 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834657 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834663 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834667 2575 
flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834678 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834682 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834688 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834701 2575 flags.go:64] FLAG: --pod-cidr="" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834705 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834713 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834717 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834722 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834727 2575 flags.go:64] FLAG: --port="10250" Apr 23 17:53:32.839368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834731 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834736 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05444f870784e4006" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834741 2575 flags.go:64] FLAG: --qos-reserved="" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834745 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: 
I0423 17:53:32.834750 2575 flags.go:64] FLAG: --register-node="true" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834754 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834759 2575 flags.go:64] FLAG: --register-with-taints="" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834765 2575 flags.go:64] FLAG: --registry-burst="10" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834770 2575 flags.go:64] FLAG: --registry-qps="5" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834774 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834779 2575 flags.go:64] FLAG: --reserved-memory="" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834785 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834789 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834794 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834798 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834803 2575 flags.go:64] FLAG: --runonce="false" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834807 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834812 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834817 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 23 17:53:32.839976 ip-10-0-143-131 
kubenswrapper[2575]: I0423 17:53:32.834821 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834825 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834830 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834835 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834839 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834843 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834848 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 17:53:32.839976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834853 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834865 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834870 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834875 2575 flags.go:64] FLAG: --system-cgroups="" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834899 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834909 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834913 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834917 2575 
flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834935 2575 flags.go:64] FLAG: --tls-min-version="" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834939 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834944 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834948 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834953 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834958 2575 flags.go:64] FLAG: --v="2" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834965 2575 flags.go:64] FLAG: --version="false" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834972 2575 flags.go:64] FLAG: --vmodule="" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834979 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.834984 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835170 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835177 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835182 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835187 2575 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835191 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:53:32.840581 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835195 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835200 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835204 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835211 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835217 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835222 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835227 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835231 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835235 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835240 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835246 2575 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835259 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835263 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835268 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835274 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835279 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835284 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835289 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835293 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:53:32.841192 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835297 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835302 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835305 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835310 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 
17:53:32.835314 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835318 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835322 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835326 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835330 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835335 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835339 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835343 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835348 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835352 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835356 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835360 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835364 2575 feature_gate.go:328] unrecognized feature gate: 
InsightsConfig Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835369 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835372 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835377 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:53:32.841699 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835381 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835384 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835393 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835397 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835401 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835412 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835417 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835421 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835425 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835429 2575 feature_gate.go:328] 
unrecognized feature gate: MixedCPUsAllocation Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835433 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835437 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835441 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835445 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835450 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835454 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835458 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835462 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835466 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835470 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:53:32.842224 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835474 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835478 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:53:32.842727 
ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835481 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835485 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835489 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835493 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835497 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835502 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835506 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835510 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835514 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835518 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835522 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835526 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835532 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 
17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835536 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835540 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835544 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835551 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835555 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:53:32.842727 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835559 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.835563 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.835571 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.842919 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.842944 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843003 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843009 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843012 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843015 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843019 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843022 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843025 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843029 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843032 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843035 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843038 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:53:32.843244 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843041 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843043 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843046 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843049 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843053 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843056 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843060 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843063 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843066 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843069 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843072 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843075 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843077 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843080 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843083 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843086 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843089 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843091 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843094 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843097 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:53:32.843653 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843101 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843104 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843107 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843109 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843112 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843114 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843117 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843120 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843122 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843124 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843127 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843130 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843132 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843135 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843137 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843140 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843142 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843145 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843148 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843151 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:53:32.844148 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843154 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843157 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843160 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843163 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843165 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843168 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843170 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843173 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843176 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843178 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843181 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843183 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843186 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843192 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843195 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843197 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843200 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843202 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843205 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843207 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:53:32.844646 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843209 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843212 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843214 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843217 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843219 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843222 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843224 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843226 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843229 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843231 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843233 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843236 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843239 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843241 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843244 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:53:32.845209 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.843249 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843384 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843389 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843393 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843395 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843398 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843400 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843404 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843408 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843411 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843414 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843422 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843425 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843428 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843431 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843434 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843437 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843439 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843442 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:53:32.845585 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843445 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843447 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843450 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843452 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843455 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843457 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843460 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843462 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843465 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843468 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843470 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843473 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843475 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843477 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843480 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843483 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843487 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843490 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843492 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843495 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:53:32.846074 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843498 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843501 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843503 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843506 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843508 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843516 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843519 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843521 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843524 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843526 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843529 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843531 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843534 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843536 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843539 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843541 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843543 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843546 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843548 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:53:32.846565 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843551 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843553 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843556 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843558 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843561 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843563 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843565 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843567 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843570 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843573 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843575 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843578 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843580 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843583 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843585 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843588 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843590 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843592 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843595 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843603 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:53:32.847063 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843605 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:53:32.847541 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843608 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:53:32.847541 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843611 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:53:32.847541 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843613 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:53:32.847541 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843615 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:53:32.847541 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843618 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:53:32.847541 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843620 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:53:32.847541 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843623 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:53:32.847541 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:32.843625 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:53:32.847541 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.843630 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:53:32.847541 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.844571 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 17:53:32.847804 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.847728 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 17:53:32.848827 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.848815 2575 server.go:1019] "Starting client certificate rotation"
Apr 23 17:53:32.848933 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.848916 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 17:53:32.848970 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.848961 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 17:53:32.877464 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.877445 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 17:53:32.882234 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.882209 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 17:53:32.898387 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.898366 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 23 17:53:32.904537 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.904523 2575 log.go:25] "Validated CRI v1 image API"
Apr 23 17:53:32.905781 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.905755 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 17:53:32.908331 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.908313 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c529bb05-7e0e-40f3-ad27-f66f094cc249:/dev/nvme0n1p4 d4cf3ab7-2bd3-4a37-8a44-888fc7734bc2:/dev/nvme0n1p3]
Apr 23 17:53:32.908392 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.908332 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 17:53:32.911994 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.911975 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:53:32.914367 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.914249 2575 manager.go:217] Machine: {Timestamp:2026-04-23 17:53:32.91222819 +0000 UTC m=+0.458203551 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3103826 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29a5fc39f03f56da3312fbf659d760 SystemUUID:ec29a5fc-39f0-3f56-da33-12fbf659d760 BootID:0fa312e6-329c-4b43-b626-58c0e976b94b Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:43:a9:9d:ad:ed Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:43:a9:9d:ad:ed Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0a:38:e3:8d:99:da Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 17:53:32.914367 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.914361 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 17:53:32.914486 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.914443 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 17:53:32.916874 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.916850 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 17:53:32.917048 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.916876 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-131.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 17:53:32.917142 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.917060 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 17:53:32.917142 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.917072 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 17:53:32.917142 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.917089 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 17:53:32.918140 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.918129 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 17:53:32.919614 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.919601 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 17:53:32.919924 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.919912 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 17:53:32.922800 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.922789 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 17:53:32.922857 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.922808 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 17:53:32.922857 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.922824 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 17:53:32.922857 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.922837 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 23 17:53:32.922857 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.922848 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 17:53:32.924014 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.924001 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 17:53:32.924088 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.924024 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 17:53:32.927623 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.927607 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 17:53:32.929013 ip-10-0-143-131
kubenswrapper[2575]: I0423 17:53:32.929000 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 17:53:32.930998 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.930982 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 17:53:32.930998 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.931000 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 17:53:32.931100 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.931006 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 17:53:32.931100 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.931012 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 17:53:32.931100 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.931017 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 17:53:32.931100 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.931023 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 17:53:32.931100 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.931029 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 17:53:32.931100 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.931034 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 17:53:32.931100 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.931041 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 17:53:32.931100 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.931047 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 17:53:32.931100 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.931056 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 
17:53:32.931100 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.931065 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 17:53:32.931980 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.931969 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 17:53:32.931980 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.931980 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 17:53:32.934516 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:32.934476 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-131.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 17:53:32.934516 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:32.934480 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 17:53:32.934700 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.934673 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2l42b" Apr 23 17:53:32.936251 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.936234 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 17:53:32.936333 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.936276 2575 server.go:1295] "Started kubelet" Apr 23 17:53:32.936415 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.936350 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 17:53:32.936450 ip-10-0-143-131 kubenswrapper[2575]: I0423 
17:53:32.936441 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 17:53:32.937029 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.936971 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 17:53:32.937169 ip-10-0-143-131 systemd[1]: Started Kubernetes Kubelet. Apr 23 17:53:32.937690 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.937657 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 17:53:32.941227 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.941203 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-131.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:32.942013 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.941998 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 23 17:53:32.944806 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.944784 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2l42b" Apr 23 17:53:32.945409 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:32.944267 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-131.ec2.internal.18a90de6ceb92521 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-131.ec2.internal,UID:ip-10-0-143-131.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-131.ec2.internal,},FirstTimestamp:2026-04-23 17:53:32.936250657 +0000 UTC 
m=+0.482226018,LastTimestamp:2026-04-23 17:53:32.936250657 +0000 UTC m=+0.482226018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-131.ec2.internal,}" Apr 23 17:53:32.948027 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.948010 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 17:53:32.948593 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.948110 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 17:53:32.948751 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.948733 2575 factory.go:55] Registering systemd factory Apr 23 17:53:32.948812 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.948761 2575 factory.go:223] Registration of the systemd container factory successfully Apr 23 17:53:32.948812 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.948790 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 17:53:32.948812 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.948809 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 17:53:32.948976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.948919 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 17:53:32.948976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.948967 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 23 17:53:32.948976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.948975 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 23 17:53:32.949214 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:32.949194 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-131.ec2.internal\" not found" Apr 23 17:53:32.949314 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.949299 2575 factory.go:153] Registering CRI-O factory Apr 23 17:53:32.949372 
ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.949319 2575 factory.go:223] Registration of the crio container factory successfully Apr 23 17:53:32.949372 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.949367 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 17:53:32.949465 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.949386 2575 factory.go:103] Registering Raw factory Apr 23 17:53:32.949465 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.949403 2575 manager.go:1196] Started watching for new ooms in manager Apr 23 17:53:32.949745 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.949731 2575 manager.go:319] Starting recovery of all containers Apr 23 17:53:32.950137 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:32.950105 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 17:53:32.958335 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.958156 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:53:32.958825 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.958812 2575 manager.go:324] Recovery completed Apr 23 17:53:32.962223 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:32.962197 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-131.ec2.internal\" not found" node="ip-10-0-143-131.ec2.internal" Apr 23 17:53:32.963924 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.963908 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:32.966386 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.966370 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:32.966459 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.966403 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:32.966459 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.966417 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:32.966932 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.966907 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 17:53:32.966932 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.966920 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 17:53:32.967041 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.966946 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 23 17:53:32.969660 
ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.969646 2575 policy_none.go:49] "None policy: Start" Apr 23 17:53:32.969710 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.969665 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 17:53:32.969710 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:32.969675 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 23 17:53:33.014528 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.014508 2575 manager.go:341] "Starting Device Plugin manager" Apr 23 17:53:33.026199 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.014550 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 17:53:33.026199 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.014563 2575 server.go:85] "Starting device plugin registration server" Apr 23 17:53:33.026199 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.014826 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 17:53:33.026199 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.014837 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 17:53:33.026199 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.014930 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 17:53:33.026199 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.015008 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 17:53:33.026199 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.015017 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 17:53:33.026199 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.015667 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 23 17:53:33.026199 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.015705 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-131.ec2.internal\" not found" Apr 23 17:53:33.055079 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.055035 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 17:53:33.056398 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.056377 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 17:53:33.056517 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.056410 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 17:53:33.056517 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.056432 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 23 17:53:33.056517 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.056438 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 17:53:33.056517 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.056473 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 17:53:33.059441 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.059421 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:53:33.115773 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.115691 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:33.116871 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.116850 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:33.116980 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.116895 2575 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:33.116980 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.116907 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:33.116980 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.116930 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.125307 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.125291 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.125403 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.125312 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-131.ec2.internal\": node \"ip-10-0-143-131.ec2.internal\" not found" Apr 23 17:53:33.141895 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.141852 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-131.ec2.internal\" not found" Apr 23 17:53:33.157303 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.157281 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-131.ec2.internal"] Apr 23 17:53:33.157374 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.157363 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:33.158179 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.158159 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:33.158259 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.158191 2575 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:33.158259 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.158205 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:33.159703 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.159691 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:33.159868 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.159855 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.159923 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.159899 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:33.160374 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.160359 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:33.160427 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.160370 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:33.160427 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.160391 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:33.160427 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.160392 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:33.160427 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.160404 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" 
event="NodeHasSufficientPID" Apr 23 17:53:33.160427 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.160412 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:33.161681 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.161667 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.161737 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.161696 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:33.162509 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.162483 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:33.162581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.162521 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:33.162581 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.162537 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:33.181410 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.181385 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-131.ec2.internal\" not found" node="ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.185585 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.185568 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-131.ec2.internal\" not found" node="ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.241972 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.241950 2575 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-143-131.ec2.internal\" not found" Apr 23 17:53:33.250304 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.250282 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7afc4ee7d1562afa8c49a2f94a2daa59-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal\" (UID: \"7afc4ee7d1562afa8c49a2f94a2daa59\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.342183 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.342158 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-131.ec2.internal\" not found" Apr 23 17:53:33.350434 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.350411 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7afc4ee7d1562afa8c49a2f94a2daa59-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal\" (UID: \"7afc4ee7d1562afa8c49a2f94a2daa59\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.350485 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.350442 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/79e815642fa2b5d10caaca9635cb7db2-config\") pod \"kube-apiserver-proxy-ip-10-0-143-131.ec2.internal\" (UID: \"79e815642fa2b5d10caaca9635cb7db2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.350519 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.350503 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7afc4ee7d1562afa8c49a2f94a2daa59-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal\" (UID: \"7afc4ee7d1562afa8c49a2f94a2daa59\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.350584 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.350572 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7afc4ee7d1562afa8c49a2f94a2daa59-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal\" (UID: \"7afc4ee7d1562afa8c49a2f94a2daa59\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.442864 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.442809 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-131.ec2.internal\" not found" Apr 23 17:53:33.451105 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.451090 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7afc4ee7d1562afa8c49a2f94a2daa59-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal\" (UID: \"7afc4ee7d1562afa8c49a2f94a2daa59\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.451151 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.451113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/79e815642fa2b5d10caaca9635cb7db2-config\") pod \"kube-apiserver-proxy-ip-10-0-143-131.ec2.internal\" (UID: \"79e815642fa2b5d10caaca9635cb7db2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.451151 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.451145 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/79e815642fa2b5d10caaca9635cb7db2-config\") pod 
\"kube-apiserver-proxy-ip-10-0-143-131.ec2.internal\" (UID: \"79e815642fa2b5d10caaca9635cb7db2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.451238 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.451192 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7afc4ee7d1562afa8c49a2f94a2daa59-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal\" (UID: \"7afc4ee7d1562afa8c49a2f94a2daa59\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.483276 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.483246 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal" Apr 23 17:53:33.487985 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.487969 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-131.ec2.internal"
Apr 23 17:53:33.542928 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.542874 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-131.ec2.internal\" not found"
Apr 23 17:53:33.643369 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.643329 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-131.ec2.internal\" not found"
Apr 23 17:53:33.743911 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.743819 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-131.ec2.internal\" not found"
Apr 23 17:53:33.844377 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.844341 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-131.ec2.internal\" not found"
Apr 23 17:53:33.848495 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.848481 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 17:53:33.848627 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.848612 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:53:33.848681 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.848663 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:53:33.944938 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:33.944912 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-131.ec2.internal\" not found"
Apr 23 17:53:33.947962 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.947932 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 17:48:32 +0000 UTC" deadline="2027-12-26 14:24:22.69794919 +0000 UTC"
Apr 23 17:53:33.947962 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.947953 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14684h30m48.749998601s"
Apr 23 17:53:33.948240 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.948225 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 17:53:33.960647 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.960627 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:53:33.990059 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.990031 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9b69d"
Apr 23 17:53:33.997687 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:33.997642 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9b69d"
Apr 23 17:53:34.015042 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.015020 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:53:34.045042 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:34.045010 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-131.ec2.internal\" not found"
Apr 23 17:53:34.113190 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:34.113157 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7afc4ee7d1562afa8c49a2f94a2daa59.slice/crio-5afdba4a85f33b43ba0ca5917a9394dcde627fa326ac2167fe3ea2e27a4b0f44 WatchSource:0}: Error finding container 5afdba4a85f33b43ba0ca5917a9394dcde627fa326ac2167fe3ea2e27a4b0f44: Status 404 returned error can't find the container with id 5afdba4a85f33b43ba0ca5917a9394dcde627fa326ac2167fe3ea2e27a4b0f44
Apr 23 17:53:34.113641 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:34.113614 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79e815642fa2b5d10caaca9635cb7db2.slice/crio-2e7d8143fad55589454d6197b6f56bf5f5c695921b0614818afb8d2570d9a40f WatchSource:0}: Error finding container 2e7d8143fad55589454d6197b6f56bf5f5c695921b0614818afb8d2570d9a40f: Status 404 returned error can't find the container with id 2e7d8143fad55589454d6197b6f56bf5f5c695921b0614818afb8d2570d9a40f
Apr 23 17:53:34.117162 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.117146 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:53:34.145537 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:34.145516 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-131.ec2.internal\" not found"
Apr 23 17:53:34.246046 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:34.246020 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-131.ec2.internal\" not found"
Apr 23 17:53:34.346561 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:34.346478 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-131.ec2.internal\" not found"
Apr 23 17:53:34.366217 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.366188 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:53:34.448975 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.448945 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal"
Apr 23 17:53:34.460807 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.460788 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 17:53:34.461871 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.461859 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-131.ec2.internal"
Apr 23 17:53:34.469393 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.469380 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 17:53:34.924971 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.924938 2575 apiserver.go:52] "Watching apiserver"
Apr 23 17:53:34.933766 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.933735 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 17:53:34.935088 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.935063 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-q429h","kube-system/kube-apiserver-proxy-ip-10-0-143-131.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl","openshift-cluster-node-tuning-operator/tuned-vvgm9","openshift-image-registry/node-ca-vnvg4","openshift-multus/multus-additional-cni-plugins-59dv8","openshift-multus/multus-tk5p8","openshift-multus/network-metrics-daemon-6lhps","openshift-dns/node-resolver-svgk4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal","openshift-network-diagnostics/network-check-target-bc6lb","openshift-network-operator/iptables-alerter-n2jj4","openshift-ovn-kubernetes/ovnkube-node-ldzqx"]
Apr 23 17:53:34.937336 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.937308 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.940266 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.939201 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 17:53:34.940266 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.939599 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jl6x2\""
Apr 23 17:53:34.940266 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.939784 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 17:53:34.940266 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.939846 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vvgm9"
Apr 23 17:53:34.940266 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.939987 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 17:53:34.940266 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.940219 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 17:53:34.941419 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.941396 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:53:34.941522 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.941451 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl"
Apr 23 17:53:34.942066 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.941721 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 17:53:34.942066 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.941849 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-bvtlq\""
Apr 23 17:53:34.942738 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.942713 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vnvg4"
Apr 23 17:53:34.943150 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.943106 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 17:53:34.943285 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.943264 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 17:53:34.943348 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.943302 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 17:53:34.943426 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.943407 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8s92q\""
Apr 23 17:53:34.944839 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.944431 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-59dv8"
Apr 23 17:53:34.944839 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.944708 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q429h"
Apr 23 17:53:34.945155 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.945138 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-p7l7k\""
Apr 23 17:53:34.945230 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.945174 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 17:53:34.945230 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.945186 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 17:53:34.945373 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.945365 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 17:53:34.947388 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.947160 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps"
Apr 23 17:53:34.947388 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:34.947236 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e"
Apr 23 17:53:34.947388 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.947341 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 17:53:34.947572 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.947524 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 17:53:34.947572 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.947542 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-l2cdq\""
Apr 23 17:53:34.947758 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.947741 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mszg9\""
Apr 23 17:53:34.947820 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.947793 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 17:53:34.947874 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.947824 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 17:53:34.950155 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.949923 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-svgk4"
Apr 23 17:53:34.950739 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.950720 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:53:34.951341 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.951320 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n2jj4"
Apr 23 17:53:34.951812 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.951775 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-28r6m\""
Apr 23 17:53:34.951947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.951860 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 17:53:34.952048 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.951964 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 17:53:34.953129 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.952963 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:53:34.953129 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.952973 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qg2c5\""
Apr 23 17:53:34.953390 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.953130 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb"
Apr 23 17:53:34.953390 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:34.953187 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c"
Apr 23 17:53:34.953390 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.953225 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:34.953390 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.953251 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 17:53:34.954323 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.954079 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 17:53:34.955024 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.955002 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sqdmk\""
Apr 23 17:53:34.955320 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.955300 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 17:53:34.955406 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.955335 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 17:53:34.955764 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.955579 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 17:53:34.955764 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.955649 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 17:53:34.958344 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.958323 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 17:53:34.959713 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.959696 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 17:53:34.961214 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961193 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9bb70f31-0e60-414a-ac60-8535df8b1ed1-hosts-file\") pod \"node-resolver-svgk4\" (UID: \"9bb70f31-0e60-414a-ac60-8535df8b1ed1\") " pod="openshift-dns/node-resolver-svgk4"
Apr 23 17:53:34.961310 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961256 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e107839a-5af3-4754-a936-a2da378bc464-tmp\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9"
Apr 23 17:53:34.961310 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961290 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bfcf2bd4-6bd8-4072-a1ed-956e23cf9972-konnectivity-ca\") pod \"konnectivity-agent-q429h\" (UID: \"bfcf2bd4-6bd8-4072-a1ed-956e23cf9972\") " pod="kube-system/konnectivity-agent-q429h"
Apr 23 17:53:34.961418 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-etc-kubernetes\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.961418 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961372 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ea1db72a-418a-4e3b-89f7-818e445eed4f-cnibin\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8"
Apr 23 17:53:34.961418 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961409 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsq5j\" (UniqueName: \"kubernetes.io/projected/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-kube-api-access-dsq5j\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl"
Apr 23 17:53:34.961571 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961443 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-kubernetes\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9"
Apr 23 17:53:34.961571 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961478 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-sys\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9"
Apr 23 17:53:34.961571 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961537 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-var-lib-kubelet\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.961717 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961601 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ea1db72a-418a-4e3b-89f7-818e445eed4f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8"
Apr 23 17:53:34.961717 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ea1db72a-418a-4e3b-89f7-818e445eed4f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8"
Apr 23 17:53:34.961848 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlf8g\" (UniqueName: \"kubernetes.io/projected/9bb70f31-0e60-414a-ac60-8535df8b1ed1-kube-api-access-qlf8g\") pod \"node-resolver-svgk4\" (UID: \"9bb70f31-0e60-414a-ac60-8535df8b1ed1\") " pod="openshift-dns/node-resolver-svgk4"
Apr 23 17:53:34.961923 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961896 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-os-release\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.961989 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpf84\" (UniqueName: \"kubernetes.io/projected/33d2ad31-e97b-4649-8936-e50045eda195-kube-api-access-hpf84\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.962041 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.961992 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-run-multus-certs\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.962777 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.962752 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ea1db72a-418a-4e3b-89f7-818e445eed4f-cni-binary-copy\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8"
Apr 23 17:53:34.962858 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.962800 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-etc-selinux\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl"
Apr 23 17:53:34.962958 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.962863 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-sys-fs\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl"
Apr 23 17:53:34.962958 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.962915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-sysconfig\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9"
Apr 23 17:53:34.963068 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.962958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-sysctl-d\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9"
Apr 23 17:53:34.963068 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.962979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-system-cni-dir\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.963068 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963041 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-multus-conf-dir\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.963209 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963085 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ea1db72a-418a-4e3b-89f7-818e445eed4f-system-cni-dir\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8"
Apr 23 17:53:34.963209 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bzgg\" (UniqueName: \"kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg\") pod \"network-check-target-bc6lb\" (UID: \"51e88714-7c29-490d-b1b7-79b96331f10c\") " pod="openshift-network-diagnostics/network-check-target-bc6lb"
Apr 23 17:53:34.963209 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-modprobe-d\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9"
Apr 23 17:53:34.963461 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963223 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-systemd\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9"
Apr 23 17:53:34.963461 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bfcf2bd4-6bd8-4072-a1ed-956e23cf9972-agent-certs\") pod \"konnectivity-agent-q429h\" (UID: \"bfcf2bd4-6bd8-4072-a1ed-956e23cf9972\") " pod="kube-system/konnectivity-agent-q429h"
Apr 23 17:53:34.963461 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963308 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ea1db72a-418a-4e3b-89f7-818e445eed4f-os-release\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8"
Apr 23 17:53:34.963461 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963342 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e107839a-5af3-4754-a936-a2da378bc464-etc-tuned\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9"
Apr 23 17:53:34.963461 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl74w\" (UniqueName: \"kubernetes.io/projected/e107839a-5af3-4754-a936-a2da378bc464-kube-api-access-cl74w\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9"
Apr 23 17:53:34.963461 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963451 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33d2ad31-e97b-4649-8936-e50045eda195-cni-binary-copy\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.963752 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963486 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8h8q\" (UniqueName: \"kubernetes.io/projected/6201ae7f-dbb7-4347-a698-89a65766225e-kube-api-access-r8h8q\") pod \"network-metrics-daemon-6lhps\" (UID: \"6201ae7f-dbb7-4347-a698-89a65766225e\") " pod="openshift-multus/network-metrics-daemon-6lhps"
Apr 23 17:53:34.963752 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-run-k8s-cni-cncf-io\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.963752 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963593 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69d66352-58de-4bf6-88d8-b5603ccbe8af-serviceca\") pod \"node-ca-vnvg4\" (UID: \"69d66352-58de-4bf6-88d8-b5603ccbe8af\") " pod="openshift-image-registry/node-ca-vnvg4"
Apr 23 17:53:34.963752 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-registration-dir\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl"
Apr 23 17:53:34.963950 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-multus-cni-dir\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.963950 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/33d2ad31-e97b-4649-8936-e50045eda195-multus-daemon-config\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.964013 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.963965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9jfg\" (UniqueName: \"kubernetes.io/projected/69d66352-58de-4bf6-88d8-b5603ccbe8af-kube-api-access-r9jfg\") pod \"node-ca-vnvg4\" (UID: \"69d66352-58de-4bf6-88d8-b5603ccbe8af\") " pod="openshift-image-registry/node-ca-vnvg4"
Apr 23 17:53:34.964072 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964033 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-var-lib-cni-bin\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.964156 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964118 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-sysctl-conf\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9"
Apr 23 17:53:34.964156 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964149 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ea1db72a-418a-4e3b-89f7-818e445eed4f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8"
Apr 23 17:53:34.964243 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964186 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs\") pod \"network-metrics-daemon-6lhps\" (UID: \"6201ae7f-dbb7-4347-a698-89a65766225e\") " pod="openshift-multus/network-metrics-daemon-6lhps"
Apr 23 17:53:34.964243 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964216 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-socket-dir\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl"
Apr 23 17:53:34.964243 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964236 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-cnibin\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.964341 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964262 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-multus-socket-dir-parent\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.964341 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-hostroot\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:34.964422 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964350 2575 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srv5c\" (UniqueName: \"kubernetes.io/projected/ea1db72a-418a-4e3b-89f7-818e445eed4f-kube-api-access-srv5c\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:34.964471 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964443 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9bb70f31-0e60-414a-ac60-8535df8b1ed1-tmp-dir\") pod \"node-resolver-svgk4\" (UID: \"9bb70f31-0e60-414a-ac60-8535df8b1ed1\") " pod="openshift-dns/node-resolver-svgk4" Apr 23 17:53:34.964558 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964533 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:34.964654 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964582 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-run\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:34.964707 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-host\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 
17:53:34.964707 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964702 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-run-netns\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:34.964804 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964734 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-var-lib-cni-multus\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:34.964862 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-device-dir\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:34.964936 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964904 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-var-lib-kubelet\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:34.964986 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964937 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-lib-modules\") pod \"tuned-vvgm9\" (UID: 
\"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:34.964986 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.964981 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69d66352-58de-4bf6-88d8-b5603ccbe8af-host\") pod \"node-ca-vnvg4\" (UID: \"69d66352-58de-4bf6-88d8-b5603ccbe8af\") " pod="openshift-image-registry/node-ca-vnvg4" Apr 23 17:53:34.999364 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.998745 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:48:33 +0000 UTC" deadline="2027-12-21 22:09:09.158841516 +0000 UTC" Apr 23 17:53:34.999364 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:34.998777 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14572h15m34.160068216s" Apr 23 17:53:35.050396 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.050356 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 17:53:35.061123 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.061068 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal" event={"ID":"7afc4ee7d1562afa8c49a2f94a2daa59","Type":"ContainerStarted","Data":"5afdba4a85f33b43ba0ca5917a9394dcde627fa326ac2167fe3ea2e27a4b0f44"} Apr 23 17:53:35.062133 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.062105 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-131.ec2.internal" event={"ID":"79e815642fa2b5d10caaca9635cb7db2","Type":"ContainerStarted","Data":"2e7d8143fad55589454d6197b6f56bf5f5c695921b0614818afb8d2570d9a40f"} Apr 23 17:53:35.065465 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065438 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-var-lib-cni-bin\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.065587 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065490 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-sysctl-conf\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.065587 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ea1db72a-418a-4e3b-89f7-818e445eed4f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.065587 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065533 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-var-lib-cni-bin\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.065587 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs\") pod \"network-metrics-daemon-6lhps\" (UID: \"6201ae7f-dbb7-4347-a698-89a65766225e\") " pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:35.065587 ip-10-0-143-131 
kubenswrapper[2575]: I0423 17:53:35.065584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-socket-dir\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:35.065809 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-cnibin\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.065809 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-multus-socket-dir-parent\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.065809 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065665 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/617aedf6-9b98-49f4-9aaa-9499484faf5f-iptables-alerter-script\") pod \"iptables-alerter-n2jj4\" (UID: \"617aedf6-9b98-49f4-9aaa-9499484faf5f\") " pod="openshift-network-operator/iptables-alerter-n2jj4" Apr 23 17:53:35.065809 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065693 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-hostroot\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.065809 
ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065701 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-sysctl-conf\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.065809 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srv5c\" (UniqueName: \"kubernetes.io/projected/ea1db72a-418a-4e3b-89f7-818e445eed4f-kube-api-access-srv5c\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.065809 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9bb70f31-0e60-414a-ac60-8535df8b1ed1-tmp-dir\") pod \"node-resolver-svgk4\" (UID: \"9bb70f31-0e60-414a-ac60-8535df8b1ed1\") " pod="openshift-dns/node-resolver-svgk4" Apr 23 17:53:35.065809 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065759 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:35.065809 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065773 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-run\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " 
pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.065809 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065792 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-host\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.065809 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-cnibin\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065825 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-run-netns\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-var-lib-cni-multus\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-device-dir\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-var-lib-kubelet\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065971 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-slash\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.065997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-run-ovn\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066048 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-lib-modules\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69d66352-58de-4bf6-88d8-b5603ccbe8af-host\") pod \"node-ca-vnvg4\" (UID: 
\"69d66352-58de-4bf6-88d8-b5603ccbe8af\") " pod="openshift-image-registry/node-ca-vnvg4" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066102 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-etc-openvswitch\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066128 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-run-openvswitch\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9bb70f31-0e60-414a-ac60-8535df8b1ed1-hosts-file\") pod \"node-resolver-svgk4\" (UID: \"9bb70f31-0e60-414a-ac60-8535df8b1ed1\") " pod="openshift-dns/node-resolver-svgk4" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e107839a-5af3-4754-a936-a2da378bc464-tmp\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bfcf2bd4-6bd8-4072-a1ed-956e23cf9972-konnectivity-ca\") pod 
\"konnectivity-agent-q429h\" (UID: \"bfcf2bd4-6bd8-4072-a1ed-956e23cf9972\") " pod="kube-system/konnectivity-agent-q429h" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066235 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-etc-kubernetes\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-var-lib-openvswitch\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066294 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ea1db72a-418a-4e3b-89f7-818e445eed4f-cnibin\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.066327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsq5j\" (UniqueName: \"kubernetes.io/projected/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-kube-api-access-dsq5j\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-kubernetes\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066367 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-sys\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-var-lib-kubelet\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066409 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ea1db72a-418a-4e3b-89f7-818e445eed4f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ea1db72a-418a-4e3b-89f7-818e445eed4f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066502 2575 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ea1db72a-418a-4e3b-89f7-818e445eed4f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlf8g\" (UniqueName: \"kubernetes.io/projected/9bb70f31-0e60-414a-ac60-8535df8b1ed1-kube-api-access-qlf8g\") pod \"node-resolver-svgk4\" (UID: \"9bb70f31-0e60-414a-ac60-8535df8b1ed1\") " pod="openshift-dns/node-resolver-svgk4" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066558 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-os-release\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ea1db72a-418a-4e3b-89f7-818e445eed4f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066587 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpf84\" (UniqueName: \"kubernetes.io/projected/33d2ad31-e97b-4649-8936-e50045eda195-kube-api-access-hpf84\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066618 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-run-systemd\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066654 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-run-ovn-kubernetes\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-multus-socket-dir-parent\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066685 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-cni-netd\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.066723 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-hostroot\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.067160 ip-10-0-143-131 
kubenswrapper[2575]: I0423 17:53:35.066830 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-socket-dir\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:35.067160 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:35.066961 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:35.067057 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs podName:6201ae7f-dbb7-4347-a698-89a65766225e nodeName:}" failed. No retries permitted until 2026-04-23 17:53:35.567024013 +0000 UTC m=+3.112999378 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs") pod "network-metrics-daemon-6lhps" (UID: "6201ae7f-dbb7-4347-a698-89a65766225e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.067258 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9bb70f31-0e60-414a-ac60-8535df8b1ed1-tmp-dir\") pod \"node-resolver-svgk4\" (UID: \"9bb70f31-0e60-414a-ac60-8535df8b1ed1\") " pod="openshift-dns/node-resolver-svgk4" Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.067335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: 
\"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.067369 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-run\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.067408 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-host\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.067435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-run-netns\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.067475 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-var-lib-cni-multus\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.067514 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-device-dir\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.067542 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-var-lib-kubelet\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.067667 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-lib-modules\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.067665 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ea1db72a-418a-4e3b-89f7-818e445eed4f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.067726 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69d66352-58de-4bf6-88d8-b5603ccbe8af-host\") pod \"node-ca-vnvg4\" (UID: \"69d66352-58de-4bf6-88d8-b5603ccbe8af\") " pod="openshift-image-registry/node-ca-vnvg4" Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.067760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-etc-kubernetes\") pod \"multus-tk5p8\" (UID: 
\"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.067806 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ea1db72a-418a-4e3b-89f7-818e445eed4f-cnibin\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.067947 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.067817 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-os-release\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.068565 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068076 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-kubernetes\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.068565 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068118 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-sys\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.068565 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068161 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-var-lib-kubelet\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " 
pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.068565 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-run-multus-certs\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.068565 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068206 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9bb70f31-0e60-414a-ac60-8535df8b1ed1-hosts-file\") pod \"node-resolver-svgk4\" (UID: \"9bb70f31-0e60-414a-ac60-8535df8b1ed1\") " pod="openshift-dns/node-resolver-svgk4" Apr 23 17:53:35.068565 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ea1db72a-418a-4e3b-89f7-818e445eed4f-cni-binary-copy\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.068565 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-etc-selinux\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:35.068565 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-kubelet\") pod \"ovnkube-node-ldzqx\" (UID: 
\"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.068565 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068405 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.068915 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068643 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ea1db72a-418a-4e3b-89f7-818e445eed4f-cni-binary-copy\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.068915 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-etc-selinux\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:35.068915 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068777 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-run-multus-certs\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.068915 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068846 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-sys-fs\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:35.068915 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068871 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-sysconfig\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.069111 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068925 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-sysctl-d\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.069111 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-system-cni-dir\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.069111 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.068986 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e61666b2-6500-4374-a876-375fa31848c7-ovnkube-config\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.069111 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e61666b2-6500-4374-a876-375fa31848c7-ovn-node-metrics-cert\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.069111 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e61666b2-6500-4374-a876-375fa31848c7-ovnkube-script-lib\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.069111 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069052 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bfcf2bd4-6bd8-4072-a1ed-956e23cf9972-konnectivity-ca\") pod \"konnectivity-agent-q429h\" (UID: \"bfcf2bd4-6bd8-4072-a1ed-956e23cf9972\") " pod="kube-system/konnectivity-agent-q429h" Apr 23 17:53:35.069111 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-multus-conf-dir\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.069111 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069087 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-multus-conf-dir\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.069111 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069110 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ea1db72a-418a-4e3b-89f7-818e445eed4f-system-cni-dir\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.069414 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069142 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzgg\" (UniqueName: \"kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg\") pod \"network-check-target-bc6lb\" (UID: \"51e88714-7c29-490d-b1b7-79b96331f10c\") " pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:35.069414 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-node-log\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.069414 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069179 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-sysctl-d\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.069414 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069201 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e61666b2-6500-4374-a876-375fa31848c7-env-overrides\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.069414 
ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069219 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpkb8\" (UniqueName: \"kubernetes.io/projected/e61666b2-6500-4374-a876-375fa31848c7-kube-api-access-gpkb8\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.069414 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069227 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-sys-fs\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:35.069414 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069255 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-modprobe-d\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.069414 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069272 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-systemd\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.069414 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069281 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-sysconfig\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " 
pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.069414 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bfcf2bd4-6bd8-4072-a1ed-956e23cf9972-agent-certs\") pod \"konnectivity-agent-q429h\" (UID: \"bfcf2bd4-6bd8-4072-a1ed-956e23cf9972\") " pod="kube-system/konnectivity-agent-q429h" Apr 23 17:53:35.069414 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069319 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-system-cni-dir\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.069414 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069320 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcg6j\" (UniqueName: \"kubernetes.io/projected/617aedf6-9b98-49f4-9aaa-9499484faf5f-kube-api-access-zcg6j\") pod \"iptables-alerter-n2jj4\" (UID: \"617aedf6-9b98-49f4-9aaa-9499484faf5f\") " pod="openshift-network-operator/iptables-alerter-n2jj4" Apr 23 17:53:35.069833 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069511 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ea1db72a-418a-4e3b-89f7-818e445eed4f-system-cni-dir\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.069833 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-systemd\") pod \"tuned-vvgm9\" (UID: 
\"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.069833 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069589 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-systemd-units\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.069833 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069590 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e107839a-5af3-4754-a936-a2da378bc464-etc-modprobe-d\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.069833 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069613 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ea1db72a-418a-4e3b-89f7-818e445eed4f-os-release\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.069833 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e107839a-5af3-4754-a936-a2da378bc464-etc-tuned\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.069833 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069670 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ea1db72a-418a-4e3b-89f7-818e445eed4f-os-release\") pod 
\"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.069833 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069682 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cl74w\" (UniqueName: \"kubernetes.io/projected/e107839a-5af3-4754-a936-a2da378bc464-kube-api-access-cl74w\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.069833 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33d2ad31-e97b-4649-8936-e50045eda195-cni-binary-copy\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.069833 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069737 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/617aedf6-9b98-49f4-9aaa-9499484faf5f-host-slash\") pod \"iptables-alerter-n2jj4\" (UID: \"617aedf6-9b98-49f4-9aaa-9499484faf5f\") " pod="openshift-network-operator/iptables-alerter-n2jj4" Apr 23 17:53:35.069833 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8h8q\" (UniqueName: \"kubernetes.io/projected/6201ae7f-dbb7-4347-a698-89a65766225e-kube-api-access-r8h8q\") pod \"network-metrics-daemon-6lhps\" (UID: \"6201ae7f-dbb7-4347-a698-89a65766225e\") " pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:35.069833 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" 
(UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-run-k8s-cni-cncf-io\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.069833 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069817 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69d66352-58de-4bf6-88d8-b5603ccbe8af-serviceca\") pod \"node-ca-vnvg4\" (UID: \"69d66352-58de-4bf6-88d8-b5603ccbe8af\") " pod="openshift-image-registry/node-ca-vnvg4" Apr 23 17:53:35.070322 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-run-netns\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.070322 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069936 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 17:53:35.070322 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069970 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-log-socket\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.070322 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.069998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-cni-bin\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.070322 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.070031 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-registration-dir\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:35.070322 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.070053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-multus-cni-dir\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.070322 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.070138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/33d2ad31-e97b-4649-8936-e50045eda195-multus-daemon-config\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.070322 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.070167 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9jfg\" (UniqueName: \"kubernetes.io/projected/69d66352-58de-4bf6-88d8-b5603ccbe8af-kube-api-access-r9jfg\") pod \"node-ca-vnvg4\" (UID: \"69d66352-58de-4bf6-88d8-b5603ccbe8af\") " pod="openshift-image-registry/node-ca-vnvg4" Apr 23 17:53:35.070618 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.070412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-multus-cni-dir\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.070618 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.070443 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/33d2ad31-e97b-4649-8936-e50045eda195-host-run-k8s-cni-cncf-io\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.070618 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.070494 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-registration-dir\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:35.070982 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.070950 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/33d2ad31-e97b-4649-8936-e50045eda195-cni-binary-copy\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.071100 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.071074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/33d2ad31-e97b-4649-8936-e50045eda195-multus-daemon-config\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.071304 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.071288 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69d66352-58de-4bf6-88d8-b5603ccbe8af-serviceca\") pod \"node-ca-vnvg4\" (UID: \"69d66352-58de-4bf6-88d8-b5603ccbe8af\") " pod="openshift-image-registry/node-ca-vnvg4" Apr 23 17:53:35.073829 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.073788 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e107839a-5af3-4754-a936-a2da378bc464-etc-tuned\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.073829 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.073822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e107839a-5af3-4754-a936-a2da378bc464-tmp\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.074046 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.074028 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bfcf2bd4-6bd8-4072-a1ed-956e23cf9972-agent-certs\") pod 
\"konnectivity-agent-q429h\" (UID: \"bfcf2bd4-6bd8-4072-a1ed-956e23cf9972\") " pod="kube-system/konnectivity-agent-q429h" Apr 23 17:53:35.085402 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:35.084740 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:53:35.085402 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.084759 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlf8g\" (UniqueName: \"kubernetes.io/projected/9bb70f31-0e60-414a-ac60-8535df8b1ed1-kube-api-access-qlf8g\") pod \"node-resolver-svgk4\" (UID: \"9bb70f31-0e60-414a-ac60-8535df8b1ed1\") " pod="openshift-dns/node-resolver-svgk4" Apr 23 17:53:35.085402 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:35.084777 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:53:35.085402 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:35.084837 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4bzgg for pod openshift-network-diagnostics/network-check-target-bc6lb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:53:35.085402 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.084953 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srv5c\" (UniqueName: \"kubernetes.io/projected/ea1db72a-418a-4e3b-89f7-818e445eed4f-kube-api-access-srv5c\") pod \"multus-additional-cni-plugins-59dv8\" (UID: \"ea1db72a-418a-4e3b-89f7-818e445eed4f\") " pod="openshift-multus/multus-additional-cni-plugins-59dv8" Apr 23 17:53:35.085402 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:35.085081 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg podName:51e88714-7c29-490d-b1b7-79b96331f10c nodeName:}" failed. No retries permitted until 2026-04-23 17:53:35.58505845 +0000 UTC m=+3.131033814 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4bzgg" (UniqueName: "kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg") pod "network-check-target-bc6lb" (UID: "51e88714-7c29-490d-b1b7-79b96331f10c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:53:35.086747 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.086646 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpf84\" (UniqueName: \"kubernetes.io/projected/33d2ad31-e97b-4649-8936-e50045eda195-kube-api-access-hpf84\") pod \"multus-tk5p8\" (UID: \"33d2ad31-e97b-4649-8936-e50045eda195\") " pod="openshift-multus/multus-tk5p8" Apr 23 17:53:35.087177 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.087148 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8h8q\" (UniqueName: \"kubernetes.io/projected/6201ae7f-dbb7-4347-a698-89a65766225e-kube-api-access-r8h8q\") pod \"network-metrics-daemon-6lhps\" (UID: \"6201ae7f-dbb7-4347-a698-89a65766225e\") " pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:35.087271 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.087219 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9jfg\" (UniqueName: \"kubernetes.io/projected/69d66352-58de-4bf6-88d8-b5603ccbe8af-kube-api-access-r9jfg\") pod \"node-ca-vnvg4\" (UID: \"69d66352-58de-4bf6-88d8-b5603ccbe8af\") " pod="openshift-image-registry/node-ca-vnvg4" Apr 23 17:53:35.089223 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.089189 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cl74w\" (UniqueName: \"kubernetes.io/projected/e107839a-5af3-4754-a936-a2da378bc464-kube-api-access-cl74w\") pod \"tuned-vvgm9\" (UID: \"e107839a-5af3-4754-a936-a2da378bc464\") " pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" Apr 23 17:53:35.090060 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.089651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsq5j\" (UniqueName: \"kubernetes.io/projected/8bf24f62-2587-46b1-a36f-1dfa2a5835a0-kube-api-access-dsq5j\") pod \"aws-ebs-csi-driver-node-q27wl\" (UID: \"8bf24f62-2587-46b1-a36f-1dfa2a5835a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" Apr 23 17:53:35.132316 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.132287 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:53:35.171223 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171186 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-var-lib-openvswitch\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171398 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171233 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-run-systemd\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171398 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171262 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-run-ovn-kubernetes\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171398 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-cni-netd\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171398 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-var-lib-openvswitch\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171398 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-kubelet\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171398 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-run-ovn-kubernetes\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171398 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171398 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-run-systemd\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171398 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171383 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171398 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171393 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e61666b2-6500-4374-a876-375fa31848c7-ovnkube-config\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171824 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e61666b2-6500-4374-a876-375fa31848c7-ovn-node-metrics-cert\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171824 ip-10-0-143-131 
kubenswrapper[2575]: I0423 17:53:35.171419 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-cni-netd\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171824 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171421 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-kubelet\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171824 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171433 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e61666b2-6500-4374-a876-375fa31848c7-ovnkube-script-lib\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171824 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171778 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-node-log\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.171824 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171812 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e61666b2-6500-4374-a876-375fa31848c7-env-overrides\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.172135 ip-10-0-143-131 
kubenswrapper[2575]: I0423 17:53:35.171840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpkb8\" (UniqueName: \"kubernetes.io/projected/e61666b2-6500-4374-a876-375fa31848c7-kube-api-access-gpkb8\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.172135 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcg6j\" (UniqueName: \"kubernetes.io/projected/617aedf6-9b98-49f4-9aaa-9499484faf5f-kube-api-access-zcg6j\") pod \"iptables-alerter-n2jj4\" (UID: \"617aedf6-9b98-49f4-9aaa-9499484faf5f\") " pod="openshift-network-operator/iptables-alerter-n2jj4" Apr 23 17:53:35.172135 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171911 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-systemd-units\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.172135 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171894 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-node-log\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.172135 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e61666b2-6500-4374-a876-375fa31848c7-ovnkube-config\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 
17:53:35.172135 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.171930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/617aedf6-9b98-49f4-9aaa-9499484faf5f-host-slash\") pod \"iptables-alerter-n2jj4\" (UID: \"617aedf6-9b98-49f4-9aaa-9499484faf5f\") " pod="openshift-network-operator/iptables-alerter-n2jj4" Apr 23 17:53:35.172135 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-run-netns\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.172135 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-log-socket\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-cni-bin\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e61666b2-6500-4374-a876-375fa31848c7-ovnkube-script-lib\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" 
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/617aedf6-9b98-49f4-9aaa-9499484faf5f-iptables-alerter-script\") pod \"iptables-alerter-n2jj4\" (UID: \"617aedf6-9b98-49f4-9aaa-9499484faf5f\") " pod="openshift-network-operator/iptables-alerter-n2jj4"
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172231 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/617aedf6-9b98-49f4-9aaa-9499484faf5f-host-slash\") pod \"iptables-alerter-n2jj4\" (UID: \"617aedf6-9b98-49f4-9aaa-9499484faf5f\") " pod="openshift-network-operator/iptables-alerter-n2jj4"
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-slash\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172263 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-slash\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-run-ovn\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172293 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-run-netns\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172325 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-log-socket\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-etc-openvswitch\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172339 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-systemd-units\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172350 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-run-openvswitch\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-etc-openvswitch\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-run-ovn\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172382 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-host-cni-bin\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172411 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e61666b2-6500-4374-a876-375fa31848c7-run-openvswitch\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.172573 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172485 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e61666b2-6500-4374-a876-375fa31848c7-env-overrides\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.173290 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.172850 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/617aedf6-9b98-49f4-9aaa-9499484faf5f-iptables-alerter-script\") pod \"iptables-alerter-n2jj4\" (UID: \"617aedf6-9b98-49f4-9aaa-9499484faf5f\") " pod="openshift-network-operator/iptables-alerter-n2jj4"
Apr 23 17:53:35.174044 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.174027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e61666b2-6500-4374-a876-375fa31848c7-ovn-node-metrics-cert\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.184362 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.184311 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcg6j\" (UniqueName: \"kubernetes.io/projected/617aedf6-9b98-49f4-9aaa-9499484faf5f-kube-api-access-zcg6j\") pod \"iptables-alerter-n2jj4\" (UID: \"617aedf6-9b98-49f4-9aaa-9499484faf5f\") " pod="openshift-network-operator/iptables-alerter-n2jj4"
Apr 23 17:53:35.184362 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.184334 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpkb8\" (UniqueName: \"kubernetes.io/projected/e61666b2-6500-4374-a876-375fa31848c7-kube-api-access-gpkb8\") pod \"ovnkube-node-ldzqx\" (UID: \"e61666b2-6500-4374-a876-375fa31848c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.250372 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.250338 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tk5p8"
Apr 23 17:53:35.260271 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.260247 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vvgm9"
Apr 23 17:53:35.268758 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.268738 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl"
Apr 23 17:53:35.275480 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.275456 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vnvg4"
Apr 23 17:53:35.282032 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.282006 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-59dv8"
Apr 23 17:53:35.290764 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.290746 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q429h"
Apr 23 17:53:35.305362 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.305336 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-svgk4"
Apr 23 17:53:35.313924 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.313907 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n2jj4"
Apr 23 17:53:35.321532 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.321509 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:53:35.574913 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.574813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs\") pod \"network-metrics-daemon-6lhps\" (UID: \"6201ae7f-dbb7-4347-a698-89a65766225e\") " pod="openshift-multus/network-metrics-daemon-6lhps"
Apr 23 17:53:35.575081 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:35.574997 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:53:35.575081 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:35.575057 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs podName:6201ae7f-dbb7-4347-a698-89a65766225e nodeName:}" failed. No retries permitted until 2026-04-23 17:53:36.575040488 +0000 UTC m=+4.121015839 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs") pod "network-metrics-daemon-6lhps" (UID: "6201ae7f-dbb7-4347-a698-89a65766225e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:53:35.675976 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.675840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzgg\" (UniqueName: \"kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg\") pod \"network-check-target-bc6lb\" (UID: \"51e88714-7c29-490d-b1b7-79b96331f10c\") " pod="openshift-network-diagnostics/network-check-target-bc6lb"
Apr 23 17:53:35.676158 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:35.675990 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:53:35.676158 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:35.676012 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:53:35.676158 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:35.676026 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4bzgg for pod openshift-network-diagnostics/network-check-target-bc6lb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:53:35.676158 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:35.676085 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg podName:51e88714-7c29-490d-b1b7-79b96331f10c nodeName:}" failed. No retries permitted until 2026-04-23 17:53:36.676066659 +0000 UTC m=+4.222042024 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-4bzgg" (UniqueName: "kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg") pod "network-check-target-bc6lb" (UID: "51e88714-7c29-490d-b1b7-79b96331f10c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:53:35.756698 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:35.756670 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bb70f31_0e60_414a_ac60_8535df8b1ed1.slice/crio-d5dbaad2cf0b0f5603d20b494a678f629689cb798a95c9a896634fde0f45510f WatchSource:0}: Error finding container d5dbaad2cf0b0f5603d20b494a678f629689cb798a95c9a896634fde0f45510f: Status 404 returned error can't find the container with id d5dbaad2cf0b0f5603d20b494a678f629689cb798a95c9a896634fde0f45510f
Apr 23 17:53:35.759128 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:35.759097 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69d66352_58de_4bf6_88d8_b5603ccbe8af.slice/crio-f3dfe7be51516ffaaeb537b74f80cc66eec87919baaa86f70f9a217e644c1cb3 WatchSource:0}: Error finding container f3dfe7be51516ffaaeb537b74f80cc66eec87919baaa86f70f9a217e644c1cb3: Status 404 returned error can't find the container with id f3dfe7be51516ffaaeb537b74f80cc66eec87919baaa86f70f9a217e644c1cb3
Apr 23 17:53:35.760237 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:35.760191 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode61666b2_6500_4374_a876_375fa31848c7.slice/crio-22bbe56ef843b749166dfa9ddcbabfaf069562d8dc2d6ec3765b7ca8f18240bd WatchSource:0}: Error finding container 22bbe56ef843b749166dfa9ddcbabfaf069562d8dc2d6ec3765b7ca8f18240bd: Status 404 returned error can't find the container with id 22bbe56ef843b749166dfa9ddcbabfaf069562d8dc2d6ec3765b7ca8f18240bd
Apr 23 17:53:35.763556 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:35.763530 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod617aedf6_9b98_49f4_9aaa_9499484faf5f.slice/crio-dc74131999d4749539d46d80c588dbe02ecccd6b013b9ad9c1d96f4a25fb5324 WatchSource:0}: Error finding container dc74131999d4749539d46d80c588dbe02ecccd6b013b9ad9c1d96f4a25fb5324: Status 404 returned error can't find the container with id dc74131999d4749539d46d80c588dbe02ecccd6b013b9ad9c1d96f4a25fb5324
Apr 23 17:53:35.765245 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:35.765206 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea1db72a_418a_4e3b_89f7_818e445eed4f.slice/crio-e211428b85d6f68ed859a78c14b9843ef60cbe930ffb331e0f65ddcbe96bc749 WatchSource:0}: Error finding container e211428b85d6f68ed859a78c14b9843ef60cbe930ffb331e0f65ddcbe96bc749: Status 404 returned error can't find the container with id e211428b85d6f68ed859a78c14b9843ef60cbe930ffb331e0f65ddcbe96bc749
Apr 23 17:53:35.766647 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:35.766157 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bf24f62_2587_46b1_a36f_1dfa2a5835a0.slice/crio-07c174a1d99c21c96718c237408b9ceca7cc7e1f7d561efabe557b7465d01ce6 WatchSource:0}: Error finding container 07c174a1d99c21c96718c237408b9ceca7cc7e1f7d561efabe557b7465d01ce6: Status 404 returned error can't find the container with id 07c174a1d99c21c96718c237408b9ceca7cc7e1f7d561efabe557b7465d01ce6
Apr 23 17:53:35.768098 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:35.767920 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfcf2bd4_6bd8_4072_a1ed_956e23cf9972.slice/crio-979314fb203e8b0ed86dd20405438f4c645b531e74bc4b69e2c9fc29d9dc9e93 WatchSource:0}: Error finding container 979314fb203e8b0ed86dd20405438f4c645b531e74bc4b69e2c9fc29d9dc9e93: Status 404 returned error can't find the container with id 979314fb203e8b0ed86dd20405438f4c645b531e74bc4b69e2c9fc29d9dc9e93
Apr 23 17:53:35.769487 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:35.768746 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode107839a_5af3_4754_a936_a2da378bc464.slice/crio-3b0d1b3faa6af45f4fe97a974bd2ce14fc2906522e8fd8ba2db6d6ce77298ae1 WatchSource:0}: Error finding container 3b0d1b3faa6af45f4fe97a974bd2ce14fc2906522e8fd8ba2db6d6ce77298ae1: Status 404 returned error can't find the container with id 3b0d1b3faa6af45f4fe97a974bd2ce14fc2906522e8fd8ba2db6d6ce77298ae1
Apr 23 17:53:35.769655 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:53:35.769613 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33d2ad31_e97b_4649_8936_e50045eda195.slice/crio-34a27c7c2640ae50434e0196db981e7647fa165f9a8016c2d3b18aea3bbbb057 WatchSource:0}: Error finding container 34a27c7c2640ae50434e0196db981e7647fa165f9a8016c2d3b18aea3bbbb057: Status 404 returned error can't find the container with id 34a27c7c2640ae50434e0196db981e7647fa165f9a8016c2d3b18aea3bbbb057
Apr 23 17:53:35.999698 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.999497 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:48:33 +0000 UTC" deadline="2028-01-29 08:29:35.149793084 +0000 UTC"
Apr 23 17:53:35.999698 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:35.999691 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15494h35m59.150104965s"
Apr 23 17:53:36.066901 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:36.066831 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" event={"ID":"8bf24f62-2587-46b1-a36f-1dfa2a5835a0","Type":"ContainerStarted","Data":"07c174a1d99c21c96718c237408b9ceca7cc7e1f7d561efabe557b7465d01ce6"}
Apr 23 17:53:36.068004 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:36.067961 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n2jj4" event={"ID":"617aedf6-9b98-49f4-9aaa-9499484faf5f","Type":"ContainerStarted","Data":"dc74131999d4749539d46d80c588dbe02ecccd6b013b9ad9c1d96f4a25fb5324"}
Apr 23 17:53:36.072009 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:36.071915 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" event={"ID":"e107839a-5af3-4754-a936-a2da378bc464","Type":"ContainerStarted","Data":"3b0d1b3faa6af45f4fe97a974bd2ce14fc2906522e8fd8ba2db6d6ce77298ae1"}
Apr 23 17:53:36.073615 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:36.073514 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" event={"ID":"e61666b2-6500-4374-a876-375fa31848c7","Type":"ContainerStarted","Data":"22bbe56ef843b749166dfa9ddcbabfaf069562d8dc2d6ec3765b7ca8f18240bd"}
Apr 23 17:53:36.075941 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:36.075916 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vnvg4" event={"ID":"69d66352-58de-4bf6-88d8-b5603ccbe8af","Type":"ContainerStarted","Data":"f3dfe7be51516ffaaeb537b74f80cc66eec87919baaa86f70f9a217e644c1cb3"}
Apr 23 17:53:36.078176 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:36.078121 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-svgk4" event={"ID":"9bb70f31-0e60-414a-ac60-8535df8b1ed1","Type":"ContainerStarted","Data":"d5dbaad2cf0b0f5603d20b494a678f629689cb798a95c9a896634fde0f45510f"}
Apr 23 17:53:36.082435 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:36.082406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-131.ec2.internal" event={"ID":"79e815642fa2b5d10caaca9635cb7db2","Type":"ContainerStarted","Data":"c7bed30ee5ee9d787fe12e1532be28f5789900b1c5e2cb4f4b81fe4128cc2008"}
Apr 23 17:53:36.084639 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:36.084600 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tk5p8" event={"ID":"33d2ad31-e97b-4649-8936-e50045eda195","Type":"ContainerStarted","Data":"34a27c7c2640ae50434e0196db981e7647fa165f9a8016c2d3b18aea3bbbb057"}
Apr 23 17:53:36.088036 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:36.088010 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q429h" event={"ID":"bfcf2bd4-6bd8-4072-a1ed-956e23cf9972","Type":"ContainerStarted","Data":"979314fb203e8b0ed86dd20405438f4c645b531e74bc4b69e2c9fc29d9dc9e93"}
Apr 23 17:53:36.091285 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:36.091260 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-59dv8" event={"ID":"ea1db72a-418a-4e3b-89f7-818e445eed4f","Type":"ContainerStarted","Data":"e211428b85d6f68ed859a78c14b9843ef60cbe930ffb331e0f65ddcbe96bc749"}
Apr 23 17:53:36.586590 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:36.586030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs\") pod \"network-metrics-daemon-6lhps\" (UID: \"6201ae7f-dbb7-4347-a698-89a65766225e\") " pod="openshift-multus/network-metrics-daemon-6lhps"
Apr 23 17:53:36.586590 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:36.586178 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:53:36.586590 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:36.586243 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs podName:6201ae7f-dbb7-4347-a698-89a65766225e nodeName:}" failed. No retries permitted until 2026-04-23 17:53:38.586224621 +0000 UTC m=+6.132199986 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs") pod "network-metrics-daemon-6lhps" (UID: "6201ae7f-dbb7-4347-a698-89a65766225e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:53:36.686818 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:36.686781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzgg\" (UniqueName: \"kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg\") pod \"network-check-target-bc6lb\" (UID: \"51e88714-7c29-490d-b1b7-79b96331f10c\") " pod="openshift-network-diagnostics/network-check-target-bc6lb"
Apr 23 17:53:36.687017 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:36.686988 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:53:36.687017 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:36.687010 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:53:36.687017 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:36.687020 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4bzgg for pod openshift-network-diagnostics/network-check-target-bc6lb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:53:36.687178 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:36.687078 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg podName:51e88714-7c29-490d-b1b7-79b96331f10c nodeName:}" failed. No retries permitted until 2026-04-23 17:53:38.687056188 +0000 UTC m=+6.233031534 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4bzgg" (UniqueName: "kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg") pod "network-check-target-bc6lb" (UID: "51e88714-7c29-490d-b1b7-79b96331f10c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:53:37.057391 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:37.057356 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps"
Apr 23 17:53:37.057820 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:37.057538 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e"
Apr 23 17:53:37.058146 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:37.058125 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:37.065794 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:37.062101 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:53:37.101358 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:37.101314 2575 generic.go:358] "Generic (PLEG): container finished" podID="7afc4ee7d1562afa8c49a2f94a2daa59" containerID="a04775e9c78d98b8e0b8c897617a6343ac1cd4f33cd4e5c08031184b268a8e18" exitCode=0 Apr 23 17:53:37.102301 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:37.102272 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal" event={"ID":"7afc4ee7d1562afa8c49a2f94a2daa59","Type":"ContainerDied","Data":"a04775e9c78d98b8e0b8c897617a6343ac1cd4f33cd4e5c08031184b268a8e18"} Apr 23 17:53:37.118992 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:37.118940 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-131.ec2.internal" podStartSLOduration=3.1189163 podStartE2EDuration="3.1189163s" podCreationTimestamp="2026-04-23 17:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:53:36.097580536 +0000 UTC m=+3.643555904" watchObservedRunningTime="2026-04-23 17:53:37.1189163 +0000 UTC m=+4.664891668" Apr 23 17:53:38.120637 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:38.119975 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal" event={"ID":"7afc4ee7d1562afa8c49a2f94a2daa59","Type":"ContainerStarted","Data":"bda1d5a5b9597ef3534b7b132acfa849a346628f2ec6eac4d3e601c51ccc1138"} Apr 23 17:53:38.607530 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:38.606832 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs\") pod \"network-metrics-daemon-6lhps\" (UID: \"6201ae7f-dbb7-4347-a698-89a65766225e\") " pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:38.607530 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:38.607046 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:53:38.607530 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:38.607111 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs podName:6201ae7f-dbb7-4347-a698-89a65766225e nodeName:}" failed. No retries permitted until 2026-04-23 17:53:42.60709093 +0000 UTC m=+10.153066279 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs") pod "network-metrics-daemon-6lhps" (UID: "6201ae7f-dbb7-4347-a698-89a65766225e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:53:38.707763 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:38.707721 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzgg\" (UniqueName: \"kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg\") pod \"network-check-target-bc6lb\" (UID: \"51e88714-7c29-490d-b1b7-79b96331f10c\") " pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:38.707968 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:38.707931 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:53:38.707968 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:38.707952 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:53:38.707968 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:38.707965 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4bzgg for pod openshift-network-diagnostics/network-check-target-bc6lb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:53:38.708126 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:38.708027 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg podName:51e88714-7c29-490d-b1b7-79b96331f10c nodeName:}" failed. 
No retries permitted until 2026-04-23 17:53:42.708006373 +0000 UTC m=+10.253981734 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-4bzgg" (UniqueName: "kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg") pod "network-check-target-bc6lb" (UID: "51e88714-7c29-490d-b1b7-79b96331f10c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:53:39.058172 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:39.058087 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:39.058172 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:39.058088 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:39.058357 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:39.058215 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:53:39.058553 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:39.058524 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:53:41.057279 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:41.057232 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:41.057279 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:41.057267 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:41.057902 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:41.057361 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:53:41.057902 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:41.057513 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:53:42.641090 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:42.640520 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs\") pod \"network-metrics-daemon-6lhps\" (UID: \"6201ae7f-dbb7-4347-a698-89a65766225e\") " pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:42.641090 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:42.640661 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:53:42.641090 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:42.640718 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs podName:6201ae7f-dbb7-4347-a698-89a65766225e nodeName:}" failed. No retries permitted until 2026-04-23 17:53:50.640702087 +0000 UTC m=+18.186677436 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs") pod "network-metrics-daemon-6lhps" (UID: "6201ae7f-dbb7-4347-a698-89a65766225e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:53:42.741284 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:42.741235 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzgg\" (UniqueName: \"kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg\") pod \"network-check-target-bc6lb\" (UID: \"51e88714-7c29-490d-b1b7-79b96331f10c\") " pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:42.741466 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:42.741434 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:53:42.741466 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:42.741458 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:53:42.741569 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:42.741471 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4bzgg for pod openshift-network-diagnostics/network-check-target-bc6lb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:53:42.741569 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:42.741534 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg podName:51e88714-7c29-490d-b1b7-79b96331f10c nodeName:}" failed. 
No retries permitted until 2026-04-23 17:53:50.741515094 +0000 UTC m=+18.287490465 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-4bzgg" (UniqueName: "kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg") pod "network-check-target-bc6lb" (UID: "51e88714-7c29-490d-b1b7-79b96331f10c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:53:43.058125 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:43.057958 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:43.058125 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:43.058057 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:53:43.058731 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:43.058536 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:43.058731 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:43.058688 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:53:45.057716 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:45.057680 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:45.058159 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:45.057693 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:45.058159 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:45.057795 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:53:45.058159 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:45.057919 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:53:47.057126 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:47.057087 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:47.057126 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:47.057126 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:47.057634 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:47.057229 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:53:47.057634 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:47.057356 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:53:49.056672 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:49.056629 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:49.057160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:49.056629 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:49.057160 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:49.056785 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:53:49.057160 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:49.056897 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:53:50.700750 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:50.700711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs\") pod \"network-metrics-daemon-6lhps\" (UID: \"6201ae7f-dbb7-4347-a698-89a65766225e\") " pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:50.701297 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:50.700838 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:53:50.701297 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:50.700921 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs podName:6201ae7f-dbb7-4347-a698-89a65766225e nodeName:}" failed. No retries permitted until 2026-04-23 17:54:06.700899386 +0000 UTC m=+34.246874745 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs") pod "network-metrics-daemon-6lhps" (UID: "6201ae7f-dbb7-4347-a698-89a65766225e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:53:50.801126 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:50.801085 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzgg\" (UniqueName: \"kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg\") pod \"network-check-target-bc6lb\" (UID: \"51e88714-7c29-490d-b1b7-79b96331f10c\") " pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:50.801281 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:50.801258 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:53:50.801281 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:50.801275 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:53:50.801384 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:50.801286 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4bzgg for pod openshift-network-diagnostics/network-check-target-bc6lb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:53:50.801384 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:50.801343 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg podName:51e88714-7c29-490d-b1b7-79b96331f10c nodeName:}" failed. 
No retries permitted until 2026-04-23 17:54:06.801327468 +0000 UTC m=+34.347302813 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-4bzgg" (UniqueName: "kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg") pod "network-check-target-bc6lb" (UID: "51e88714-7c29-490d-b1b7-79b96331f10c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:53:51.057529 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:51.057451 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:51.057658 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:51.057598 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:53:51.057724 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:51.057674 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:51.057802 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:51.057783 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:53:53.058268 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:53.058230 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:53.058727 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:53.058343 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:53:53.058727 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:53.058400 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:53.058727 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:53.058520 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:53:54.153812 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.153357 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" event={"ID":"8bf24f62-2587-46b1-a36f-1dfa2a5835a0","Type":"ContainerStarted","Data":"11bd2f5fd9f729720a4f84184e62b8a0044ee0b7cd9f50207f96667ebc2de23e"} Apr 23 17:53:54.154936 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.154867 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" event={"ID":"e107839a-5af3-4754-a936-a2da378bc464","Type":"ContainerStarted","Data":"e8a26a4cd4ba75087375666c2ad10b762c912de4d7875e0baac7c3dc807cee98"} Apr 23 17:53:54.157945 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.157922 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" event={"ID":"e61666b2-6500-4374-a876-375fa31848c7","Type":"ContainerStarted","Data":"1ab19ed6b7d3ca005eb88659074c2bdef4a8f378301eec97f30cf07dfe561074"} Apr 23 17:53:54.158063 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.157954 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" event={"ID":"e61666b2-6500-4374-a876-375fa31848c7","Type":"ContainerStarted","Data":"7b3a208995eac597454fd55be3164db1a405a700b23dece45b229294739949e7"} Apr 23 17:53:54.158063 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.157968 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" event={"ID":"e61666b2-6500-4374-a876-375fa31848c7","Type":"ContainerStarted","Data":"0b3af73dbc3ca2c39458a13b20f129b32081c3fb91914736f9cec042202fe2da"} Apr 23 17:53:54.158063 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.157979 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" event={"ID":"e61666b2-6500-4374-a876-375fa31848c7","Type":"ContainerStarted","Data":"e65e793122a7aa9ca818a8327aa23a5c4a2acc79c9a2bdd9052619f9d7eac017"} Apr 23 17:53:54.158063 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.157990 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" event={"ID":"e61666b2-6500-4374-a876-375fa31848c7","Type":"ContainerStarted","Data":"de2b7b2afc7f4d647fc80bc5ed86eb9d509a86c3abe658da8270802896b69ea6"} Apr 23 17:53:54.158063 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.158001 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" event={"ID":"e61666b2-6500-4374-a876-375fa31848c7","Type":"ContainerStarted","Data":"506a0a51c7f396ffb0116fc1590336854dce41dc7b3da7fca5f4d5699cf353f0"} Apr 23 17:53:54.159333 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.159313 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vnvg4" event={"ID":"69d66352-58de-4bf6-88d8-b5603ccbe8af","Type":"ContainerStarted","Data":"9e8566057654ac1d47fd90072bd63291e14dcf3d63948af9957cbedb3a6e1135"} Apr 23 17:53:54.160693 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.160667 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-svgk4" event={"ID":"9bb70f31-0e60-414a-ac60-8535df8b1ed1","Type":"ContainerStarted","Data":"c00aa38f79ed9ed624c36615fe7aabc74c177eba7412e504038e86a123517c3b"} Apr 23 17:53:54.161964 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.161938 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tk5p8" event={"ID":"33d2ad31-e97b-4649-8936-e50045eda195","Type":"ContainerStarted","Data":"09c24257ef557b8e67ce7b58470f3a97225a7b4346546c1bd5a3910ec80a16a4"} Apr 23 17:53:54.163369 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.163334 2575 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kube-system/konnectivity-agent-q429h" event={"ID":"bfcf2bd4-6bd8-4072-a1ed-956e23cf9972","Type":"ContainerStarted","Data":"34c876e25c7946eba71fe15ed1d0774430fdd2458e0cd496071d11c521e7d963"} Apr 23 17:53:54.164739 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.164715 2575 generic.go:358] "Generic (PLEG): container finished" podID="ea1db72a-418a-4e3b-89f7-818e445eed4f" containerID="d1b48ea1e685afc1bb408b016d42e9675b7ae5de43ec9bf48021c1cb67781e89" exitCode=0 Apr 23 17:53:54.164832 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.164752 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-59dv8" event={"ID":"ea1db72a-418a-4e3b-89f7-818e445eed4f","Type":"ContainerDied","Data":"d1b48ea1e685afc1bb408b016d42e9675b7ae5de43ec9bf48021c1cb67781e89"} Apr 23 17:53:54.174260 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.174225 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-131.ec2.internal" podStartSLOduration=20.17421262 podStartE2EDuration="20.17421262s" podCreationTimestamp="2026-04-23 17:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:53:38.136048825 +0000 UTC m=+5.682024193" watchObservedRunningTime="2026-04-23 17:53:54.17421262 +0000 UTC m=+21.720187988" Apr 23 17:53:54.174605 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.174573 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vvgm9" podStartSLOduration=3.714120097 podStartE2EDuration="21.174565564s" podCreationTimestamp="2026-04-23 17:53:33 +0000 UTC" firstStartedPulling="2026-04-23 17:53:35.77226635 +0000 UTC m=+3.318241695" lastFinishedPulling="2026-04-23 17:53:53.232711796 +0000 UTC m=+20.778687162" observedRunningTime="2026-04-23 17:53:54.173803539 +0000 UTC 
m=+21.719778906" watchObservedRunningTime="2026-04-23 17:53:54.174565564 +0000 UTC m=+21.720540931" Apr 23 17:53:54.217360 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.217302 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-q429h" podStartSLOduration=3.7568332399999997 podStartE2EDuration="21.217285234s" podCreationTimestamp="2026-04-23 17:53:33 +0000 UTC" firstStartedPulling="2026-04-23 17:53:35.772316572 +0000 UTC m=+3.318291918" lastFinishedPulling="2026-04-23 17:53:53.232768549 +0000 UTC m=+20.778743912" observedRunningTime="2026-04-23 17:53:54.216918404 +0000 UTC m=+21.762893765" watchObservedRunningTime="2026-04-23 17:53:54.217285234 +0000 UTC m=+21.763260594" Apr 23 17:53:54.235392 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.235343 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tk5p8" podStartSLOduration=3.736769897 podStartE2EDuration="21.235329641s" podCreationTimestamp="2026-04-23 17:53:33 +0000 UTC" firstStartedPulling="2026-04-23 17:53:35.773113833 +0000 UTC m=+3.319089178" lastFinishedPulling="2026-04-23 17:53:53.271673561 +0000 UTC m=+20.817648922" observedRunningTime="2026-04-23 17:53:54.235121704 +0000 UTC m=+21.781097072" watchObservedRunningTime="2026-04-23 17:53:54.235329641 +0000 UTC m=+21.781305007" Apr 23 17:53:54.249847 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.249799 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vnvg4" podStartSLOduration=3.797366058 podStartE2EDuration="21.249787201s" podCreationTimestamp="2026-04-23 17:53:33 +0000 UTC" firstStartedPulling="2026-04-23 17:53:35.761806427 +0000 UTC m=+3.307781776" lastFinishedPulling="2026-04-23 17:53:53.214227574 +0000 UTC m=+20.760202919" observedRunningTime="2026-04-23 17:53:54.24966399 +0000 UTC m=+21.795639358" watchObservedRunningTime="2026-04-23 17:53:54.249787201 +0000 UTC 
m=+21.795762569" Apr 23 17:53:54.263027 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.262975 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-svgk4" podStartSLOduration=3.788435702 podStartE2EDuration="21.262958779s" podCreationTimestamp="2026-04-23 17:53:33 +0000 UTC" firstStartedPulling="2026-04-23 17:53:35.75822214 +0000 UTC m=+3.304197486" lastFinishedPulling="2026-04-23 17:53:53.232745218 +0000 UTC m=+20.778720563" observedRunningTime="2026-04-23 17:53:54.26287507 +0000 UTC m=+21.808850449" watchObservedRunningTime="2026-04-23 17:53:54.262958779 +0000 UTC m=+21.808934148" Apr 23 17:53:54.432812 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.432790 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 17:53:54.907334 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.907301 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-q429h" Apr 23 17:53:54.908160 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:54.908134 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-q429h" Apr 23 17:53:55.026956 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:55.026692 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T17:53:54.43280821Z","UUID":"bec47ec7-e11e-4bd6-a59e-5e52ad52e93b","Handler":null,"Name":"","Endpoint":""} Apr 23 17:53:55.028454 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:55.028429 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 17:53:55.028585 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:55.028471 
2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 17:53:55.059145 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:55.059119 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:55.059145 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:55.059141 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:55.059365 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:55.059219 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:53:55.059424 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:55.059374 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:53:55.168757 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:55.168721 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" event={"ID":"8bf24f62-2587-46b1-a36f-1dfa2a5835a0","Type":"ContainerStarted","Data":"66c082bb188835f2601459d0ca01081b9d3cf33969177a78eeec7a9e20ecefd9"} Apr 23 17:53:55.170156 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:55.170127 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n2jj4" event={"ID":"617aedf6-9b98-49f4-9aaa-9499484faf5f","Type":"ContainerStarted","Data":"195a4e0c9359f436e3af11e2495923435ba20e835a1950f5533c3c1111cff0b3"} Apr 23 17:53:56.174645 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:56.174559 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" event={"ID":"8bf24f62-2587-46b1-a36f-1dfa2a5835a0","Type":"ContainerStarted","Data":"c39273eb482a310e91597b51baeafb88f31357cdb2e78369640ac02cf15d770f"} Apr 23 17:53:56.178049 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:56.178023 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" event={"ID":"e61666b2-6500-4374-a876-375fa31848c7","Type":"ContainerStarted","Data":"62ad3069cd81cbd8b35ea8cd93ca84dfe859139abf2d10e377247d95c991aa00"} Apr 23 17:53:56.178170 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:56.178085 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 17:53:56.199212 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:56.199175 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q27wl" podStartSLOduration=3.794453871 podStartE2EDuration="23.199160646s" podCreationTimestamp="2026-04-23 17:53:33 +0000 UTC" 
firstStartedPulling="2026-04-23 17:53:35.769050703 +0000 UTC m=+3.315026059" lastFinishedPulling="2026-04-23 17:53:55.173757476 +0000 UTC m=+22.719732834" observedRunningTime="2026-04-23 17:53:56.19878671 +0000 UTC m=+23.744762077" watchObservedRunningTime="2026-04-23 17:53:56.199160646 +0000 UTC m=+23.745136010" Apr 23 17:53:56.199792 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:56.199759 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-n2jj4" podStartSLOduration=5.732765363 podStartE2EDuration="23.199749469s" podCreationTimestamp="2026-04-23 17:53:33 +0000 UTC" firstStartedPulling="2026-04-23 17:53:35.765752989 +0000 UTC m=+3.311728348" lastFinishedPulling="2026-04-23 17:53:53.232737106 +0000 UTC m=+20.778712454" observedRunningTime="2026-04-23 17:53:55.187124555 +0000 UTC m=+22.733099923" watchObservedRunningTime="2026-04-23 17:53:56.199749469 +0000 UTC m=+23.745724836" Apr 23 17:53:56.359567 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:56.359532 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-q429h" Apr 23 17:53:56.360285 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:56.360243 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-q429h" Apr 23 17:53:57.057456 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:57.057421 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:57.057642 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:57.057421 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:57.057642 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:57.057589 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:53:57.057755 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:57.057729 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:53:59.057735 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:59.057562 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:53:59.058456 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:59.057586 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:53:59.058456 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:59.057815 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:53:59.058456 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:53:59.057935 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:53:59.184097 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:59.184063 2575 generic.go:358] "Generic (PLEG): container finished" podID="ea1db72a-418a-4e3b-89f7-818e445eed4f" containerID="471517d83719c83aa553a1a7d7af2bfd6c173f4d40d7d5f87bef76124e93a923" exitCode=0 Apr 23 17:53:59.184268 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:59.184143 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-59dv8" event={"ID":"ea1db72a-418a-4e3b-89f7-818e445eed4f","Type":"ContainerDied","Data":"471517d83719c83aa553a1a7d7af2bfd6c173f4d40d7d5f87bef76124e93a923"} Apr 23 17:53:59.187488 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:59.187459 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" event={"ID":"e61666b2-6500-4374-a876-375fa31848c7","Type":"ContainerStarted","Data":"e2fa4fc7e471d671ebc4f9f18bbd06a57e02d6630cb6ed7c1d657f65c2e1210c"} Apr 23 17:53:59.187836 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:59.187817 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:59.187836 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:59.187841 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:59.188003 
ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:59.187852 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:59.204293 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:59.204270 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:59.205719 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:59.205698 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" Apr 23 17:53:59.244443 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:53:59.244395 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx" podStartSLOduration=8.702386274 podStartE2EDuration="26.244382207s" podCreationTimestamp="2026-04-23 17:53:33 +0000 UTC" firstStartedPulling="2026-04-23 17:53:35.763240628 +0000 UTC m=+3.309215977" lastFinishedPulling="2026-04-23 17:53:53.305236549 +0000 UTC m=+20.851211910" observedRunningTime="2026-04-23 17:53:59.243955032 +0000 UTC m=+26.789930398" watchObservedRunningTime="2026-04-23 17:53:59.244382207 +0000 UTC m=+26.790357574" Apr 23 17:54:00.437638 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:00.437456 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bc6lb"] Apr 23 17:54:00.438035 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:00.437759 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:54:00.438035 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:00.437938 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:54:00.440382 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:00.440361 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6lhps"] Apr 23 17:54:00.440471 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:00.440457 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:54:00.440548 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:00.440532 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:54:01.193589 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:01.193558 2575 generic.go:358] "Generic (PLEG): container finished" podID="ea1db72a-418a-4e3b-89f7-818e445eed4f" containerID="eedb620f578488739e2b5f12edabbf7c8777ba3e7ce921fe4c2787966a66075e" exitCode=0 Apr 23 17:54:01.193763 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:01.193645 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-59dv8" event={"ID":"ea1db72a-418a-4e3b-89f7-818e445eed4f","Type":"ContainerDied","Data":"eedb620f578488739e2b5f12edabbf7c8777ba3e7ce921fe4c2787966a66075e"} Apr 23 17:54:02.057607 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:02.057573 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:54:02.058012 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:02.057693 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:54:02.058012 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:02.057753 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:54:02.058012 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:02.057860 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:54:03.199561 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:03.199529 2575 generic.go:358] "Generic (PLEG): container finished" podID="ea1db72a-418a-4e3b-89f7-818e445eed4f" containerID="8af404f292d60049c8f1b58961ff71e3ccaa2f0c2629a42793b53a3a2ca1b9f2" exitCode=0 Apr 23 17:54:03.199990 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:03.199585 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-59dv8" event={"ID":"ea1db72a-418a-4e3b-89f7-818e445eed4f","Type":"ContainerDied","Data":"8af404f292d60049c8f1b58961ff71e3ccaa2f0c2629a42793b53a3a2ca1b9f2"} Apr 23 17:54:04.056655 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:04.056622 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:54:04.056853 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:04.056633 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:54:04.056853 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:04.056750 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:54:04.056853 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:04.056844 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:54:06.056999 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:06.056771 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:54:06.057578 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:06.056771 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:54:06.057578 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:06.057111 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6lhps" podUID="6201ae7f-dbb7-4347-a698-89a65766225e" Apr 23 17:54:06.057578 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:06.057177 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bc6lb" podUID="51e88714-7c29-490d-b1b7-79b96331f10c" Apr 23 17:54:06.713702 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:06.713665 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs\") pod \"network-metrics-daemon-6lhps\" (UID: \"6201ae7f-dbb7-4347-a698-89a65766225e\") " pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:54:06.713865 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:06.713812 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:54:06.713925 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:06.713874 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs podName:6201ae7f-dbb7-4347-a698-89a65766225e nodeName:}" failed. No retries permitted until 2026-04-23 17:54:38.713859801 +0000 UTC m=+66.259835145 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs") pod "network-metrics-daemon-6lhps" (UID: "6201ae7f-dbb7-4347-a698-89a65766225e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:54:06.814329 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:06.814293 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzgg\" (UniqueName: \"kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg\") pod \"network-check-target-bc6lb\" (UID: \"51e88714-7c29-490d-b1b7-79b96331f10c\") " pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:54:06.814522 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:06.814479 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:54:06.814522 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:06.814506 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:54:06.814522 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:06.814519 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4bzgg for pod openshift-network-diagnostics/network-check-target-bc6lb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:54:06.814651 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:06.814582 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg podName:51e88714-7c29-490d-b1b7-79b96331f10c nodeName:}" failed. 
No retries permitted until 2026-04-23 17:54:38.81456747 +0000 UTC m=+66.360542815 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-4bzgg" (UniqueName: "kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg") pod "network-check-target-bc6lb" (UID: "51e88714-7c29-490d-b1b7-79b96331f10c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:54:07.281247 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.281213 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-131.ec2.internal" event="NodeReady" Apr 23 17:54:07.281680 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.281366 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 17:54:07.339284 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.339245 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-swk4j"] Apr 23 17:54:07.358839 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.358810 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tld98"] Apr 23 17:54:07.371542 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.371512 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-swk4j"] Apr 23 17:54:07.371542 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.371544 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tld98"] Apr 23 17:54:07.371769 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.371613 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-swk4j" Apr 23 17:54:07.371769 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.371646 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tld98" Apr 23 17:54:07.375095 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.374955 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 17:54:07.375243 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.375134 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 17:54:07.375322 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.375283 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jg9fn\"" Apr 23 17:54:07.375383 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.375345 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 17:54:07.375575 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.375556 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nr8tq\"" Apr 23 17:54:07.375808 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.375788 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 17:54:07.375974 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.375958 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 17:54:07.518544 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.518509 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgww8\" (UniqueName: \"kubernetes.io/projected/a57787a9-765e-4e66-b8bd-e8c50eaf8977-kube-api-access-bgww8\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j" Apr 23 17:54:07.518544 
ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.518545 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgd7q\" (UniqueName: \"kubernetes.io/projected/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-kube-api-access-zgd7q\") pod \"ingress-canary-tld98\" (UID: \"b6a9e60b-54f2-4766-83e3-2d825df0b4f0\") " pod="openshift-ingress-canary/ingress-canary-tld98"
Apr 23 17:54:07.518748 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.518610 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:07.518748 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.518632 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a57787a9-765e-4e66-b8bd-e8c50eaf8977-config-volume\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:07.518748 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.518657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert\") pod \"ingress-canary-tld98\" (UID: \"b6a9e60b-54f2-4766-83e3-2d825df0b4f0\") " pod="openshift-ingress-canary/ingress-canary-tld98"
Apr 23 17:54:07.518748 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.518683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a57787a9-765e-4e66-b8bd-e8c50eaf8977-tmp-dir\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:07.619936 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.619827 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgww8\" (UniqueName: \"kubernetes.io/projected/a57787a9-765e-4e66-b8bd-e8c50eaf8977-kube-api-access-bgww8\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:07.619936 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.619905 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgd7q\" (UniqueName: \"kubernetes.io/projected/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-kube-api-access-zgd7q\") pod \"ingress-canary-tld98\" (UID: \"b6a9e60b-54f2-4766-83e3-2d825df0b4f0\") " pod="openshift-ingress-canary/ingress-canary-tld98"
Apr 23 17:54:07.620153 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.619953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:07.620153 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.619981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a57787a9-765e-4e66-b8bd-e8c50eaf8977-config-volume\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:07.620153 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.620009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert\") pod \"ingress-canary-tld98\" (UID: \"b6a9e60b-54f2-4766-83e3-2d825df0b4f0\") " pod="openshift-ingress-canary/ingress-canary-tld98"
Apr 23 17:54:07.620153 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.620041 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a57787a9-765e-4e66-b8bd-e8c50eaf8977-tmp-dir\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:07.620320 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:07.620249 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:54:07.620357 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:07.620319 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls podName:a57787a9-765e-4e66-b8bd-e8c50eaf8977 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:08.120297118 +0000 UTC m=+35.666272465 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls") pod "dns-default-swk4j" (UID: "a57787a9-765e-4e66-b8bd-e8c50eaf8977") : secret "dns-default-metrics-tls" not found
Apr 23 17:54:07.620407 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:07.620367 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:54:07.620439 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.620420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a57787a9-765e-4e66-b8bd-e8c50eaf8977-tmp-dir\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:07.620471 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:07.620426 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert podName:b6a9e60b-54f2-4766-83e3-2d825df0b4f0 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:08.120408554 +0000 UTC m=+35.666383916 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert") pod "ingress-canary-tld98" (UID: "b6a9e60b-54f2-4766-83e3-2d825df0b4f0") : secret "canary-serving-cert" not found
Apr 23 17:54:07.620676 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.620657 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a57787a9-765e-4e66-b8bd-e8c50eaf8977-config-volume\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:07.639299 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.639268 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgww8\" (UniqueName: \"kubernetes.io/projected/a57787a9-765e-4e66-b8bd-e8c50eaf8977-kube-api-access-bgww8\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:07.639299 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:07.639283 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgd7q\" (UniqueName: \"kubernetes.io/projected/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-kube-api-access-zgd7q\") pod \"ingress-canary-tld98\" (UID: \"b6a9e60b-54f2-4766-83e3-2d825df0b4f0\") " pod="openshift-ingress-canary/ingress-canary-tld98"
Apr 23 17:54:08.057963 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:08.057163 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps"
Apr 23 17:54:08.057963 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:08.057500 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb"
Apr 23 17:54:08.060005 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:08.059981 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 17:54:08.060005 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:08.059986 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 17:54:08.060477 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:08.060458 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p6fds\""
Apr 23 17:54:08.060598 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:08.060578 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x582n\""
Apr 23 17:54:08.060858 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:08.060839 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 17:54:08.125043 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:08.125005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:08.125208 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:08.125072 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert\") pod \"ingress-canary-tld98\" (UID: \"b6a9e60b-54f2-4766-83e3-2d825df0b4f0\") " pod="openshift-ingress-canary/ingress-canary-tld98"
Apr 23 17:54:08.125208 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:08.125145 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:54:08.125208 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:08.125202 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:54:08.125353 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:08.125229 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls podName:a57787a9-765e-4e66-b8bd-e8c50eaf8977 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:09.125207403 +0000 UTC m=+36.671182759 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls") pod "dns-default-swk4j" (UID: "a57787a9-765e-4e66-b8bd-e8c50eaf8977") : secret "dns-default-metrics-tls" not found
Apr 23 17:54:08.125353 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:08.125251 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert podName:b6a9e60b-54f2-4766-83e3-2d825df0b4f0 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:09.125241155 +0000 UTC m=+36.671216506 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert") pod "ingress-canary-tld98" (UID: "b6a9e60b-54f2-4766-83e3-2d825df0b4f0") : secret "canary-serving-cert" not found
Apr 23 17:54:09.133293 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:09.133259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:09.133293 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:09.133301 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert\") pod \"ingress-canary-tld98\" (UID: \"b6a9e60b-54f2-4766-83e3-2d825df0b4f0\") " pod="openshift-ingress-canary/ingress-canary-tld98"
Apr 23 17:54:09.133763 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:09.133403 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:54:09.133763 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:09.133406 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:54:09.133763 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:09.133457 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert podName:b6a9e60b-54f2-4766-83e3-2d825df0b4f0 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:11.133444334 +0000 UTC m=+38.679419679 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert") pod "ingress-canary-tld98" (UID: "b6a9e60b-54f2-4766-83e3-2d825df0b4f0") : secret "canary-serving-cert" not found
Apr 23 17:54:09.133763 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:09.133470 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls podName:a57787a9-765e-4e66-b8bd-e8c50eaf8977 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:11.133463906 +0000 UTC m=+38.679439250 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls") pod "dns-default-swk4j" (UID: "a57787a9-765e-4e66-b8bd-e8c50eaf8977") : secret "dns-default-metrics-tls" not found
Apr 23 17:54:10.216078 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:10.216047 2575 generic.go:358] "Generic (PLEG): container finished" podID="ea1db72a-418a-4e3b-89f7-818e445eed4f" containerID="3cfac1babc11b73a745b81c05b62fa321da732557a0d5a00591f79bd09da51de" exitCode=0
Apr 23 17:54:10.216757 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:10.216107 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-59dv8" event={"ID":"ea1db72a-418a-4e3b-89f7-818e445eed4f","Type":"ContainerDied","Data":"3cfac1babc11b73a745b81c05b62fa321da732557a0d5a00591f79bd09da51de"}
Apr 23 17:54:11.147115 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:11.147077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:11.147115 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:11.147117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert\") pod \"ingress-canary-tld98\" (UID: \"b6a9e60b-54f2-4766-83e3-2d825df0b4f0\") " pod="openshift-ingress-canary/ingress-canary-tld98"
Apr 23 17:54:11.147384 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:11.147211 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:54:11.147384 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:11.147216 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:54:11.147384 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:11.147304 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls podName:a57787a9-765e-4e66-b8bd-e8c50eaf8977 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:15.147286708 +0000 UTC m=+42.693262066 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls") pod "dns-default-swk4j" (UID: "a57787a9-765e-4e66-b8bd-e8c50eaf8977") : secret "dns-default-metrics-tls" not found
Apr 23 17:54:11.147384 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:11.147318 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert podName:b6a9e60b-54f2-4766-83e3-2d825df0b4f0 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:15.147312231 +0000 UTC m=+42.693287577 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert") pod "ingress-canary-tld98" (UID: "b6a9e60b-54f2-4766-83e3-2d825df0b4f0") : secret "canary-serving-cert" not found
Apr 23 17:54:11.221030 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:11.220999 2575 generic.go:358] "Generic (PLEG): container finished" podID="ea1db72a-418a-4e3b-89f7-818e445eed4f" containerID="54bf54a856188185c5cad7c2a4c3a950402763cdb4806cd8865d7236635d85a4" exitCode=0
Apr 23 17:54:11.221383 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:11.221051 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-59dv8" event={"ID":"ea1db72a-418a-4e3b-89f7-818e445eed4f","Type":"ContainerDied","Data":"54bf54a856188185c5cad7c2a4c3a950402763cdb4806cd8865d7236635d85a4"}
Apr 23 17:54:12.227984 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:12.227940 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-59dv8" event={"ID":"ea1db72a-418a-4e3b-89f7-818e445eed4f","Type":"ContainerStarted","Data":"6db118cef01de93c7d18ff5caa84abebc2a1f51ccf7063018a755f46ee31bde6"}
Apr 23 17:54:12.253295 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:12.253235 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-59dv8" podStartSLOduration=5.853679939 podStartE2EDuration="39.253207434s" podCreationTimestamp="2026-04-23 17:53:33 +0000 UTC" firstStartedPulling="2026-04-23 17:53:35.768608747 +0000 UTC m=+3.314584106" lastFinishedPulling="2026-04-23 17:54:09.168136256 +0000 UTC m=+36.714111601" observedRunningTime="2026-04-23 17:54:12.252965259 +0000 UTC m=+39.798940627" watchObservedRunningTime="2026-04-23 17:54:12.253207434 +0000 UTC m=+39.799182799"
Apr 23 17:54:15.171517 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:15.171477 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:15.171517 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:15.171521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert\") pod \"ingress-canary-tld98\" (UID: \"b6a9e60b-54f2-4766-83e3-2d825df0b4f0\") " pod="openshift-ingress-canary/ingress-canary-tld98"
Apr 23 17:54:15.171970 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:15.171661 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:54:15.171970 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:15.171686 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:54:15.171970 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:15.171715 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert podName:b6a9e60b-54f2-4766-83e3-2d825df0b4f0 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:23.171699917 +0000 UTC m=+50.717675261 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert") pod "ingress-canary-tld98" (UID: "b6a9e60b-54f2-4766-83e3-2d825df0b4f0") : secret "canary-serving-cert" not found
Apr 23 17:54:15.171970 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:15.171762 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls podName:a57787a9-765e-4e66-b8bd-e8c50eaf8977 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:23.171743012 +0000 UTC m=+50.717718360 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls") pod "dns-default-swk4j" (UID: "a57787a9-765e-4e66-b8bd-e8c50eaf8977") : secret "dns-default-metrics-tls" not found
Apr 23 17:54:23.223747 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:23.223685 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert\") pod \"ingress-canary-tld98\" (UID: \"b6a9e60b-54f2-4766-83e3-2d825df0b4f0\") " pod="openshift-ingress-canary/ingress-canary-tld98"
Apr 23 17:54:23.224237 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:23.223788 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:54:23.224237 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:23.223870 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:54:23.224237 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:23.223980 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert podName:b6a9e60b-54f2-4766-83e3-2d825df0b4f0 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:39.223962748 +0000 UTC m=+66.769938092 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert") pod "ingress-canary-tld98" (UID: "b6a9e60b-54f2-4766-83e3-2d825df0b4f0") : secret "canary-serving-cert" not found
Apr 23 17:54:23.224237 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:23.223870 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:54:23.224237 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:23.224023 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls podName:a57787a9-765e-4e66-b8bd-e8c50eaf8977 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:39.22401221 +0000 UTC m=+66.769987556 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls") pod "dns-default-swk4j" (UID: "a57787a9-765e-4e66-b8bd-e8c50eaf8977") : secret "dns-default-metrics-tls" not found
Apr 23 17:54:26.858911 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:26.858849 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"]
Apr 23 17:54:26.892247 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:26.892211 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6546c5b589-88n27"]
Apr 23 17:54:26.892400 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:26.892369 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"
Apr 23 17:54:26.894448 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:26.894432 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 23 17:54:26.895028 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:26.895002 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 23 17:54:26.895028 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:26.895011 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 23 17:54:26.895162 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:26.895080 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 23 17:54:26.904322 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:26.904303 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"]
Apr 23 17:54:26.904387 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:26.904326 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6546c5b589-88n27"]
Apr 23 17:54:26.904422 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:26.904394 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6546c5b589-88n27"
Apr 23 17:54:26.906313 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:26.906260 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 23 17:54:26.906313 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:26.906278 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-2zfs5\""
Apr 23 17:54:27.050905 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.050856 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fdcc5930-3a9b-470b-9055-c9e01a08c0b8-tmp\") pod \"klusterlet-addon-workmgr-7d9df5ff9d-bfrgb\" (UID: \"fdcc5930-3a9b-470b-9055-c9e01a08c0b8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"
Apr 23 17:54:27.051101 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.050924 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcm5d\" (UniqueName: \"kubernetes.io/projected/fdcc5930-3a9b-470b-9055-c9e01a08c0b8-kube-api-access-jcm5d\") pod \"klusterlet-addon-workmgr-7d9df5ff9d-bfrgb\" (UID: \"fdcc5930-3a9b-470b-9055-c9e01a08c0b8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"
Apr 23 17:54:27.051101 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.050981 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrb75\" (UniqueName: \"kubernetes.io/projected/4f026ebf-8e71-48ec-a12a-8f17aaa04805-kube-api-access-zrb75\") pod \"managed-serviceaccount-addon-agent-6546c5b589-88n27\" (UID: \"4f026ebf-8e71-48ec-a12a-8f17aaa04805\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6546c5b589-88n27"
Apr 23 17:54:27.051101 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.051011 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/fdcc5930-3a9b-470b-9055-c9e01a08c0b8-klusterlet-config\") pod \"klusterlet-addon-workmgr-7d9df5ff9d-bfrgb\" (UID: \"fdcc5930-3a9b-470b-9055-c9e01a08c0b8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"
Apr 23 17:54:27.051101 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.051035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4f026ebf-8e71-48ec-a12a-8f17aaa04805-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6546c5b589-88n27\" (UID: \"4f026ebf-8e71-48ec-a12a-8f17aaa04805\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6546c5b589-88n27"
Apr 23 17:54:27.152182 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.152094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fdcc5930-3a9b-470b-9055-c9e01a08c0b8-tmp\") pod \"klusterlet-addon-workmgr-7d9df5ff9d-bfrgb\" (UID: \"fdcc5930-3a9b-470b-9055-c9e01a08c0b8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"
Apr 23 17:54:27.152182 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.152131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcm5d\" (UniqueName: \"kubernetes.io/projected/fdcc5930-3a9b-470b-9055-c9e01a08c0b8-kube-api-access-jcm5d\") pod \"klusterlet-addon-workmgr-7d9df5ff9d-bfrgb\" (UID: \"fdcc5930-3a9b-470b-9055-c9e01a08c0b8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"
Apr 23 17:54:27.152182 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.152166 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrb75\" (UniqueName: \"kubernetes.io/projected/4f026ebf-8e71-48ec-a12a-8f17aaa04805-kube-api-access-zrb75\") pod \"managed-serviceaccount-addon-agent-6546c5b589-88n27\" (UID: \"4f026ebf-8e71-48ec-a12a-8f17aaa04805\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6546c5b589-88n27"
Apr 23 17:54:27.152182 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.152185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/fdcc5930-3a9b-470b-9055-c9e01a08c0b8-klusterlet-config\") pod \"klusterlet-addon-workmgr-7d9df5ff9d-bfrgb\" (UID: \"fdcc5930-3a9b-470b-9055-c9e01a08c0b8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"
Apr 23 17:54:27.152494 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.152224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4f026ebf-8e71-48ec-a12a-8f17aaa04805-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6546c5b589-88n27\" (UID: \"4f026ebf-8e71-48ec-a12a-8f17aaa04805\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6546c5b589-88n27"
Apr 23 17:54:27.152591 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.152569 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fdcc5930-3a9b-470b-9055-c9e01a08c0b8-tmp\") pod \"klusterlet-addon-workmgr-7d9df5ff9d-bfrgb\" (UID: \"fdcc5930-3a9b-470b-9055-c9e01a08c0b8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"
Apr 23 17:54:27.155117 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.155090 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/fdcc5930-3a9b-470b-9055-c9e01a08c0b8-klusterlet-config\") pod \"klusterlet-addon-workmgr-7d9df5ff9d-bfrgb\" (UID: \"fdcc5930-3a9b-470b-9055-c9e01a08c0b8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"
Apr 23 17:54:27.155240 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.155218 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4f026ebf-8e71-48ec-a12a-8f17aaa04805-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6546c5b589-88n27\" (UID: \"4f026ebf-8e71-48ec-a12a-8f17aaa04805\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6546c5b589-88n27"
Apr 23 17:54:27.160007 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.159986 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcm5d\" (UniqueName: \"kubernetes.io/projected/fdcc5930-3a9b-470b-9055-c9e01a08c0b8-kube-api-access-jcm5d\") pod \"klusterlet-addon-workmgr-7d9df5ff9d-bfrgb\" (UID: \"fdcc5930-3a9b-470b-9055-c9e01a08c0b8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"
Apr 23 17:54:27.160265 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.160244 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrb75\" (UniqueName: \"kubernetes.io/projected/4f026ebf-8e71-48ec-a12a-8f17aaa04805-kube-api-access-zrb75\") pod \"managed-serviceaccount-addon-agent-6546c5b589-88n27\" (UID: \"4f026ebf-8e71-48ec-a12a-8f17aaa04805\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6546c5b589-88n27"
Apr 23 17:54:27.204316 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.204280 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"
Apr 23 17:54:27.220001 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.219975 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6546c5b589-88n27"
Apr 23 17:54:27.377587 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.377549 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6546c5b589-88n27"]
Apr 23 17:54:27.381801 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:27.381778 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"]
Apr 23 17:54:27.382320 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:54:27.382294 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f026ebf_8e71_48ec_a12a_8f17aaa04805.slice/crio-7ba2cd05c02b5386b941281f09b7c203b6eeb02ab72142994731ed1c5efb9bd4 WatchSource:0}: Error finding container 7ba2cd05c02b5386b941281f09b7c203b6eeb02ab72142994731ed1c5efb9bd4: Status 404 returned error can't find the container with id 7ba2cd05c02b5386b941281f09b7c203b6eeb02ab72142994731ed1c5efb9bd4
Apr 23 17:54:27.385021 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:54:27.384999 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdcc5930_3a9b_470b_9055_c9e01a08c0b8.slice/crio-5432861406ff147bd59d66cca27e87bd37c9a27cbd492384413b77f008d045fc WatchSource:0}: Error finding container 5432861406ff147bd59d66cca27e87bd37c9a27cbd492384413b77f008d045fc: Status 404 returned error can't find the container with id 5432861406ff147bd59d66cca27e87bd37c9a27cbd492384413b77f008d045fc
Apr 23 17:54:28.258302 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:28.258266 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb" event={"ID":"fdcc5930-3a9b-470b-9055-c9e01a08c0b8","Type":"ContainerStarted","Data":"5432861406ff147bd59d66cca27e87bd37c9a27cbd492384413b77f008d045fc"}
Apr 23 17:54:28.259301 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:28.259277 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6546c5b589-88n27" event={"ID":"4f026ebf-8e71-48ec-a12a-8f17aaa04805","Type":"ContainerStarted","Data":"7ba2cd05c02b5386b941281f09b7c203b6eeb02ab72142994731ed1c5efb9bd4"}
Apr 23 17:54:31.213951 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:31.213905 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ldzqx"
Apr 23 17:54:32.270054 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:32.269956 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6546c5b589-88n27" event={"ID":"4f026ebf-8e71-48ec-a12a-8f17aaa04805","Type":"ContainerStarted","Data":"e057a3057738dc0eac551846cc537bb467d85d9366921d1154649573c1423c32"}
Apr 23 17:54:32.271197 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:32.271168 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb" event={"ID":"fdcc5930-3a9b-470b-9055-c9e01a08c0b8","Type":"ContainerStarted","Data":"0674fd12662b1a0c7332fe428514578dd644b6bf32ba85f7f363556a8e12a1b1"}
Apr 23 17:54:32.271410 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:32.271392 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"
Apr 23 17:54:32.272939 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:32.272919 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb"
Apr 23 17:54:32.287895 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:32.287827 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6546c5b589-88n27" podStartSLOduration=1.737849727 podStartE2EDuration="6.287811778s" podCreationTimestamp="2026-04-23 17:54:26 +0000 UTC" firstStartedPulling="2026-04-23 17:54:27.384438242 +0000 UTC m=+54.930413590" lastFinishedPulling="2026-04-23 17:54:31.934400289 +0000 UTC m=+59.480375641" observedRunningTime="2026-04-23 17:54:32.28720349 +0000 UTC m=+59.833178868" watchObservedRunningTime="2026-04-23 17:54:32.287811778 +0000 UTC m=+59.833787146"
Apr 23 17:54:32.303449 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:32.303411 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d9df5ff9d-bfrgb" podStartSLOduration=1.744094064 podStartE2EDuration="6.30339966s" podCreationTimestamp="2026-04-23 17:54:26 +0000 UTC" firstStartedPulling="2026-04-23 17:54:27.386497458 +0000 UTC m=+54.932472807" lastFinishedPulling="2026-04-23 17:54:31.945803057 +0000 UTC m=+59.491778403" observedRunningTime="2026-04-23 17:54:32.303132911 +0000 UTC m=+59.849108278" watchObservedRunningTime="2026-04-23 17:54:32.30339966 +0000 UTC m=+59.849375029"
Apr 23 17:54:38.738763 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:38.738705 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs\") pod \"network-metrics-daemon-6lhps\" (UID: \"6201ae7f-dbb7-4347-a698-89a65766225e\") " pod="openshift-multus/network-metrics-daemon-6lhps"
Apr 23 17:54:38.740954 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:38.740934 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 17:54:38.749831 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:38.749804 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 17:54:38.749949 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:38.749877 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs podName:6201ae7f-dbb7-4347-a698-89a65766225e nodeName:}" failed. No retries permitted until 2026-04-23 17:55:42.749854022 +0000 UTC m=+130.295829383 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs") pod "network-metrics-daemon-6lhps" (UID: "6201ae7f-dbb7-4347-a698-89a65766225e") : secret "metrics-daemon-secret" not found
Apr 23 17:54:38.839131 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:38.839089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzgg\" (UniqueName: \"kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg\") pod \"network-check-target-bc6lb\" (UID: \"51e88714-7c29-490d-b1b7-79b96331f10c\") " pod="openshift-network-diagnostics/network-check-target-bc6lb"
Apr 23 17:54:38.841986 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:38.841963 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 17:54:38.851739 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:38.851720 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 17:54:38.863233 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:38.863154 2575 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"kube-api-access-4bzgg\" (UniqueName: \"kubernetes.io/projected/51e88714-7c29-490d-b1b7-79b96331f10c-kube-api-access-4bzgg\") pod \"network-check-target-bc6lb\" (UID: \"51e88714-7c29-490d-b1b7-79b96331f10c\") " pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:54:38.978063 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:38.978027 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p6fds\"" Apr 23 17:54:38.986051 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:38.986026 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:54:39.117657 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:39.117623 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bc6lb"] Apr 23 17:54:39.121267 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:54:39.121241 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51e88714_7c29_490d_b1b7_79b96331f10c.slice/crio-ead5426af8e9572cf0e6135c1eb39b63d4c10a2a168bfcacb14557c1ce04d8a1 WatchSource:0}: Error finding container ead5426af8e9572cf0e6135c1eb39b63d4c10a2a168bfcacb14557c1ce04d8a1: Status 404 returned error can't find the container with id ead5426af8e9572cf0e6135c1eb39b63d4c10a2a168bfcacb14557c1ce04d8a1 Apr 23 17:54:39.242898 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:39.242840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert\") pod \"ingress-canary-tld98\" (UID: \"b6a9e60b-54f2-4766-83e3-2d825df0b4f0\") " pod="openshift-ingress-canary/ingress-canary-tld98" Apr 23 17:54:39.243095 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:39.242960 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j" Apr 23 17:54:39.243095 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:39.242995 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:54:39.243095 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:39.243068 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:54:39.243223 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:39.243072 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert podName:b6a9e60b-54f2-4766-83e3-2d825df0b4f0 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:11.243057312 +0000 UTC m=+98.789032658 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert") pod "ingress-canary-tld98" (UID: "b6a9e60b-54f2-4766-83e3-2d825df0b4f0") : secret "canary-serving-cert" not found Apr 23 17:54:39.243223 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:39.243128 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls podName:a57787a9-765e-4e66-b8bd-e8c50eaf8977 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:11.243113107 +0000 UTC m=+98.789088452 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls") pod "dns-default-swk4j" (UID: "a57787a9-765e-4e66-b8bd-e8c50eaf8977") : secret "dns-default-metrics-tls" not found Apr 23 17:54:39.285386 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:39.285309 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bc6lb" event={"ID":"51e88714-7c29-490d-b1b7-79b96331f10c","Type":"ContainerStarted","Data":"ead5426af8e9572cf0e6135c1eb39b63d4c10a2a168bfcacb14557c1ce04d8a1"} Apr 23 17:54:42.293004 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:42.292964 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bc6lb" event={"ID":"51e88714-7c29-490d-b1b7-79b96331f10c","Type":"ContainerStarted","Data":"f1d75009cd6272dbedbda171cb5fe140ad6dc08e68f731129f7b9c58f2c2f391"} Apr 23 17:54:42.293448 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:42.293188 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bc6lb" Apr 23 17:54:42.310567 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:42.310517 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bc6lb" podStartSLOduration=66.740380465 podStartE2EDuration="1m9.310501804s" podCreationTimestamp="2026-04-23 17:53:33 +0000 UTC" firstStartedPulling="2026-04-23 17:54:39.123162475 +0000 UTC m=+66.669137820" lastFinishedPulling="2026-04-23 17:54:41.693283813 +0000 UTC m=+69.239259159" observedRunningTime="2026-04-23 17:54:42.309753629 +0000 UTC m=+69.855728995" watchObservedRunningTime="2026-04-23 17:54:42.310501804 +0000 UTC m=+69.856477204" Apr 23 17:54:43.412648 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.412609 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs"] Apr 23 17:54:43.437628 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.437597 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zmmnp"] Apr 23 17:54:43.437790 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.437773 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs" Apr 23 17:54:43.440811 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.440785 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 23 17:54:43.440961 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.440786 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 23 17:54:43.442319 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.442300 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 23 17:54:43.442898 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.442860 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-r9bcf\"" Apr 23 17:54:43.443005 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.442916 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:54:43.450321 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.450301 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-dznpd"] Apr 23 17:54:43.450474 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.450459 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-zmmnp" Apr 23 17:54:43.452443 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.452425 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 17:54:43.452794 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.452776 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 23 17:54:43.453518 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.453496 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 23 17:54:43.453674 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.453545 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 17:54:43.455508 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.455133 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-tp6c2\"" Apr 23 17:54:43.461735 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.461717 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 23 17:54:43.472021 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.471998 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc"] Apr 23 17:54:43.472289 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.472267 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" Apr 23 17:54:43.474943 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.474919 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-5dxtq\"" Apr 23 17:54:43.474943 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.474933 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 23 17:54:43.475088 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.474942 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:54:43.475088 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.474963 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 23 17:54:43.475088 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.474990 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 23 17:54:43.480957 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.480941 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 23 17:54:43.484558 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.484541 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-79996f959d-wvgtd"] Apr 23 17:54:43.484685 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.484670 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc" Apr 23 17:54:43.486764 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.486749 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:54:43.486764 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.486757 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 23 17:54:43.486764 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.486763 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-c555q\"" Apr 23 17:54:43.486977 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.486753 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 23 17:54:43.486977 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.486847 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 23 17:54:43.505945 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.505914 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs"] Apr 23 17:54:43.506063 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.505951 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zmmnp"] Apr 23 17:54:43.506063 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.505966 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-dznpd"] Apr 23 17:54:43.506063 
ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.505979 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc"] Apr 23 17:54:43.506063 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.505989 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-79996f959d-wvgtd"] Apr 23 17:54:43.506063 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.505998 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79996f959d-wvgtd" Apr 23 17:54:43.507985 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.507957 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 23 17:54:43.508096 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.508065 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 23 17:54:43.508096 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.508080 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 23 17:54:43.508368 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.508351 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 23 17:54:43.508463 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.508384 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 23 17:54:43.508708 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.508689 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-nhkvl\"" Apr 23 17:54:43.508808 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.508794 2575 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 23 17:54:43.526036 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.526013 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"] Apr 23 17:54:43.553621 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.553589 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-669c6dbffc-ntp4x"] Apr 23 17:54:43.553744 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.553657 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m" Apr 23 17:54:43.558309 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.558291 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 17:54:43.558617 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.558599 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-gtdmn\"" Apr 23 17:54:43.558708 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.558635 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 17:54:43.559242 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.559228 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 23 17:54:43.560744 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.560726 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 23 17:54:43.572548 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.572521 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"] Apr 23 17:54:43.572668 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.572631 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x" Apr 23 17:54:43.576327 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.576303 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ae4d89-cf03-479c-935d-c2b46bb0082b-config\") pod \"console-operator-9d4b6777b-dznpd\" (UID: \"50ae4d89-cf03-479c-935d-c2b46bb0082b\") " pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" Apr 23 17:54:43.576444 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.576339 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffa5d0d2-4b4f-4472-b414-6aefb709735c-service-ca-bundle\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp" Apr 23 17:54:43.576444 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.576365 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fsvg\" (UniqueName: \"kubernetes.io/projected/50ae4d89-cf03-479c-935d-c2b46bb0082b-kube-api-access-4fsvg\") pod \"console-operator-9d4b6777b-dznpd\" (UID: \"50ae4d89-cf03-479c-935d-c2b46bb0082b\") " pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" Apr 23 17:54:43.576554 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.576471 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ffa5d0d2-4b4f-4472-b414-6aefb709735c-tmp\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: 
\"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp" Apr 23 17:54:43.576554 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.576504 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffa5d0d2-4b4f-4472-b414-6aefb709735c-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp" Apr 23 17:54:43.576554 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.576542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f277ff-c690-4837-8465-5d845f4c966b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-69wxs\" (UID: \"e7f277ff-c690-4837-8465-5d845f4c966b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs" Apr 23 17:54:43.576698 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.576567 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffa5d0d2-4b4f-4472-b414-6aefb709735c-serving-cert\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp" Apr 23 17:54:43.576698 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.576594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50ae4d89-cf03-479c-935d-c2b46bb0082b-trusted-ca\") pod \"console-operator-9d4b6777b-dznpd\" (UID: \"50ae4d89-cf03-479c-935d-c2b46bb0082b\") " pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" Apr 23 17:54:43.576698 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.576622 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrgnd\" (UniqueName: \"kubernetes.io/projected/ffa5d0d2-4b4f-4472-b414-6aefb709735c-kube-api-access-mrgnd\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp" Apr 23 17:54:43.576698 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.576690 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w2bl\" (UniqueName: \"kubernetes.io/projected/e7f277ff-c690-4837-8465-5d845f4c966b-kube-api-access-7w2bl\") pod \"service-ca-operator-d6fc45fc5-69wxs\" (UID: \"e7f277ff-c690-4837-8465-5d845f4c966b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs" Apr 23 17:54:43.576894 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.576728 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50ae4d89-cf03-479c-935d-c2b46bb0082b-serving-cert\") pod \"console-operator-9d4b6777b-dznpd\" (UID: \"50ae4d89-cf03-479c-935d-c2b46bb0082b\") " pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" Apr 23 17:54:43.576894 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.576755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f277ff-c690-4837-8465-5d845f4c966b-config\") pod \"service-ca-operator-d6fc45fc5-69wxs\" (UID: \"e7f277ff-c690-4837-8465-5d845f4c966b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs" Apr 23 17:54:43.576894 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.576778 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/ffa5d0d2-4b4f-4472-b414-6aefb709735c-snapshots\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp" Apr 23 17:54:43.577350 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.577332 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 17:54:43.577547 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.577530 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 17:54:43.578806 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.578785 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ccrfl\"" Apr 23 17:54:43.580002 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.579985 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 17:54:43.585422 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.585397 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 17:54:43.589346 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.589326 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-669c6dbffc-ntp4x"] Apr 23 17:54:43.677385 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677353 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffa5d0d2-4b4f-4472-b414-6aefb709735c-serving-cert\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp" Apr 23 17:54:43.677545 ip-10-0-143-131 
kubenswrapper[2575]: I0423 17:54:43.677392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrgnd\" (UniqueName: \"kubernetes.io/projected/ffa5d0d2-4b4f-4472-b414-6aefb709735c-kube-api-access-mrgnd\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp"
Apr 23 17:54:43.677545 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677418 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"
Apr 23 17:54:43.677545 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677455 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrgwr\" (UniqueName: \"kubernetes.io/projected/cbf092c0-733e-4d6b-a240-9d95ac93022a-kube-api-access-rrgwr\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"
Apr 23 17:54:43.677545 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677478 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4825f9f-c326-4675-901b-635a9bb75ddc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-k5xtc\" (UID: \"c4825f9f-c326-4675-901b-635a9bb75ddc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc"
Apr 23 17:54:43.677545 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677498 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d439c95a-c193-4b86-a761-5d34ccc0e57d-image-registry-private-configuration\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.677545 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f277ff-c690-4837-8465-5d845f4c966b-config\") pod \"service-ca-operator-d6fc45fc5-69wxs\" (UID: \"e7f277ff-c690-4837-8465-5d845f4c966b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs"
Apr 23 17:54:43.677839 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677569 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ae4d89-cf03-479c-935d-c2b46bb0082b-config\") pod \"console-operator-9d4b6777b-dznpd\" (UID: \"50ae4d89-cf03-479c-935d-c2b46bb0082b\") " pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:54:43.677839 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677601 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.677839 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-bound-sa-token\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.677839 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677709 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d439c95a-c193-4b86-a761-5d34ccc0e57d-trusted-ca\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.677839 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677739 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-default-certificate\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:43.677839 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677763 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d439c95a-c193-4b86-a761-5d34ccc0e57d-ca-trust-extracted\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.677839 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677784 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f277ff-c690-4837-8465-5d845f4c966b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-69wxs\" (UID: \"e7f277ff-c690-4837-8465-5d845f4c966b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs"
Apr 23 17:54:43.677839 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677832 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50ae4d89-cf03-479c-935d-c2b46bb0082b-trusted-ca\") pod \"console-operator-9d4b6777b-dznpd\" (UID: \"50ae4d89-cf03-479c-935d-c2b46bb0082b\") " pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:54:43.678275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677904 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7w2bl\" (UniqueName: \"kubernetes.io/projected/e7f277ff-c690-4837-8465-5d845f4c966b-kube-api-access-7w2bl\") pod \"service-ca-operator-d6fc45fc5-69wxs\" (UID: \"e7f277ff-c690-4837-8465-5d845f4c966b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs"
Apr 23 17:54:43.678275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:43.678275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.677967 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffa5d0d2-4b4f-4472-b414-6aefb709735c-service-ca-bundle\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp"
Apr 23 17:54:43.678275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50ae4d89-cf03-479c-935d-c2b46bb0082b-serving-cert\") pod \"console-operator-9d4b6777b-dznpd\" (UID: \"50ae4d89-cf03-479c-935d-c2b46bb0082b\") " pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:54:43.678275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678027 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ffa5d0d2-4b4f-4472-b414-6aefb709735c-snapshots\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp"
Apr 23 17:54:43.678275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cbf092c0-733e-4d6b-a240-9d95ac93022a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"
Apr 23 17:54:43.678275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678086 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c792\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-kube-api-access-2c792\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.678275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678095 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f277ff-c690-4837-8465-5d845f4c966b-config\") pod \"service-ca-operator-d6fc45fc5-69wxs\" (UID: \"e7f277ff-c690-4837-8465-5d845f4c966b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs"
Apr 23 17:54:43.678275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fsvg\" (UniqueName: \"kubernetes.io/projected/50ae4d89-cf03-479c-935d-c2b46bb0082b-kube-api-access-4fsvg\") pod \"console-operator-9d4b6777b-dznpd\" (UID: \"50ae4d89-cf03-479c-935d-c2b46bb0082b\") " pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:54:43.678275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678163 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4825f9f-c326-4675-901b-635a9bb75ddc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-k5xtc\" (UID: \"c4825f9f-c326-4675-901b-635a9bb75ddc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc"
Apr 23 17:54:43.678275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpd8m\" (UniqueName: \"kubernetes.io/projected/9ae6b02c-1ffb-4822-950c-d4782e47731b-kube-api-access-qpd8m\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:43.678275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ffa5d0d2-4b4f-4472-b414-6aefb709735c-tmp\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp"
Apr 23 17:54:43.678275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffa5d0d2-4b4f-4472-b414-6aefb709735c-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp"
Apr 23 17:54:43.678275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678268 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d439c95a-c193-4b86-a761-5d34ccc0e57d-installation-pull-secrets\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.678918 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678298 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgx9g\" (UniqueName: \"kubernetes.io/projected/c4825f9f-c326-4675-901b-635a9bb75ddc-kube-api-access-pgx9g\") pod \"kube-storage-version-migrator-operator-6769c5d45-k5xtc\" (UID: \"c4825f9f-c326-4675-901b-635a9bb75ddc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc"
Apr 23 17:54:43.678918 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678317 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ae4d89-cf03-479c-935d-c2b46bb0082b-config\") pod \"console-operator-9d4b6777b-dznpd\" (UID: \"50ae4d89-cf03-479c-935d-c2b46bb0082b\") " pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:54:43.678918 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678325 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-certificates\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.678918 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50ae4d89-cf03-479c-935d-c2b46bb0082b-trusted-ca\") pod \"console-operator-9d4b6777b-dznpd\" (UID: \"50ae4d89-cf03-479c-935d-c2b46bb0082b\") " pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:54:43.678918 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.678900 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-stats-auth\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:43.679177 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.679018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ffa5d0d2-4b4f-4472-b414-6aefb709735c-tmp\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp"
Apr 23 17:54:43.679177 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.679028 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:43.679373 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.679349 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffa5d0d2-4b4f-4472-b414-6aefb709735c-service-ca-bundle\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp"
Apr 23 17:54:43.679455 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.679437 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ffa5d0d2-4b4f-4472-b414-6aefb709735c-snapshots\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp"
Apr 23 17:54:43.679822 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.679801 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffa5d0d2-4b4f-4472-b414-6aefb709735c-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp"
Apr 23 17:54:43.680450 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.680424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f277ff-c690-4837-8465-5d845f4c966b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-69wxs\" (UID: \"e7f277ff-c690-4837-8465-5d845f4c966b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs"
Apr 23 17:54:43.680538 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.680502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffa5d0d2-4b4f-4472-b414-6aefb709735c-serving-cert\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp"
Apr 23 17:54:43.680583 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.680568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50ae4d89-cf03-479c-935d-c2b46bb0082b-serving-cert\") pod \"console-operator-9d4b6777b-dznpd\" (UID: \"50ae4d89-cf03-479c-935d-c2b46bb0082b\") " pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:54:43.685740 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.685717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrgnd\" (UniqueName: \"kubernetes.io/projected/ffa5d0d2-4b4f-4472-b414-6aefb709735c-kube-api-access-mrgnd\") pod \"insights-operator-585dfdc468-zmmnp\" (UID: \"ffa5d0d2-4b4f-4472-b414-6aefb709735c\") " pod="openshift-insights/insights-operator-585dfdc468-zmmnp"
Apr 23 17:54:43.688556 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.688538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w2bl\" (UniqueName: \"kubernetes.io/projected/e7f277ff-c690-4837-8465-5d845f4c966b-kube-api-access-7w2bl\") pod \"service-ca-operator-d6fc45fc5-69wxs\" (UID: \"e7f277ff-c690-4837-8465-5d845f4c966b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs"
Apr 23 17:54:43.689072 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.689054 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fsvg\" (UniqueName: \"kubernetes.io/projected/50ae4d89-cf03-479c-935d-c2b46bb0082b-kube-api-access-4fsvg\") pod \"console-operator-9d4b6777b-dznpd\" (UID: \"50ae4d89-cf03-479c-935d-c2b46bb0082b\") " pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:54:43.747916 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.747870 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs"
Apr 23 17:54:43.762666 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.762637 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-zmmnp"
Apr 23 17:54:43.779716 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.779691 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cbf092c0-733e-4d6b-a240-9d95ac93022a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"
Apr 23 17:54:43.779851 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.779730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2c792\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-kube-api-access-2c792\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.779925 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.779845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4825f9f-c326-4675-901b-635a9bb75ddc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-k5xtc\" (UID: \"c4825f9f-c326-4675-901b-635a9bb75ddc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc"
Apr 23 17:54:43.779925 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.779910 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpd8m\" (UniqueName: \"kubernetes.io/projected/9ae6b02c-1ffb-4822-950c-d4782e47731b-kube-api-access-qpd8m\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:43.780037 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.779946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d439c95a-c193-4b86-a761-5d34ccc0e57d-installation-pull-secrets\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.780037 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.779977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgx9g\" (UniqueName: \"kubernetes.io/projected/c4825f9f-c326-4675-901b-635a9bb75ddc-kube-api-access-pgx9g\") pod \"kube-storage-version-migrator-operator-6769c5d45-k5xtc\" (UID: \"c4825f9f-c326-4675-901b-635a9bb75ddc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc"
Apr 23 17:54:43.780037 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-certificates\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.780037 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-stats-auth\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:43.780226 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:43.780226 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"
Apr 23 17:54:43.780226 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrgwr\" (UniqueName: \"kubernetes.io/projected/cbf092c0-733e-4d6b-a240-9d95ac93022a-kube-api-access-rrgwr\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"
Apr 23 17:54:43.780226 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4825f9f-c326-4675-901b-635a9bb75ddc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-k5xtc\" (UID: \"c4825f9f-c326-4675-901b-635a9bb75ddc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc"
Apr 23 17:54:43.780226 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d439c95a-c193-4b86-a761-5d34ccc0e57d-image-registry-private-configuration\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.780478 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.780478 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-bound-sa-token\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.780478 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d439c95a-c193-4b86-a761-5d34ccc0e57d-trusted-ca\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.780478 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-default-certificate\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:43.780478 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780384 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d439c95a-c193-4b86-a761-5d34ccc0e57d-ca-trust-extracted\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.780478 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:43.780428 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 17:54:43.780759 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:43.780507 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs podName:9ae6b02c-1ffb-4822-950c-d4782e47731b nodeName:}" failed. No retries permitted until 2026-04-23 17:54:44.280483547 +0000 UTC m=+71.826458892 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs") pod "router-default-79996f959d-wvgtd" (UID: "9ae6b02c-1ffb-4822-950c-d4782e47731b") : secret "router-metrics-certs-default" not found
Apr 23 17:54:43.780759 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:43.780554 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle podName:9ae6b02c-1ffb-4822-950c-d4782e47731b nodeName:}" failed. No retries permitted until 2026-04-23 17:54:44.280535822 +0000 UTC m=+71.826511178 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle") pod "router-default-79996f959d-wvgtd" (UID: "9ae6b02c-1ffb-4822-950c-d4782e47731b") : configmap references non-existent config key: service-ca.crt
Apr 23 17:54:43.780759 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780436 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:43.780937 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4825f9f-c326-4675-901b-635a9bb75ddc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-k5xtc\" (UID: \"c4825f9f-c326-4675-901b-635a9bb75ddc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc"
Apr 23 17:54:43.780937 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.780920 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cbf092c0-733e-4d6b-a240-9d95ac93022a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"
Apr 23 17:54:43.781041 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:43.780953 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 17:54:43.781041 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:43.781014 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls podName:cbf092c0-733e-4d6b-a240-9d95ac93022a nodeName:}" failed. No retries permitted until 2026-04-23 17:54:44.280998382 +0000 UTC m=+71.826973763 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qrm2m" (UID: "cbf092c0-733e-4d6b-a240-9d95ac93022a") : secret "cluster-monitoring-operator-tls" not found
Apr 23 17:54:43.781436 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.781405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-certificates\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.782701 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.781860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d439c95a-c193-4b86-a761-5d34ccc0e57d-ca-trust-extracted\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.782701 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:43.782175 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:54:43.782701 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:43.782190 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-669c6dbffc-ntp4x: secret "image-registry-tls" not found
Apr 23 17:54:43.782701 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:43.782264 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls podName:d439c95a-c193-4b86-a761-5d34ccc0e57d nodeName:}" failed. No retries permitted until 2026-04-23 17:54:44.2822474 +0000 UTC m=+71.828222745 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls") pod "image-registry-669c6dbffc-ntp4x" (UID: "d439c95a-c193-4b86-a761-5d34ccc0e57d") : secret "image-registry-tls" not found
Apr 23 17:54:43.782701 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.782364 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:54:43.782701 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.782435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d439c95a-c193-4b86-a761-5d34ccc0e57d-trusted-ca\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.784077 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.784052 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4825f9f-c326-4675-901b-635a9bb75ddc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-k5xtc\" (UID: \"c4825f9f-c326-4675-901b-635a9bb75ddc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc"
Apr 23 17:54:43.784362 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.784299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-default-certificate\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:43.784644 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.784619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d439c95a-c193-4b86-a761-5d34ccc0e57d-installation-pull-secrets\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.785218 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.785196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d439c95a-c193-4b86-a761-5d34ccc0e57d-image-registry-private-configuration\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.792085 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.791582 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c792\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-kube-api-access-2c792\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.793710 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.793677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-bound-sa-token\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:43.795256 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.795229 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgx9g\" (UniqueName: \"kubernetes.io/projected/c4825f9f-c326-4675-901b-635a9bb75ddc-kube-api-access-pgx9g\") pod \"kube-storage-version-migrator-operator-6769c5d45-k5xtc\" (UID: \"c4825f9f-c326-4675-901b-635a9bb75ddc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc"
Apr 23 17:54:43.795453 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.795436 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrgwr\" (UniqueName: \"kubernetes.io/projected/cbf092c0-733e-4d6b-a240-9d95ac93022a-kube-api-access-rrgwr\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"
Apr 23 17:54:43.795850 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.795835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-stats-auth\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:43.798119 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.798097 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpd8m\" (UniqueName: \"kubernetes.io/projected/9ae6b02c-1ffb-4822-950c-d4782e47731b-kube-api-access-qpd8m\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:43.889548 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.889497 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs"]
Apr 
23 17:54:43.893296 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:54:43.893268 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7f277ff_c690_4837_8465_5d845f4c966b.slice/crio-48ec35178a3dc104200290c11ba3099f7e40fd6f9df4dfeff49215af9a93f8fb WatchSource:0}: Error finding container 48ec35178a3dc104200290c11ba3099f7e40fd6f9df4dfeff49215af9a93f8fb: Status 404 returned error can't find the container with id 48ec35178a3dc104200290c11ba3099f7e40fd6f9df4dfeff49215af9a93f8fb Apr 23 17:54:43.904194 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.904158 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zmmnp"] Apr 23 17:54:43.908671 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:54:43.908643 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffa5d0d2_4b4f_4472_b414_6aefb709735c.slice/crio-311e953d5d903c8d0aa9274be818f8870937a8862539a4beac89507222569680 WatchSource:0}: Error finding container 311e953d5d903c8d0aa9274be818f8870937a8862539a4beac89507222569680: Status 404 returned error can't find the container with id 311e953d5d903c8d0aa9274be818f8870937a8862539a4beac89507222569680 Apr 23 17:54:43.934546 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:43.934488 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-dznpd"] Apr 23 17:54:43.937436 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:54:43.937405 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50ae4d89_cf03_479c_935d_c2b46bb0082b.slice/crio-3628b1b7eb655d477efa0d541927eee8cbc57c2c80b2b62c59abc59d795d0363 WatchSource:0}: Error finding container 3628b1b7eb655d477efa0d541927eee8cbc57c2c80b2b62c59abc59d795d0363: Status 404 returned error can't find the container 
with id 3628b1b7eb655d477efa0d541927eee8cbc57c2c80b2b62c59abc59d795d0363 Apr 23 17:54:44.093433 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:44.093399 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc" Apr 23 17:54:44.218194 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:44.217997 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc"] Apr 23 17:54:44.220489 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:54:44.220451 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4825f9f_c326_4675_901b_635a9bb75ddc.slice/crio-c4acf33cbf3987c3b3d88ba1c2b141085f5381cce7f126f8aa89a430dea844aa WatchSource:0}: Error finding container c4acf33cbf3987c3b3d88ba1c2b141085f5381cce7f126f8aa89a430dea844aa: Status 404 returned error can't find the container with id c4acf33cbf3987c3b3d88ba1c2b141085f5381cce7f126f8aa89a430dea844aa Apr 23 17:54:44.284849 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:44.284820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd" Apr 23 17:54:44.285038 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:44.284857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m" Apr 23 17:54:44.285038 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:44.284984 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 17:54:44.285038 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:44.284991 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 17:54:44.285038 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:44.285006 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x" Apr 23 17:54:44.285233 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:44.285043 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls podName:cbf092c0-733e-4d6b-a240-9d95ac93022a nodeName:}" failed. No retries permitted until 2026-04-23 17:54:45.285024613 +0000 UTC m=+72.830999961 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qrm2m" (UID: "cbf092c0-733e-4d6b-a240-9d95ac93022a") : secret "cluster-monitoring-operator-tls" not found Apr 23 17:54:44.285233 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:44.285076 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs podName:9ae6b02c-1ffb-4822-950c-d4782e47731b nodeName:}" failed. 
No retries permitted until 2026-04-23 17:54:45.285064571 +0000 UTC m=+72.831039931 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs") pod "router-default-79996f959d-wvgtd" (UID: "9ae6b02c-1ffb-4822-950c-d4782e47731b") : secret "router-metrics-certs-default" not found Apr 23 17:54:44.285233 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:44.285080 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:54:44.285233 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:44.285092 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-669c6dbffc-ntp4x: secret "image-registry-tls" not found Apr 23 17:54:44.285233 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:44.285131 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls podName:d439c95a-c193-4b86-a761-5d34ccc0e57d nodeName:}" failed. No retries permitted until 2026-04-23 17:54:45.285120267 +0000 UTC m=+72.831095616 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls") pod "image-registry-669c6dbffc-ntp4x" (UID: "d439c95a-c193-4b86-a761-5d34ccc0e57d") : secret "image-registry-tls" not found Apr 23 17:54:44.285233 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:44.285156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd" Apr 23 17:54:44.285511 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:44.285241 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle podName:9ae6b02c-1ffb-4822-950c-d4782e47731b nodeName:}" failed. No retries permitted until 2026-04-23 17:54:45.285233644 +0000 UTC m=+72.831208988 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle") pod "router-default-79996f959d-wvgtd" (UID: "9ae6b02c-1ffb-4822-950c-d4782e47731b") : configmap references non-existent config key: service-ca.crt Apr 23 17:54:44.297655 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:44.297623 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc" event={"ID":"c4825f9f-c326-4675-901b-635a9bb75ddc","Type":"ContainerStarted","Data":"c4acf33cbf3987c3b3d88ba1c2b141085f5381cce7f126f8aa89a430dea844aa"} Apr 23 17:54:44.298566 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:44.298544 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" event={"ID":"50ae4d89-cf03-479c-935d-c2b46bb0082b","Type":"ContainerStarted","Data":"3628b1b7eb655d477efa0d541927eee8cbc57c2c80b2b62c59abc59d795d0363"} Apr 23 17:54:44.299601 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:44.299582 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zmmnp" event={"ID":"ffa5d0d2-4b4f-4472-b414-6aefb709735c","Type":"ContainerStarted","Data":"311e953d5d903c8d0aa9274be818f8870937a8862539a4beac89507222569680"} Apr 23 17:54:44.300552 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:44.300534 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs" event={"ID":"e7f277ff-c690-4837-8465-5d845f4c966b","Type":"ContainerStarted","Data":"48ec35178a3dc104200290c11ba3099f7e40fd6f9df4dfeff49215af9a93f8fb"} Apr 23 17:54:45.295147 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:45.295093 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x" Apr 23 17:54:45.295737 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:45.295196 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd" Apr 23 17:54:45.295737 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:45.295262 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd" Apr 23 17:54:45.295737 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:45.295296 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m" Apr 23 17:54:45.295737 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:45.295450 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 17:54:45.295737 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:45.295519 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls podName:cbf092c0-733e-4d6b-a240-9d95ac93022a nodeName:}" failed. No retries permitted until 2026-04-23 17:54:47.295499963 +0000 UTC m=+74.841475329 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qrm2m" (UID: "cbf092c0-733e-4d6b-a240-9d95ac93022a") : secret "cluster-monitoring-operator-tls" not found Apr 23 17:54:45.296054 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:45.295957 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:54:45.296054 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:45.295975 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-669c6dbffc-ntp4x: secret "image-registry-tls" not found Apr 23 17:54:45.296054 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:45.296024 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls podName:d439c95a-c193-4b86-a761-5d34ccc0e57d nodeName:}" failed. No retries permitted until 2026-04-23 17:54:47.296007148 +0000 UTC m=+74.841982511 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls") pod "image-registry-669c6dbffc-ntp4x" (UID: "d439c95a-c193-4b86-a761-5d34ccc0e57d") : secret "image-registry-tls" not found Apr 23 17:54:45.296214 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:45.296095 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle podName:9ae6b02c-1ffb-4822-950c-d4782e47731b nodeName:}" failed. No retries permitted until 2026-04-23 17:54:47.296084882 +0000 UTC m=+74.842060230 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle") pod "router-default-79996f959d-wvgtd" (UID: "9ae6b02c-1ffb-4822-950c-d4782e47731b") : configmap references non-existent config key: service-ca.crt Apr 23 17:54:45.296214 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:45.296153 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 17:54:45.296214 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:45.296190 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs podName:9ae6b02c-1ffb-4822-950c-d4782e47731b nodeName:}" failed. No retries permitted until 2026-04-23 17:54:47.296178456 +0000 UTC m=+74.842153806 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs") pod "router-default-79996f959d-wvgtd" (UID: "9ae6b02c-1ffb-4822-950c-d4782e47731b") : secret "router-metrics-certs-default" not found Apr 23 17:54:47.311833 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:47.311787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd" Apr 23 17:54:47.312306 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:47.311863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd" Apr 23 17:54:47.312306 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:47.311907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m" Apr 23 17:54:47.312306 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:47.311958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " 
pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x" Apr 23 17:54:47.312306 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:47.312004 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle podName:9ae6b02c-1ffb-4822-950c-d4782e47731b nodeName:}" failed. No retries permitted until 2026-04-23 17:54:51.31198089 +0000 UTC m=+78.857956248 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle") pod "router-default-79996f959d-wvgtd" (UID: "9ae6b02c-1ffb-4822-950c-d4782e47731b") : configmap references non-existent config key: service-ca.crt Apr 23 17:54:47.312306 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:47.312057 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 17:54:47.312306 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:47.312109 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs podName:9ae6b02c-1ffb-4822-950c-d4782e47731b nodeName:}" failed. No retries permitted until 2026-04-23 17:54:51.312095074 +0000 UTC m=+78.858070423 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs") pod "router-default-79996f959d-wvgtd" (UID: "9ae6b02c-1ffb-4822-950c-d4782e47731b") : secret "router-metrics-certs-default" not found Apr 23 17:54:47.312306 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:47.312064 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:54:47.312306 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:47.312128 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 17:54:47.312306 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:47.312131 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-669c6dbffc-ntp4x: secret "image-registry-tls" not found Apr 23 17:54:47.312306 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:47.312160 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls podName:d439c95a-c193-4b86-a761-5d34ccc0e57d nodeName:}" failed. No retries permitted until 2026-04-23 17:54:51.312153394 +0000 UTC m=+78.858128742 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls") pod "image-registry-669c6dbffc-ntp4x" (UID: "d439c95a-c193-4b86-a761-5d34ccc0e57d") : secret "image-registry-tls" not found Apr 23 17:54:47.312306 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:47.312177 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls podName:cbf092c0-733e-4d6b-a240-9d95ac93022a nodeName:}" failed. 
No retries permitted until 2026-04-23 17:54:51.31216431 +0000 UTC m=+78.858139655 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qrm2m" (UID: "cbf092c0-733e-4d6b-a240-9d95ac93022a") : secret "cluster-monitoring-operator-tls" not found Apr 23 17:54:48.312225 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.312193 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/0.log" Apr 23 17:54:48.312749 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.312247 2575 generic.go:358] "Generic (PLEG): container finished" podID="50ae4d89-cf03-479c-935d-c2b46bb0082b" containerID="31fafab30a278d5bcfb630e1fd084e31eeaa38f078bf198c378bd93cb657464a" exitCode=255 Apr 23 17:54:48.312749 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.312342 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" event={"ID":"50ae4d89-cf03-479c-935d-c2b46bb0082b","Type":"ContainerDied","Data":"31fafab30a278d5bcfb630e1fd084e31eeaa38f078bf198c378bd93cb657464a"} Apr 23 17:54:48.312936 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.312919 2575 scope.go:117] "RemoveContainer" containerID="31fafab30a278d5bcfb630e1fd084e31eeaa38f078bf198c378bd93cb657464a" Apr 23 17:54:48.314072 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.314023 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zmmnp" event={"ID":"ffa5d0d2-4b4f-4472-b414-6aefb709735c","Type":"ContainerStarted","Data":"8ef962fc19d5d548ce3d46afb727213815d89516597e09072244b331a60422ba"} Apr 23 17:54:48.315656 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.315615 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs" event={"ID":"e7f277ff-c690-4837-8465-5d845f4c966b","Type":"ContainerStarted","Data":"69f0125e720380d608b09b18ab8eec2a2d64b98eb08fa2be011b332d003cdba1"} Apr 23 17:54:48.317072 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.317032 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc" event={"ID":"c4825f9f-c326-4675-901b-635a9bb75ddc","Type":"ContainerStarted","Data":"7f72b457ebcd9be371a2eefe5ff8ed9fcec84dfea8cea5d9ac19806dfb21cd01"} Apr 23 17:54:48.361342 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.361283 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs" podStartSLOduration=1.823619232 podStartE2EDuration="5.361261876s" podCreationTimestamp="2026-04-23 17:54:43 +0000 UTC" firstStartedPulling="2026-04-23 17:54:43.895361425 +0000 UTC m=+71.441336784" lastFinishedPulling="2026-04-23 17:54:47.433004066 +0000 UTC m=+74.978979428" observedRunningTime="2026-04-23 17:54:48.360209516 +0000 UTC m=+75.906184882" watchObservedRunningTime="2026-04-23 17:54:48.361261876 +0000 UTC m=+75.907237244" Apr 23 17:54:48.382588 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.382535 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-zmmnp" podStartSLOduration=1.861025303 podStartE2EDuration="5.382517284s" podCreationTimestamp="2026-04-23 17:54:43 +0000 UTC" firstStartedPulling="2026-04-23 17:54:43.911101595 +0000 UTC m=+71.457076946" lastFinishedPulling="2026-04-23 17:54:47.432593579 +0000 UTC m=+74.978568927" observedRunningTime="2026-04-23 17:54:48.381281947 +0000 UTC m=+75.927257316" watchObservedRunningTime="2026-04-23 17:54:48.382517284 +0000 UTC m=+75.928492653" Apr 23 17:54:48.408520 
ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.408462 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc" podStartSLOduration=2.19233106 podStartE2EDuration="5.408411179s" podCreationTimestamp="2026-04-23 17:54:43 +0000 UTC" firstStartedPulling="2026-04-23 17:54:44.222265418 +0000 UTC m=+71.768240764" lastFinishedPulling="2026-04-23 17:54:47.438345525 +0000 UTC m=+74.984320883" observedRunningTime="2026-04-23 17:54:48.406770981 +0000 UTC m=+75.952746359" watchObservedRunningTime="2026-04-23 17:54:48.408411179 +0000 UTC m=+75.954386554" Apr 23 17:54:48.548271 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.548233 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-6wwq7"] Apr 23 17:54:48.551221 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.551204 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6wwq7" Apr 23 17:54:48.553443 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.553420 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 23 17:54:48.553520 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.553425 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-zbkts\"" Apr 23 17:54:48.553761 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.553745 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 23 17:54:48.562732 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.562678 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-6wwq7"] Apr 23 17:54:48.622626 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.622589 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqnlp\" (UniqueName: \"kubernetes.io/projected/600a1b41-9d90-4293-9c00-69bd81f7363a-kube-api-access-kqnlp\") pod \"migrator-74bb7799d9-6wwq7\" (UID: \"600a1b41-9d90-4293-9c00-69bd81f7363a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6wwq7" Apr 23 17:54:48.723977 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.723944 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqnlp\" (UniqueName: \"kubernetes.io/projected/600a1b41-9d90-4293-9c00-69bd81f7363a-kube-api-access-kqnlp\") pod \"migrator-74bb7799d9-6wwq7\" (UID: \"600a1b41-9d90-4293-9c00-69bd81f7363a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6wwq7" Apr 23 17:54:48.732827 ip-10-0-143-131 
kubenswrapper[2575]: I0423 17:54:48.732798 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqnlp\" (UniqueName: \"kubernetes.io/projected/600a1b41-9d90-4293-9c00-69bd81f7363a-kube-api-access-kqnlp\") pod \"migrator-74bb7799d9-6wwq7\" (UID: \"600a1b41-9d90-4293-9c00-69bd81f7363a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6wwq7"
Apr 23 17:54:48.860289 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.860200 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6wwq7"
Apr 23 17:54:48.988982 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:48.988949 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-6wwq7"]
Apr 23 17:54:48.993165 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:54:48.993137 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod600a1b41_9d90_4293_9c00_69bd81f7363a.slice/crio-df0707c550d8bba12b952131e0ea3dc6155d365377d0566dcf6cc003a663ed42 WatchSource:0}: Error finding container df0707c550d8bba12b952131e0ea3dc6155d365377d0566dcf6cc003a663ed42: Status 404 returned error can't find the container with id df0707c550d8bba12b952131e0ea3dc6155d365377d0566dcf6cc003a663ed42
Apr 23 17:54:49.321325 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:49.321293 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/1.log"
Apr 23 17:54:49.321743 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:49.321728 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/0.log"
Apr 23 17:54:49.321785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:49.321763 2575 generic.go:358] "Generic (PLEG): container finished" podID="50ae4d89-cf03-479c-935d-c2b46bb0082b" containerID="e4576cc80dbb5c4758a0389a67d0d44a929820d397d0c351bb6b364de6825d27" exitCode=255
Apr 23 17:54:49.321863 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:49.321841 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" event={"ID":"50ae4d89-cf03-479c-935d-c2b46bb0082b","Type":"ContainerDied","Data":"e4576cc80dbb5c4758a0389a67d0d44a929820d397d0c351bb6b364de6825d27"}
Apr 23 17:54:49.322014 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:49.321905 2575 scope.go:117] "RemoveContainer" containerID="31fafab30a278d5bcfb630e1fd084e31eeaa38f078bf198c378bd93cb657464a"
Apr 23 17:54:49.322172 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:49.322157 2575 scope.go:117] "RemoveContainer" containerID="e4576cc80dbb5c4758a0389a67d0d44a929820d397d0c351bb6b364de6825d27"
Apr 23 17:54:49.322341 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:49.322325 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dznpd_openshift-console-operator(50ae4d89-cf03-479c-935d-c2b46bb0082b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" podUID="50ae4d89-cf03-479c-935d-c2b46bb0082b"
Apr 23 17:54:49.323444 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:49.323416 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6wwq7" event={"ID":"600a1b41-9d90-4293-9c00-69bd81f7363a","Type":"ContainerStarted","Data":"df0707c550d8bba12b952131e0ea3dc6155d365377d0566dcf6cc003a663ed42"}
Apr 23 17:54:49.944666 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:49.944638 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-svgk4_9bb70f31-0e60-414a-ac60-8535df8b1ed1/dns-node-resolver/0.log"
Apr 23 17:54:50.327683 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:50.327647 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6wwq7" event={"ID":"600a1b41-9d90-4293-9c00-69bd81f7363a","Type":"ContainerStarted","Data":"71521a025a5361209b2dd71703f652bbd3338882d4c939b9132ef1d3510f40da"}
Apr 23 17:54:50.328169 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:50.327694 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6wwq7" event={"ID":"600a1b41-9d90-4293-9c00-69bd81f7363a","Type":"ContainerStarted","Data":"6a112bfef5ea8b391747a3a6d87d3cca47bc7e75db99c1953cb75139b96b15b0"}
Apr 23 17:54:50.329028 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:50.329013 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/1.log"
Apr 23 17:54:50.329334 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:50.329318 2575 scope.go:117] "RemoveContainer" containerID="e4576cc80dbb5c4758a0389a67d0d44a929820d397d0c351bb6b364de6825d27"
Apr 23 17:54:50.329511 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:50.329491 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dznpd_openshift-console-operator(50ae4d89-cf03-479c-935d-c2b46bb0082b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" podUID="50ae4d89-cf03-479c-935d-c2b46bb0082b"
Apr 23 17:54:50.391106 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:50.391053 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6wwq7" podStartSLOduration=1.254472535 podStartE2EDuration="2.391037101s" podCreationTimestamp="2026-04-23 17:54:48 +0000 UTC" firstStartedPulling="2026-04-23 17:54:48.995138889 +0000 UTC m=+76.541114235" lastFinishedPulling="2026-04-23 17:54:50.131703445 +0000 UTC m=+77.677678801" observedRunningTime="2026-04-23 17:54:50.36435113 +0000 UTC m=+77.910326497" watchObservedRunningTime="2026-04-23 17:54:50.391037101 +0000 UTC m=+77.937012467"
Apr 23 17:54:50.744115 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:50.744088 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vnvg4_69d66352-58de-4bf6-88d8-b5603ccbe8af/node-ca/0.log"
Apr 23 17:54:51.345640 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:51.345604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:51.346016 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:51.345664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:51.346016 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:51.345709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:51.346016 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:51.345728 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"
Apr 23 17:54:51.346016 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:51.345744 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:54:51.346016 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:51.345764 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-669c6dbffc-ntp4x: secret "image-registry-tls" not found
Apr 23 17:54:51.346016 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:51.345811 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls podName:d439c95a-c193-4b86-a761-5d34ccc0e57d nodeName:}" failed. No retries permitted until 2026-04-23 17:54:59.345795567 +0000 UTC m=+86.891770913 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls") pod "image-registry-669c6dbffc-ntp4x" (UID: "d439c95a-c193-4b86-a761-5d34ccc0e57d") : secret "image-registry-tls" not found
Apr 23 17:54:51.346016 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:51.345847 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle podName:9ae6b02c-1ffb-4822-950c-d4782e47731b nodeName:}" failed. No retries permitted until 2026-04-23 17:54:59.345830744 +0000 UTC m=+86.891806113 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle") pod "router-default-79996f959d-wvgtd" (UID: "9ae6b02c-1ffb-4822-950c-d4782e47731b") : configmap references non-existent config key: service-ca.crt
Apr 23 17:54:51.346016 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:51.345853 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 17:54:51.346016 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:51.345865 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 17:54:51.346016 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:51.345926 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls podName:cbf092c0-733e-4d6b-a240-9d95ac93022a nodeName:}" failed. No retries permitted until 2026-04-23 17:54:59.345913594 +0000 UTC m=+86.891888953 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qrm2m" (UID: "cbf092c0-733e-4d6b-a240-9d95ac93022a") : secret "cluster-monitoring-operator-tls" not found
Apr 23 17:54:51.346016 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:51.345941 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs podName:9ae6b02c-1ffb-4822-950c-d4782e47731b nodeName:}" failed. No retries permitted until 2026-04-23 17:54:59.345934785 +0000 UTC m=+86.891910134 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs") pod "router-default-79996f959d-wvgtd" (UID: "9ae6b02c-1ffb-4822-950c-d4782e47731b") : secret "router-metrics-certs-default" not found
Apr 23 17:54:53.782997 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:53.782960 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:54:53.782997 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:53.783006 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:54:53.783388 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:53.783355 2575 scope.go:117] "RemoveContainer" containerID="e4576cc80dbb5c4758a0389a67d0d44a929820d397d0c351bb6b364de6825d27"
Apr 23 17:54:53.783535 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:53.783517 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dznpd_openshift-console-operator(50ae4d89-cf03-479c-935d-c2b46bb0082b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" podUID="50ae4d89-cf03-479c-935d-c2b46bb0082b"
Apr 23 17:54:59.414715 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:59.414664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:59.414715 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:59.414723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"
Apr 23 17:54:59.415210 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:59.414758 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:59.415210 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:59.414789 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 17:54:59.415210 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:59.414804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:54:59.415210 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:59.414865 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs podName:9ae6b02c-1ffb-4822-950c-d4782e47731b nodeName:}" failed. No retries permitted until 2026-04-23 17:55:15.414849303 +0000 UTC m=+102.960824652 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs") pod "router-default-79996f959d-wvgtd" (UID: "9ae6b02c-1ffb-4822-950c-d4782e47731b") : secret "router-metrics-certs-default" not found
Apr 23 17:54:59.415210 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:59.414917 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 17:54:59.415210 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:59.414956 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle podName:9ae6b02c-1ffb-4822-950c-d4782e47731b nodeName:}" failed. No retries permitted until 2026-04-23 17:55:15.414933618 +0000 UTC m=+102.960908986 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle") pod "router-default-79996f959d-wvgtd" (UID: "9ae6b02c-1ffb-4822-950c-d4782e47731b") : configmap references non-existent config key: service-ca.crt
Apr 23 17:54:59.415210 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:54:59.414986 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls podName:cbf092c0-733e-4d6b-a240-9d95ac93022a nodeName:}" failed. No retries permitted until 2026-04-23 17:55:15.414974497 +0000 UTC m=+102.960949842 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qrm2m" (UID: "cbf092c0-733e-4d6b-a240-9d95ac93022a") : secret "cluster-monitoring-operator-tls" not found
Apr 23 17:54:59.417386 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:59.417368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls\") pod \"image-registry-669c6dbffc-ntp4x\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:59.482480 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:59.482340 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:54:59.622282 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:54:59.622246 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-669c6dbffc-ntp4x"]
Apr 23 17:54:59.626725 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:54:59.626687 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd439c95a_c193_4b86_a761_5d34ccc0e57d.slice/crio-7ea3230e5212acd692dbdaade9e7db0fe3dbfb3d2389220d4ab2e68cdc3d68d6 WatchSource:0}: Error finding container 7ea3230e5212acd692dbdaade9e7db0fe3dbfb3d2389220d4ab2e68cdc3d68d6: Status 404 returned error can't find the container with id 7ea3230e5212acd692dbdaade9e7db0fe3dbfb3d2389220d4ab2e68cdc3d68d6
Apr 23 17:55:00.355306 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:00.355269 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x" event={"ID":"d439c95a-c193-4b86-a761-5d34ccc0e57d","Type":"ContainerStarted","Data":"706c9a57720e9cbd9a60914ae55b8c1c51723197c01153957b40e4cdba9d74f0"}
Apr 23 17:55:00.355306 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:00.355311 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x" event={"ID":"d439c95a-c193-4b86-a761-5d34ccc0e57d","Type":"ContainerStarted","Data":"7ea3230e5212acd692dbdaade9e7db0fe3dbfb3d2389220d4ab2e68cdc3d68d6"}
Apr 23 17:55:00.355509 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:00.355342 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:55:00.377333 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:00.377272 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x" podStartSLOduration=17.377251419 podStartE2EDuration="17.377251419s" podCreationTimestamp="2026-04-23 17:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:55:00.377251662 +0000 UTC m=+87.923227041" watchObservedRunningTime="2026-04-23 17:55:00.377251419 +0000 UTC m=+87.923226790"
Apr 23 17:55:08.057658 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:08.057627 2575 scope.go:117] "RemoveContainer" containerID="e4576cc80dbb5c4758a0389a67d0d44a929820d397d0c351bb6b364de6825d27"
Apr 23 17:55:08.375607 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:08.375532 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log"
Apr 23 17:55:08.375918 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:08.375902 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/1.log"
Apr 23 17:55:08.375974 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:08.375937 2575 generic.go:358] "Generic (PLEG): container finished" podID="50ae4d89-cf03-479c-935d-c2b46bb0082b" containerID="bddb19761c761e79e1cf3aca7e9b75ff4ff4e482b2492117772446f5b21d1e94" exitCode=255
Apr 23 17:55:08.376026 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:08.375972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" event={"ID":"50ae4d89-cf03-479c-935d-c2b46bb0082b","Type":"ContainerDied","Data":"bddb19761c761e79e1cf3aca7e9b75ff4ff4e482b2492117772446f5b21d1e94"}
Apr 23 17:55:08.376026 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:08.376000 2575 scope.go:117] "RemoveContainer" containerID="e4576cc80dbb5c4758a0389a67d0d44a929820d397d0c351bb6b364de6825d27"
Apr 23 17:55:08.376379 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:08.376353 2575 scope.go:117] "RemoveContainer" containerID="bddb19761c761e79e1cf3aca7e9b75ff4ff4e482b2492117772446f5b21d1e94"
Apr 23 17:55:08.376622 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:55:08.376589 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-dznpd_openshift-console-operator(50ae4d89-cf03-479c-935d-c2b46bb0082b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" podUID="50ae4d89-cf03-479c-935d-c2b46bb0082b"
Apr 23 17:55:09.380159 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:09.380130 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log"
Apr 23 17:55:11.317693 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:11.317655 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert\") pod \"ingress-canary-tld98\" (UID: \"b6a9e60b-54f2-4766-83e3-2d825df0b4f0\") " pod="openshift-ingress-canary/ingress-canary-tld98"
Apr 23 17:55:11.318204 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:11.317754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:55:11.320190 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:11.320169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a57787a9-765e-4e66-b8bd-e8c50eaf8977-metrics-tls\") pod \"dns-default-swk4j\" (UID: \"a57787a9-765e-4e66-b8bd-e8c50eaf8977\") " pod="openshift-dns/dns-default-swk4j"
Apr 23 17:55:11.320335 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:11.320314 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6a9e60b-54f2-4766-83e3-2d825df0b4f0-cert\") pod \"ingress-canary-tld98\" (UID: \"b6a9e60b-54f2-4766-83e3-2d825df0b4f0\") " pod="openshift-ingress-canary/ingress-canary-tld98"
Apr 23 17:55:11.587325 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:11.587236 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nr8tq\""
Apr 23 17:55:11.594956 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:11.594926 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jg9fn\""
Apr 23 17:55:11.595088 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:11.594997 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-swk4j"
Apr 23 17:55:11.603746 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:11.603718 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tld98"
Apr 23 17:55:11.722553 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:11.722521 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-swk4j"]
Apr 23 17:55:11.725955 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:55:11.725928 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda57787a9_765e_4e66_b8bd_e8c50eaf8977.slice/crio-f7f50fa1add53483dde15fd852c6aa1b524fab9e70cc731c191a206b19f454ae WatchSource:0}: Error finding container f7f50fa1add53483dde15fd852c6aa1b524fab9e70cc731c191a206b19f454ae: Status 404 returned error can't find the container with id f7f50fa1add53483dde15fd852c6aa1b524fab9e70cc731c191a206b19f454ae
Apr 23 17:55:11.743867 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:11.743842 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tld98"]
Apr 23 17:55:11.746501 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:55:11.746472 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6a9e60b_54f2_4766_83e3_2d825df0b4f0.slice/crio-7e150727f8e20dbb5f3afb136b84c994af2b4326cc3444beb82a29e1bb89888c WatchSource:0}: Error finding container 7e150727f8e20dbb5f3afb136b84c994af2b4326cc3444beb82a29e1bb89888c: Status 404 returned error can't find the container with id 7e150727f8e20dbb5f3afb136b84c994af2b4326cc3444beb82a29e1bb89888c
Apr 23 17:55:12.391751 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:12.391713 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tld98" event={"ID":"b6a9e60b-54f2-4766-83e3-2d825df0b4f0","Type":"ContainerStarted","Data":"7e150727f8e20dbb5f3afb136b84c994af2b4326cc3444beb82a29e1bb89888c"}
Apr 23 17:55:12.393288 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:12.393249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-swk4j" event={"ID":"a57787a9-765e-4e66-b8bd-e8c50eaf8977","Type":"ContainerStarted","Data":"f7f50fa1add53483dde15fd852c6aa1b524fab9e70cc731c191a206b19f454ae"}
Apr 23 17:55:13.297643 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:13.297612 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bc6lb"
Apr 23 17:55:13.783927 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:13.782797 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:55:13.783927 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:13.783193 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:55:13.783927 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:13.783271 2575 scope.go:117] "RemoveContainer" containerID="bddb19761c761e79e1cf3aca7e9b75ff4ff4e482b2492117772446f5b21d1e94"
Apr 23 17:55:13.783927 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:55:13.783492 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-dznpd_openshift-console-operator(50ae4d89-cf03-479c-935d-c2b46bb0082b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" podUID="50ae4d89-cf03-479c-935d-c2b46bb0082b"
Apr 23 17:55:14.400576 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:14.400533 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-swk4j" event={"ID":"a57787a9-765e-4e66-b8bd-e8c50eaf8977","Type":"ContainerStarted","Data":"770017c5e9882dd4dbe1491169ae8abfacede7a369c50f89dbd163f8b5c678cd"}
Apr 23 17:55:14.400576 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:14.400586 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-swk4j" event={"ID":"a57787a9-765e-4e66-b8bd-e8c50eaf8977","Type":"ContainerStarted","Data":"a82cd5d06243060d5f9174f5ff92573a556485dcc5d50a03171d5bddb082b485"}
Apr 23 17:55:14.400825 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:14.400640 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-swk4j"
Apr 23 17:55:14.401841 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:14.401815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tld98" event={"ID":"b6a9e60b-54f2-4766-83e3-2d825df0b4f0","Type":"ContainerStarted","Data":"e6597b20f1de5d9ae21c91e345b29a0a40939f956a8118f1336da4b097c339d7"}
Apr 23 17:55:14.402087 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:14.402071 2575 scope.go:117] "RemoveContainer" containerID="bddb19761c761e79e1cf3aca7e9b75ff4ff4e482b2492117772446f5b21d1e94"
Apr 23 17:55:14.402253 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:55:14.402234 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-dznpd_openshift-console-operator(50ae4d89-cf03-479c-935d-c2b46bb0082b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" podUID="50ae4d89-cf03-479c-935d-c2b46bb0082b"
Apr 23 17:55:14.418333 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:14.418291 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-swk4j" podStartSLOduration=65.46309458 podStartE2EDuration="1m7.418280704s" podCreationTimestamp="2026-04-23 17:54:07 +0000 UTC" firstStartedPulling="2026-04-23 17:55:11.728002669 +0000 UTC m=+99.273978014" lastFinishedPulling="2026-04-23 17:55:13.683188779 +0000 UTC m=+101.229164138" observedRunningTime="2026-04-23 17:55:14.417680999 +0000 UTC m=+101.963656367" watchObservedRunningTime="2026-04-23 17:55:14.418280704 +0000 UTC m=+101.964256071"
Apr 23 17:55:14.432682 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:14.432637 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tld98" podStartSLOduration=65.494789607 podStartE2EDuration="1m7.43262549s" podCreationTimestamp="2026-04-23 17:54:07 +0000 UTC" firstStartedPulling="2026-04-23 17:55:11.748358928 +0000 UTC m=+99.294334277" lastFinishedPulling="2026-04-23 17:55:13.686194815 +0000 UTC m=+101.232170160" observedRunningTime="2026-04-23 17:55:14.432441206 +0000 UTC m=+101.978416573" watchObservedRunningTime="2026-04-23 17:55:14.43262549 +0000 UTC m=+101.978600835"
Apr 23 17:55:15.452369 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.452327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:55:15.452749 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.452385 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:55:15.452749 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.452405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"
Apr 23 17:55:15.453766 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.453738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ae6b02c-1ffb-4822-950c-d4782e47731b-service-ca-bundle\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:55:15.454962 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.454935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbf092c0-733e-4d6b-a240-9d95ac93022a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qrm2m\" (UID: \"cbf092c0-733e-4d6b-a240-9d95ac93022a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"
Apr 23 17:55:15.454962 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.454949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ae6b02c-1ffb-4822-950c-d4782e47731b-metrics-certs\") pod \"router-default-79996f959d-wvgtd\" (UID: \"9ae6b02c-1ffb-4822-950c-d4782e47731b\") " pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:55:15.615133 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.615082 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79996f959d-wvgtd"
Apr 23 17:55:15.664209 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.663789 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"
Apr 23 17:55:15.770127 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.770095 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-79996f959d-wvgtd"]
Apr 23 17:55:15.773950 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:55:15.773908 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ae6b02c_1ffb_4822_950c_d4782e47731b.slice/crio-da505666fa49fd63eeb160ae29dd685b8133f9c9e51632a340b8a9e4a0fb5a2c WatchSource:0}: Error finding container da505666fa49fd63eeb160ae29dd685b8133f9c9e51632a340b8a9e4a0fb5a2c: Status 404 returned error can't find the container with id da505666fa49fd63eeb160ae29dd685b8133f9c9e51632a340b8a9e4a0fb5a2c
Apr 23 17:55:15.827916 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.827874 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5fjbp"]
Apr 23 17:55:15.832922 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.832875 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5fjbp"
Apr 23 17:55:15.834927 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.834866 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 17:55:15.835037 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.835016 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-2kkk6\""
Apr 23 17:55:15.835237 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.835219 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 17:55:15.844508 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.844487 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m"]
Apr 23 17:55:15.847505 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:55:15.847468 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbf092c0_733e_4d6b_a240_9d95ac93022a.slice/crio-6157f67be6e075406e6d56c1ac872457276cbe2fa816f482a53870dd87b88cc2 WatchSource:0}: Error finding container 6157f67be6e075406e6d56c1ac872457276cbe2fa816f482a53870dd87b88cc2: Status 404 returned error can't find the container with id 6157f67be6e075406e6d56c1ac872457276cbe2fa816f482a53870dd87b88cc2
Apr 23 17:55:15.849470 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.849434 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5fjbp"]
Apr 23 17:55:15.860356 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.860330 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-669c6dbffc-ntp4x"]
Apr 23 17:55:15.864844 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.864813 2575 patch_prober.go:28] interesting pod/image-registry-669c6dbffc-ntp4x container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:55:15.864966 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.864864 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x" podUID="d439c95a-c193-4b86-a761-5d34ccc0e57d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:55:15.923578 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.923543 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-c45cc5f59-s6pc2"]
Apr 23 17:55:15.926922 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.926906 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:15.945841 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.945815 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c45cc5f59-s6pc2"] Apr 23 17:55:15.957036 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.956968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5761c70c-3274-4099-8b36-5bfb025c9803-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5fjbp\" (UID: \"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:15.957036 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.957020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5761c70c-3274-4099-8b36-5bfb025c9803-data-volume\") pod \"insights-runtime-extractor-5fjbp\" (UID: \"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:15.957209 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.957120 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5761c70c-3274-4099-8b36-5bfb025c9803-crio-socket\") pod \"insights-runtime-extractor-5fjbp\" (UID: \"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:15.957209 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.957176 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5761c70c-3274-4099-8b36-5bfb025c9803-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5fjbp\" (UID: 
\"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:15.957209 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:15.957202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf6vb\" (UniqueName: \"kubernetes.io/projected/5761c70c-3274-4099-8b36-5bfb025c9803-kube-api-access-cf6vb\") pod \"insights-runtime-extractor-5fjbp\" (UID: \"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:16.057532 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5761c70c-3274-4099-8b36-5bfb025c9803-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5fjbp\" (UID: \"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:16.057700 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59da388f-0cf0-4d97-8756-b51b61e9316c-installation-pull-secrets\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.057700 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057572 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5761c70c-3274-4099-8b36-5bfb025c9803-data-volume\") pod \"insights-runtime-extractor-5fjbp\" (UID: \"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:16.057700 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057589 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59da388f-0cf0-4d97-8756-b51b61e9316c-trusted-ca\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.057700 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5761c70c-3274-4099-8b36-5bfb025c9803-crio-socket\") pod \"insights-runtime-extractor-5fjbp\" (UID: \"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:16.057700 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057665 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59da388f-0cf0-4d97-8756-b51b61e9316c-bound-sa-token\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.057700 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057696 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59da388f-0cf0-4d97-8756-b51b61e9316c-image-registry-private-configuration\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.058020 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057738 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/59da388f-0cf0-4d97-8756-b51b61e9316c-ca-trust-extracted\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.058020 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057752 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5761c70c-3274-4099-8b36-5bfb025c9803-crio-socket\") pod \"insights-runtime-extractor-5fjbp\" (UID: \"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:16.058020 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057789 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68fr7\" (UniqueName: \"kubernetes.io/projected/59da388f-0cf0-4d97-8756-b51b61e9316c-kube-api-access-68fr7\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.058020 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057823 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5761c70c-3274-4099-8b36-5bfb025c9803-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5fjbp\" (UID: \"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:16.058020 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057849 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cf6vb\" (UniqueName: \"kubernetes.io/projected/5761c70c-3274-4099-8b36-5bfb025c9803-kube-api-access-cf6vb\") pod \"insights-runtime-extractor-5fjbp\" (UID: \"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 
17:55:16.058020 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59da388f-0cf0-4d97-8756-b51b61e9316c-registry-tls\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.058020 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057936 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59da388f-0cf0-4d97-8756-b51b61e9316c-registry-certificates\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.058020 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.057984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5761c70c-3274-4099-8b36-5bfb025c9803-data-volume\") pod \"insights-runtime-extractor-5fjbp\" (UID: \"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:16.058428 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.058386 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5761c70c-3274-4099-8b36-5bfb025c9803-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5fjbp\" (UID: \"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:16.059972 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.059955 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/5761c70c-3274-4099-8b36-5bfb025c9803-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5fjbp\" (UID: \"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:16.067938 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.067920 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf6vb\" (UniqueName: \"kubernetes.io/projected/5761c70c-3274-4099-8b36-5bfb025c9803-kube-api-access-cf6vb\") pod \"insights-runtime-extractor-5fjbp\" (UID: \"5761c70c-3274-4099-8b36-5bfb025c9803\") " pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:16.144984 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.144949 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5fjbp" Apr 23 17:55:16.158983 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.158938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59da388f-0cf0-4d97-8756-b51b61e9316c-installation-pull-secrets\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.159116 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.158989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59da388f-0cf0-4d97-8756-b51b61e9316c-trusted-ca\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.159116 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.159052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/59da388f-0cf0-4d97-8756-b51b61e9316c-bound-sa-token\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.159116 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.159103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59da388f-0cf0-4d97-8756-b51b61e9316c-image-registry-private-configuration\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.159279 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.159129 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59da388f-0cf0-4d97-8756-b51b61e9316c-ca-trust-extracted\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.159279 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.159159 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68fr7\" (UniqueName: \"kubernetes.io/projected/59da388f-0cf0-4d97-8756-b51b61e9316c-kube-api-access-68fr7\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.159279 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.159187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59da388f-0cf0-4d97-8756-b51b61e9316c-registry-tls\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" 
Apr 23 17:55:16.159279 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.159227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59da388f-0cf0-4d97-8756-b51b61e9316c-registry-certificates\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.159684 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.159660 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59da388f-0cf0-4d97-8756-b51b61e9316c-ca-trust-extracted\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.160018 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.159997 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59da388f-0cf0-4d97-8756-b51b61e9316c-registry-certificates\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.160826 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.160799 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59da388f-0cf0-4d97-8756-b51b61e9316c-trusted-ca\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.161702 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.161668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59da388f-0cf0-4d97-8756-b51b61e9316c-installation-pull-secrets\") 
pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.161843 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.161823 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59da388f-0cf0-4d97-8756-b51b61e9316c-image-registry-private-configuration\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.162142 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.162121 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59da388f-0cf0-4d97-8756-b51b61e9316c-registry-tls\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.167602 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.167580 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59da388f-0cf0-4d97-8756-b51b61e9316c-bound-sa-token\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.167842 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.167823 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68fr7\" (UniqueName: \"kubernetes.io/projected/59da388f-0cf0-4d97-8756-b51b61e9316c-kube-api-access-68fr7\") pod \"image-registry-c45cc5f59-s6pc2\" (UID: \"59da388f-0cf0-4d97-8756-b51b61e9316c\") " pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.235320 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.235279 2575 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:16.280959 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.280711 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5fjbp"] Apr 23 17:55:16.393252 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.393214 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c45cc5f59-s6pc2"] Apr 23 17:55:16.396518 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:55:16.396476 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59da388f_0cf0_4d97_8756_b51b61e9316c.slice/crio-44236a677d53cedae9924e741d82566a15f3e63c78f26f2c202686986ac3b0a4 WatchSource:0}: Error finding container 44236a677d53cedae9924e741d82566a15f3e63c78f26f2c202686986ac3b0a4: Status 404 returned error can't find the container with id 44236a677d53cedae9924e741d82566a15f3e63c78f26f2c202686986ac3b0a4 Apr 23 17:55:16.408709 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.408673 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" event={"ID":"59da388f-0cf0-4d97-8756-b51b61e9316c","Type":"ContainerStarted","Data":"44236a677d53cedae9924e741d82566a15f3e63c78f26f2c202686986ac3b0a4"} Apr 23 17:55:16.409830 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.409801 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m" event={"ID":"cbf092c0-733e-4d6b-a240-9d95ac93022a","Type":"ContainerStarted","Data":"6157f67be6e075406e6d56c1ac872457276cbe2fa816f482a53870dd87b88cc2"} Apr 23 17:55:16.411409 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.411384 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79996f959d-wvgtd" 
event={"ID":"9ae6b02c-1ffb-4822-950c-d4782e47731b","Type":"ContainerStarted","Data":"b33d578cb7f85233875c38b21b0d78bc2a200bc9255dc9aacaa95289a00121fe"} Apr 23 17:55:16.411509 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.411417 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79996f959d-wvgtd" event={"ID":"9ae6b02c-1ffb-4822-950c-d4782e47731b","Type":"ContainerStarted","Data":"da505666fa49fd63eeb160ae29dd685b8133f9c9e51632a340b8a9e4a0fb5a2c"} Apr 23 17:55:16.413342 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.413315 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5fjbp" event={"ID":"5761c70c-3274-4099-8b36-5bfb025c9803","Type":"ContainerStarted","Data":"1dfd648813850174ff507d0632dc2dde92ca254cc6b9ac27eb7068b1c4d64916"} Apr 23 17:55:16.413342 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.413341 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5fjbp" event={"ID":"5761c70c-3274-4099-8b36-5bfb025c9803","Type":"ContainerStarted","Data":"d10b7b373ef67a86ef7b9f543cd0aa4b31bda95187eff021015a879ed1be16b0"} Apr 23 17:55:16.436056 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.435836 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-79996f959d-wvgtd" podStartSLOduration=33.435813972 podStartE2EDuration="33.435813972s" podCreationTimestamp="2026-04-23 17:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:55:16.433984901 +0000 UTC m=+103.979960268" watchObservedRunningTime="2026-04-23 17:55:16.435813972 +0000 UTC m=+103.981789340" Apr 23 17:55:16.615619 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.615529 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-ingress/router-default-79996f959d-wvgtd" Apr 23 17:55:16.618677 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:16.618652 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-79996f959d-wvgtd" Apr 23 17:55:17.417249 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:17.417212 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" event={"ID":"59da388f-0cf0-4d97-8756-b51b61e9316c","Type":"ContainerStarted","Data":"115922449b1c65a17f0e4db610b27ca879f77dc9dc80f0b9084880d550c1e06f"} Apr 23 17:55:17.417440 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:17.417336 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" Apr 23 17:55:17.418696 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:17.418675 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5fjbp" event={"ID":"5761c70c-3274-4099-8b36-5bfb025c9803","Type":"ContainerStarted","Data":"fd9afb261d4d6e99cb7c0b917fda1ace3ad9e4443050c721baf5b09378e875c4"} Apr 23 17:55:17.418847 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:17.418836 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-79996f959d-wvgtd" Apr 23 17:55:17.420081 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:17.420061 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-79996f959d-wvgtd" Apr 23 17:55:17.442276 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:17.442236 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2" podStartSLOduration=2.442222007 podStartE2EDuration="2.442222007s" podCreationTimestamp="2026-04-23 17:55:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:55:17.441098935 +0000 UTC m=+104.987074302" watchObservedRunningTime="2026-04-23 17:55:17.442222007 +0000 UTC m=+104.988197352" Apr 23 17:55:18.422829 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:18.422785 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m" event={"ID":"cbf092c0-733e-4d6b-a240-9d95ac93022a","Type":"ContainerStarted","Data":"30b17e7a14f3592f1dc14fb419712c0f4924a7421246ac9f95aee58d4cf2bc26"} Apr 23 17:55:18.447364 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:18.447304 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qrm2m" podStartSLOduration=33.855089434 podStartE2EDuration="35.447287993s" podCreationTimestamp="2026-04-23 17:54:43 +0000 UTC" firstStartedPulling="2026-04-23 17:55:15.850623723 +0000 UTC m=+103.396599067" lastFinishedPulling="2026-04-23 17:55:17.442822267 +0000 UTC m=+104.988797626" observedRunningTime="2026-04-23 17:55:18.446135744 +0000 UTC m=+105.992111121" watchObservedRunningTime="2026-04-23 17:55:18.447287993 +0000 UTC m=+105.993263359" Apr 23 17:55:19.431220 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:19.431176 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5fjbp" event={"ID":"5761c70c-3274-4099-8b36-5bfb025c9803","Type":"ContainerStarted","Data":"ec7842c3eefe075c5c43373b61f56c8cf64e2c80ffdb84e17f120d544e0f2d1d"} Apr 23 17:55:19.455613 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:19.455497 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5fjbp" podStartSLOduration=2.362701425 podStartE2EDuration="4.45547984s" podCreationTimestamp="2026-04-23 17:55:15 +0000 UTC" firstStartedPulling="2026-04-23 17:55:16.34981166 +0000 UTC 
m=+103.895787022" lastFinishedPulling="2026-04-23 17:55:18.442590092 +0000 UTC m=+105.988565437" observedRunningTime="2026-04-23 17:55:19.455051954 +0000 UTC m=+107.001027331" watchObservedRunningTime="2026-04-23 17:55:19.45547984 +0000 UTC m=+107.001455207" Apr 23 17:55:24.407190 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:24.407155 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-swk4j" Apr 23 17:55:25.057601 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:25.057560 2575 scope.go:117] "RemoveContainer" containerID="bddb19761c761e79e1cf3aca7e9b75ff4ff4e482b2492117772446f5b21d1e94" Apr 23 17:55:25.057767 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:55:25.057732 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-dznpd_openshift-console-operator(50ae4d89-cf03-479c-935d-c2b46bb0082b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" podUID="50ae4d89-cf03-479c-935d-c2b46bb0082b" Apr 23 17:55:25.865013 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:25.864980 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x" Apr 23 17:55:26.630703 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.630667 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xnt4q"] Apr 23 17:55:26.635959 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.635937 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.639312 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.639289 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 17:55:26.639457 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.639374 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 17:55:26.640025 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.640008 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 17:55:26.640477 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.640456 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 17:55:26.640571 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.640479 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-l9gtc\"" Apr 23 17:55:26.744844 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.744811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.744844 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.744857 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-wtmp\") pod \"node-exporter-xnt4q\" (UID: 
\"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.745074 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.744902 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93846088-277d-4894-b6ef-cc87f01ad6fa-sys\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.745074 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.744948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93846088-277d-4894-b6ef-cc87f01ad6fa-metrics-client-ca\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.745074 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.744963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-tls\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.745074 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.745024 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kdhb\" (UniqueName: \"kubernetes.io/projected/93846088-277d-4894-b6ef-cc87f01ad6fa-kube-api-access-5kdhb\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.745074 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.745068 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-textfile\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.745228 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.745099 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-accelerators-collector-config\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.745228 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.745122 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93846088-277d-4894-b6ef-cc87f01ad6fa-root\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.845817 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.845784 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93846088-277d-4894-b6ef-cc87f01ad6fa-metrics-client-ca\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.845817 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.845817 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-tls\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.846080 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.845847 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kdhb\" (UniqueName: \"kubernetes.io/projected/93846088-277d-4894-b6ef-cc87f01ad6fa-kube-api-access-5kdhb\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.846080 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.845990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-textfile\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.846080 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.846033 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-accelerators-collector-config\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.846080 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.846066 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93846088-277d-4894-b6ef-cc87f01ad6fa-root\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.846275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.846116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " 
pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.846275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.846167 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-wtmp\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.846275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.846198 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93846088-277d-4894-b6ef-cc87f01ad6fa-sys\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.846275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.846206 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93846088-277d-4894-b6ef-cc87f01ad6fa-root\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.846480 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.846316 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93846088-277d-4894-b6ef-cc87f01ad6fa-sys\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.846480 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.846342 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-textfile\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 
17:55:26.846480 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.846382 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-wtmp\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.846615 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.846538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93846088-277d-4894-b6ef-cc87f01ad6fa-metrics-client-ca\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.846649 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.846608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-accelerators-collector-config\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.848403 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.848380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-tls\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.848515 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.848498 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93846088-277d-4894-b6ef-cc87f01ad6fa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xnt4q\" (UID: 
\"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.855529 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.855510 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kdhb\" (UniqueName: \"kubernetes.io/projected/93846088-277d-4894-b6ef-cc87f01ad6fa-kube-api-access-5kdhb\") pod \"node-exporter-xnt4q\" (UID: \"93846088-277d-4894-b6ef-cc87f01ad6fa\") " pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.945940 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:26.945911 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xnt4q" Apr 23 17:55:26.954157 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:55:26.954127 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93846088_277d_4894_b6ef_cc87f01ad6fa.slice/crio-b134231623936c319b3204d30b2afde5f19b097355d146d57fde41cd12f25157 WatchSource:0}: Error finding container b134231623936c319b3204d30b2afde5f19b097355d146d57fde41cd12f25157: Status 404 returned error can't find the container with id b134231623936c319b3204d30b2afde5f19b097355d146d57fde41cd12f25157 Apr 23 17:55:27.454207 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:27.454155 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xnt4q" event={"ID":"93846088-277d-4894-b6ef-cc87f01ad6fa","Type":"ContainerStarted","Data":"b134231623936c319b3204d30b2afde5f19b097355d146d57fde41cd12f25157"} Apr 23 17:55:28.459222 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:28.459183 2575 generic.go:358] "Generic (PLEG): container finished" podID="93846088-277d-4894-b6ef-cc87f01ad6fa" containerID="ff240e8ce85e020af74e27f6752ba1fb175a36c0c98fc3bd88a62b53ea28c323" exitCode=0 Apr 23 17:55:28.459766 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:28.459300 2575 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/node-exporter-xnt4q" event={"ID":"93846088-277d-4894-b6ef-cc87f01ad6fa","Type":"ContainerDied","Data":"ff240e8ce85e020af74e27f6752ba1fb175a36c0c98fc3bd88a62b53ea28c323"} Apr 23 17:55:29.468210 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.468175 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xnt4q" event={"ID":"93846088-277d-4894-b6ef-cc87f01ad6fa","Type":"ContainerStarted","Data":"988e689ba6bbc2c539fc9b6105e4d6f54529e4f83c7b3c2ce06907dfec6087b4"} Apr 23 17:55:29.468210 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.468215 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xnt4q" event={"ID":"93846088-277d-4894-b6ef-cc87f01ad6fa","Type":"ContainerStarted","Data":"1bfc70c7df9d3b6e01476adf90dc252922241b32c465128ebf303d744212b353"} Apr 23 17:55:29.721870 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.721755 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xnt4q" podStartSLOduration=2.963240544 podStartE2EDuration="3.7217353s" podCreationTimestamp="2026-04-23 17:55:26 +0000 UTC" firstStartedPulling="2026-04-23 17:55:26.95575499 +0000 UTC m=+114.501730335" lastFinishedPulling="2026-04-23 17:55:27.714249734 +0000 UTC m=+115.260225091" observedRunningTime="2026-04-23 17:55:29.498769837 +0000 UTC m=+117.044745204" watchObservedRunningTime="2026-04-23 17:55:29.7217353 +0000 UTC m=+117.267710667" Apr 23 17:55:29.722652 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.722629 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7556c7f998-x77kb"] Apr 23 17:55:29.726399 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.726379 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.728465 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.728447 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-4cnn1llobf74p\"" Apr 23 17:55:29.728964 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.728863 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 23 17:55:29.728964 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.728866 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 23 17:55:29.728964 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.728932 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-z9stg\"" Apr 23 17:55:29.729161 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.728932 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 23 17:55:29.729161 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.728937 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 23 17:55:29.729161 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.728938 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 23 17:55:29.739938 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.739919 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7556c7f998-x77kb"] Apr 23 17:55:29.872786 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.872747 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.873001 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.872822 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-tls\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.873001 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.872849 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.873001 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.872868 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-grpc-tls\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.873001 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.872913 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.873001 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.872959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.873213 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.873010 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e108b2f0-3753-4733-bedd-f9e769bbb345-metrics-client-ca\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.873213 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.873036 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w596w\" (UniqueName: \"kubernetes.io/projected/e108b2f0-3753-4733-bedd-f9e769bbb345-kube-api-access-w596w\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.973996 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.973906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e108b2f0-3753-4733-bedd-f9e769bbb345-metrics-client-ca\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.973996 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.973947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w596w\" (UniqueName: \"kubernetes.io/projected/e108b2f0-3753-4733-bedd-f9e769bbb345-kube-api-access-w596w\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.973996 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.973984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.974251 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.974057 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-tls\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.974251 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.974099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: 
\"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.974251 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.974127 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-grpc-tls\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.974251 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.974161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.974251 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.974188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.974714 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.974681 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e108b2f0-3753-4733-bedd-f9e769bbb345-metrics-client-ca\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.977039 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.977009 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.977139 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.977040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-grpc-tls\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.977253 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.977231 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-tls\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.977291 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.977261 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.977329 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.977316 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.977363 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.977326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e108b2f0-3753-4733-bedd-f9e769bbb345-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:29.983515 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:29.983496 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w596w\" (UniqueName: \"kubernetes.io/projected/e108b2f0-3753-4733-bedd-f9e769bbb345-kube-api-access-w596w\") pod \"thanos-querier-7556c7f998-x77kb\" (UID: \"e108b2f0-3753-4733-bedd-f9e769bbb345\") " pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:30.035481 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:30.035442 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" Apr 23 17:55:30.175434 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:30.175399 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7556c7f998-x77kb"] Apr 23 17:55:30.178813 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:55:30.178789 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode108b2f0_3753_4733_bedd_f9e769bbb345.slice/crio-a7c900c227dc32cfb415879fc412a74d0a1610bd455b8da8b7dc1823ea5b46d0 WatchSource:0}: Error finding container a7c900c227dc32cfb415879fc412a74d0a1610bd455b8da8b7dc1823ea5b46d0: Status 404 returned error can't find the container with id a7c900c227dc32cfb415879fc412a74d0a1610bd455b8da8b7dc1823ea5b46d0 Apr 23 17:55:30.472112 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:30.472076 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" event={"ID":"e108b2f0-3753-4733-bedd-f9e769bbb345","Type":"ContainerStarted","Data":"a7c900c227dc32cfb415879fc412a74d0a1610bd455b8da8b7dc1823ea5b46d0"} Apr 23 17:55:31.050316 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.050280 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-667bd85767-4v446"] Apr 23 17:55:31.053594 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.053574 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.057481 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.057457 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 17:55:31.057802 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.057784 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 23 17:55:31.058530 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.058310 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-mg48t\""
Apr 23 17:55:31.058530 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.058358 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 23 17:55:31.058530 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.058378 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 23 17:55:31.058530 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.058391 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-1uv3feon898dm\""
Apr 23 17:55:31.065974 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.065939 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-667bd85767-4v446"]
Apr 23 17:55:31.185478 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.185439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/eec30db5-c0af-4aa9-b807-89e196d3f094-secret-metrics-server-tls\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.185650 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.185492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/eec30db5-c0af-4aa9-b807-89e196d3f094-secret-metrics-server-client-certs\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.185650 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.185525 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/eec30db5-c0af-4aa9-b807-89e196d3f094-audit-log\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.185650 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.185568 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g4xx\" (UniqueName: \"kubernetes.io/projected/eec30db5-c0af-4aa9-b807-89e196d3f094-kube-api-access-9g4xx\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.185650 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.185620 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/eec30db5-c0af-4aa9-b807-89e196d3f094-metrics-server-audit-profiles\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.185650 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.185646 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eec30db5-c0af-4aa9-b807-89e196d3f094-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.185920 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.185714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec30db5-c0af-4aa9-b807-89e196d3f094-client-ca-bundle\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.286487 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.286452 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/eec30db5-c0af-4aa9-b807-89e196d3f094-secret-metrics-server-tls\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.286487 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.286495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/eec30db5-c0af-4aa9-b807-89e196d3f094-secret-metrics-server-client-certs\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.286716 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.286520 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/eec30db5-c0af-4aa9-b807-89e196d3f094-audit-log\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.286716 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.286560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9g4xx\" (UniqueName: \"kubernetes.io/projected/eec30db5-c0af-4aa9-b807-89e196d3f094-kube-api-access-9g4xx\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.286716 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.286601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/eec30db5-c0af-4aa9-b807-89e196d3f094-metrics-server-audit-profiles\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.286716 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.286628 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eec30db5-c0af-4aa9-b807-89e196d3f094-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.286716 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.286666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec30db5-c0af-4aa9-b807-89e196d3f094-client-ca-bundle\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.287069 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.287038 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/eec30db5-c0af-4aa9-b807-89e196d3f094-audit-log\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.287943 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.287810 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eec30db5-c0af-4aa9-b807-89e196d3f094-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.287943 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.287917 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/eec30db5-c0af-4aa9-b807-89e196d3f094-metrics-server-audit-profiles\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.289326 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.289306 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/eec30db5-c0af-4aa9-b807-89e196d3f094-secret-metrics-server-tls\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.289504 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.289487 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/eec30db5-c0af-4aa9-b807-89e196d3f094-secret-metrics-server-client-certs\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.289990 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.289965 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec30db5-c0af-4aa9-b807-89e196d3f094-client-ca-bundle\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.313763 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.313681 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g4xx\" (UniqueName: \"kubernetes.io/projected/eec30db5-c0af-4aa9-b807-89e196d3f094-kube-api-access-9g4xx\") pod \"metrics-server-667bd85767-4v446\" (UID: \"eec30db5-c0af-4aa9-b807-89e196d3f094\") " pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.366374 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.366343 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-667bd85767-4v446"
Apr 23 17:55:31.844741 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.844710 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"]
Apr 23 17:55:31.848155 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.848137 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:31.850592 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.850571 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 23 17:55:31.850735 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.850716 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 23 17:55:31.850806 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.850720 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 23 17:55:31.850863 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.850823 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 23 17:55:31.850942 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.850906 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-dpfqn\""
Apr 23 17:55:31.851218 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.851204 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 23 17:55:31.857536 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.857514 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 23 17:55:31.867110 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.867052 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"]
Apr 23 17:55:31.943594 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.943567 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-667bd85767-4v446"]
Apr 23 17:55:31.945355 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:55:31.945327 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeec30db5_c0af_4aa9_b807_89e196d3f094.slice/crio-207c22ffda8ea398a04e138c4c3ecad90a438e96bd0b30e4320163d3d31d6ca0 WatchSource:0}: Error finding container 207c22ffda8ea398a04e138c4c3ecad90a438e96bd0b30e4320163d3d31d6ca0: Status 404 returned error can't find the container with id 207c22ffda8ea398a04e138c4c3ecad90a438e96bd0b30e4320163d3d31d6ca0
Apr 23 17:55:31.993618 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.993588 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/df4f2af2-d944-4a00-b116-126b71c159dc-federate-client-tls\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:31.993749 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.993631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/df4f2af2-d944-4a00-b116-126b71c159dc-telemeter-client-tls\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:31.993749 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.993692 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/df4f2af2-d944-4a00-b116-126b71c159dc-secret-telemeter-client\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:31.993830 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.993756 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df4f2af2-d944-4a00-b116-126b71c159dc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:31.993830 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.993803 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/df4f2af2-d944-4a00-b116-126b71c159dc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:31.993917 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.993829 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn6js\" (UniqueName: \"kubernetes.io/projected/df4f2af2-d944-4a00-b116-126b71c159dc-kube-api-access-mn6js\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:31.993917 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.993845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df4f2af2-d944-4a00-b116-126b71c159dc-metrics-client-ca\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:31.993977 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:31.993916 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df4f2af2-d944-4a00-b116-126b71c159dc-serving-certs-ca-bundle\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.094955 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.094920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/df4f2af2-d944-4a00-b116-126b71c159dc-secret-telemeter-client\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.095072 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.094977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df4f2af2-d944-4a00-b116-126b71c159dc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.095157 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.095134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/df4f2af2-d944-4a00-b116-126b71c159dc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.095196 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.095186 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mn6js\" (UniqueName: \"kubernetes.io/projected/df4f2af2-d944-4a00-b116-126b71c159dc-kube-api-access-mn6js\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.095239 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.095210 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df4f2af2-d944-4a00-b116-126b71c159dc-metrics-client-ca\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.095287 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.095256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df4f2af2-d944-4a00-b116-126b71c159dc-serving-certs-ca-bundle\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.095340 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.095289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/df4f2af2-d944-4a00-b116-126b71c159dc-federate-client-tls\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.095340 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.095320 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/df4f2af2-d944-4a00-b116-126b71c159dc-telemeter-client-tls\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.096179 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.095956 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df4f2af2-d944-4a00-b116-126b71c159dc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.096179 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.096134 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df4f2af2-d944-4a00-b116-126b71c159dc-metrics-client-ca\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.096412 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.096390 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df4f2af2-d944-4a00-b116-126b71c159dc-serving-certs-ca-bundle\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.098245 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.098217 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/df4f2af2-d944-4a00-b116-126b71c159dc-secret-telemeter-client\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.098405 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.098381 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/df4f2af2-d944-4a00-b116-126b71c159dc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.098606 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.098586 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/df4f2af2-d944-4a00-b116-126b71c159dc-federate-client-tls\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.098676 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.098658 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/df4f2af2-d944-4a00-b116-126b71c159dc-telemeter-client-tls\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.108357 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.108333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn6js\" (UniqueName: \"kubernetes.io/projected/df4f2af2-d944-4a00-b116-126b71c159dc-kube-api-access-mn6js\") pod \"telemeter-client-5f55d95dbb-jfdvg\" (UID: \"df4f2af2-d944-4a00-b116-126b71c159dc\") " pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.159274 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.159250 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"
Apr 23 17:55:32.287342 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.287311 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg"]
Apr 23 17:55:32.290076 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:55:32.290046 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf4f2af2_d944_4a00_b116_126b71c159dc.slice/crio-f6f44fc521b48568d65a3c2cfbdb45f1582f5179e7b297dbf4a4fada47eabf8b WatchSource:0}: Error finding container f6f44fc521b48568d65a3c2cfbdb45f1582f5179e7b297dbf4a4fada47eabf8b: Status 404 returned error can't find the container with id f6f44fc521b48568d65a3c2cfbdb45f1582f5179e7b297dbf4a4fada47eabf8b
Apr 23 17:55:32.481548 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.481507 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" event={"ID":"e108b2f0-3753-4733-bedd-f9e769bbb345","Type":"ContainerStarted","Data":"ed0383c0fc69115c9d55accfcb439606d9ab0efa7cc00fee15da44e1999734c0"}
Apr 23 17:55:32.481548 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.481555 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" event={"ID":"e108b2f0-3753-4733-bedd-f9e769bbb345","Type":"ContainerStarted","Data":"48fdb71f637eba7af9f1061a2b8e66bdf633c5055175966fd8390fc8467dbe1b"}
Apr 23 17:55:32.481779 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.481568 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" event={"ID":"e108b2f0-3753-4733-bedd-f9e769bbb345","Type":"ContainerStarted","Data":"985232a4632857e0e3f321565ea9e55b1bfdfb5624b35cd5d12a28fa1111e62c"}
Apr 23 17:55:32.483283 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.483202 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-667bd85767-4v446" event={"ID":"eec30db5-c0af-4aa9-b807-89e196d3f094","Type":"ContainerStarted","Data":"207c22ffda8ea398a04e138c4c3ecad90a438e96bd0b30e4320163d3d31d6ca0"}
Apr 23 17:55:32.484435 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:32.484409 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg" event={"ID":"df4f2af2-d944-4a00-b116-126b71c159dc","Type":"ContainerStarted","Data":"f6f44fc521b48568d65a3c2cfbdb45f1582f5179e7b297dbf4a4fada47eabf8b"}
Apr 23 17:55:33.490420 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:33.490387 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" event={"ID":"e108b2f0-3753-4733-bedd-f9e769bbb345","Type":"ContainerStarted","Data":"dab00403edb41ec66c7390199fe75eea887ddedadb83c31e009eeac2b4782b51"}
Apr 23 17:55:33.490763 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:33.490429 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" event={"ID":"e108b2f0-3753-4733-bedd-f9e769bbb345","Type":"ContainerStarted","Data":"cbdd1e91da44aca0bc90b77a95ffe292e0a13768f71500adc55a3ea217feab6e"}
Apr 23 17:55:33.490763 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:33.490440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" event={"ID":"e108b2f0-3753-4733-bedd-f9e769bbb345","Type":"ContainerStarted","Data":"f2c39adfd33304c42fa8ed9bbb1ef3a01f9565636ec61e2df66d50f2cbbfaac0"}
Apr 23 17:55:33.490763 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:33.490636 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb"
Apr 23 17:55:33.523541 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:33.523498 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb" podStartSLOduration=1.9645199519999998 podStartE2EDuration="4.523479909s" podCreationTimestamp="2026-04-23 17:55:29 +0000 UTC" firstStartedPulling="2026-04-23 17:55:30.180864952 +0000 UTC m=+117.726840301" lastFinishedPulling="2026-04-23 17:55:32.739824901 +0000 UTC m=+120.285800258" observedRunningTime="2026-04-23 17:55:33.521450704 +0000 UTC m=+121.067426074" watchObservedRunningTime="2026-04-23 17:55:33.523479909 +0000 UTC m=+121.069455277"
Apr 23 17:55:34.495409 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:34.495374 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-667bd85767-4v446" event={"ID":"eec30db5-c0af-4aa9-b807-89e196d3f094","Type":"ContainerStarted","Data":"80b2dd562e488e2e195e9fe1bd55ee2c5b5cc37c7905274c303c73cda15ad464"}
Apr 23 17:55:34.496850 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:34.496822 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg" event={"ID":"df4f2af2-d944-4a00-b116-126b71c159dc","Type":"ContainerStarted","Data":"e1e61f2c321ad9d7574d690e71d56862f16e253a198889267d027127e2528c17"}
Apr 23 17:55:34.518761 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:34.518721 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-667bd85767-4v446" podStartSLOduration=2.017631457 podStartE2EDuration="3.518709049s" podCreationTimestamp="2026-04-23 17:55:31 +0000 UTC" firstStartedPulling="2026-04-23 17:55:31.947252954 +0000 UTC m=+119.493228300" lastFinishedPulling="2026-04-23 17:55:33.448330547 +0000 UTC m=+120.994305892" observedRunningTime="2026-04-23 17:55:34.516785753 +0000 UTC m=+122.062761156" watchObservedRunningTime="2026-04-23 17:55:34.518709049 +0000 UTC m=+122.064684416"
Apr 23 17:55:35.502004 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:35.501965 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg" event={"ID":"df4f2af2-d944-4a00-b116-126b71c159dc","Type":"ContainerStarted","Data":"5e201d103249228330b4dc2bbc5db211fe23d55d38ea6b26d6b0c927404cdd0c"}
Apr 23 17:55:35.502004 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:35.502010 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg" event={"ID":"df4f2af2-d944-4a00-b116-126b71c159dc","Type":"ContainerStarted","Data":"99394dd553bc66534cacd915d60ffd1f13e4c9d20375009cdbfa6bf4f1db1a1a"}
Apr 23 17:55:38.427519 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:38.427490 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-c45cc5f59-s6pc2"
Apr 23 17:55:38.452270 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:38.452218 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5f55d95dbb-jfdvg" podStartSLOduration=4.644858702 podStartE2EDuration="7.452203233s" podCreationTimestamp="2026-04-23 17:55:31 +0000 UTC" firstStartedPulling="2026-04-23 17:55:32.292187647 +0000 UTC m=+119.838163003" lastFinishedPulling="2026-04-23 17:55:35.099532187 +0000 UTC m=+122.645507534" observedRunningTime="2026-04-23 17:55:35.540334348 +0000 UTC m=+123.086309715" watchObservedRunningTime="2026-04-23 17:55:38.452203233 +0000 UTC m=+125.998178599"
Apr 23 17:55:39.057279 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:39.057247 2575 scope.go:117] "RemoveContainer" containerID="bddb19761c761e79e1cf3aca7e9b75ff4ff4e482b2492117772446f5b21d1e94"
Apr 23 17:55:39.504485 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:39.504460 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7556c7f998-x77kb"
Apr 23 17:55:39.515146 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:39.515125 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log"
Apr 23 17:55:39.515283 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:39.515186 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" event={"ID":"50ae4d89-cf03-479c-935d-c2b46bb0082b","Type":"ContainerStarted","Data":"a339ea60d202da79d50e01972b856d04feade9ad0700a8325975480a25be4f4c"}
Apr 23 17:55:39.515504 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:39.515471 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:55:39.551541 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:39.551487 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd" podStartSLOduration=53.056217049 podStartE2EDuration="56.551470142s" podCreationTimestamp="2026-04-23 17:54:43 +0000 UTC" firstStartedPulling="2026-04-23 17:54:43.939248564 +0000 UTC m=+71.485223910" lastFinishedPulling="2026-04-23 17:54:47.434501642 +0000 UTC m=+74.980477003" observedRunningTime="2026-04-23 17:55:39.550080662 +0000 UTC m=+127.096056029" watchObservedRunningTime="2026-04-23 17:55:39.551470142 +0000 UTC m=+127.097445509"
Apr 23 17:55:40.070509 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:40.070477 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-dznpd"
Apr 23 17:55:40.265345 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:40.265309 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-zr7bt"]
Apr 23 17:55:40.268541 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:40.268518 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-zr7bt"
Apr 23 17:55:40.272353 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:40.272325 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-4zt5q\""
Apr 23 17:55:40.272974 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:40.272958 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 23 17:55:40.274034 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:40.274012 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 23 17:55:40.286673 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:40.286653 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-zr7bt"]
Apr 23 17:55:40.367873 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:40.367785 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4qxn\" (UniqueName: \"kubernetes.io/projected/52c4d929-5543-48fc-9138-b1149888b4e1-kube-api-access-h4qxn\") pod \"downloads-6bcc868b7-zr7bt\" (UID: \"52c4d929-5543-48fc-9138-b1149888b4e1\") " pod="openshift-console/downloads-6bcc868b7-zr7bt"
Apr 23 17:55:40.468732 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:40.468683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4qxn\" (UniqueName: \"kubernetes.io/projected/52c4d929-5543-48fc-9138-b1149888b4e1-kube-api-access-h4qxn\") pod \"downloads-6bcc868b7-zr7bt\" (UID: \"52c4d929-5543-48fc-9138-b1149888b4e1\") " pod="openshift-console/downloads-6bcc868b7-zr7bt"
Apr 23 17:55:40.477771 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:40.477745 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4qxn\" (UniqueName: \"kubernetes.io/projected/52c4d929-5543-48fc-9138-b1149888b4e1-kube-api-access-h4qxn\") pod \"downloads-6bcc868b7-zr7bt\" (UID: \"52c4d929-5543-48fc-9138-b1149888b4e1\") " pod="openshift-console/downloads-6bcc868b7-zr7bt"
Apr 23 17:55:40.577903 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:40.577852 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-zr7bt"
Apr 23 17:55:40.715719 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:40.715667 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-zr7bt"]
Apr 23 17:55:40.717810 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:55:40.717784 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52c4d929_5543_48fc_9138_b1149888b4e1.slice/crio-e7d710f37958db16ac9ef2a9243e94a871c43810be1d868ef64785cae5f9a862 WatchSource:0}: Error finding container e7d710f37958db16ac9ef2a9243e94a871c43810be1d868ef64785cae5f9a862: Status 404 returned error can't find the container with id e7d710f37958db16ac9ef2a9243e94a871c43810be1d868ef64785cae5f9a862
Apr 23 17:55:40.883008 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:40.882916 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x" podUID="d439c95a-c193-4b86-a761-5d34ccc0e57d" containerName="registry" containerID="cri-o://706c9a57720e9cbd9a60914ae55b8c1c51723197c01153957b40e4cdba9d74f0" gracePeriod=30
Apr 23 17:55:41.138712 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.138643 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x"
Apr 23 17:55:41.176030 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.175997 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-certificates\") pod \"d439c95a-c193-4b86-a761-5d34ccc0e57d\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") "
Apr 23 17:55:41.176205 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.176053 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d439c95a-c193-4b86-a761-5d34ccc0e57d-trusted-ca\") pod \"d439c95a-c193-4b86-a761-5d34ccc0e57d\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") "
Apr 23 17:55:41.176205 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.176081 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d439c95a-c193-4b86-a761-5d34ccc0e57d-ca-trust-extracted\") pod \"d439c95a-c193-4b86-a761-5d34ccc0e57d\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") "
Apr 23 17:55:41.176205 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.176122 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls\") pod \"d439c95a-c193-4b86-a761-5d34ccc0e57d\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") "
Apr 23 17:55:41.176205 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.176175 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-bound-sa-token\") pod \"d439c95a-c193-4b86-a761-5d34ccc0e57d\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") "
Apr 23 17:55:41.176419
ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.176218 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d439c95a-c193-4b86-a761-5d34ccc0e57d-image-registry-private-configuration\") pod \"d439c95a-c193-4b86-a761-5d34ccc0e57d\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " Apr 23 17:55:41.176419 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.176274 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d439c95a-c193-4b86-a761-5d34ccc0e57d-installation-pull-secrets\") pod \"d439c95a-c193-4b86-a761-5d34ccc0e57d\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " Apr 23 17:55:41.176419 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.176300 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c792\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-kube-api-access-2c792\") pod \"d439c95a-c193-4b86-a761-5d34ccc0e57d\" (UID: \"d439c95a-c193-4b86-a761-5d34ccc0e57d\") " Apr 23 17:55:41.176619 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.176594 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d439c95a-c193-4b86-a761-5d34ccc0e57d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d439c95a-c193-4b86-a761-5d34ccc0e57d" (UID: "d439c95a-c193-4b86-a761-5d34ccc0e57d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:55:41.177003 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.176965 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d439c95a-c193-4b86-a761-5d34ccc0e57d" (UID: "d439c95a-c193-4b86-a761-5d34ccc0e57d"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:55:41.179670 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.179639 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d439c95a-c193-4b86-a761-5d34ccc0e57d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d439c95a-c193-4b86-a761-5d34ccc0e57d" (UID: "d439c95a-c193-4b86-a761-5d34ccc0e57d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:55:41.179772 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.179721 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d439c95a-c193-4b86-a761-5d34ccc0e57d" (UID: "d439c95a-c193-4b86-a761-5d34ccc0e57d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:55:41.179980 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.179953 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-kube-api-access-2c792" (OuterVolumeSpecName: "kube-api-access-2c792") pod "d439c95a-c193-4b86-a761-5d34ccc0e57d" (UID: "d439c95a-c193-4b86-a761-5d34ccc0e57d"). InnerVolumeSpecName "kube-api-access-2c792". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:55:41.180371 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.180244 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d439c95a-c193-4b86-a761-5d34ccc0e57d" (UID: "d439c95a-c193-4b86-a761-5d34ccc0e57d"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:55:41.180371 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.180250 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d439c95a-c193-4b86-a761-5d34ccc0e57d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d439c95a-c193-4b86-a761-5d34ccc0e57d" (UID: "d439c95a-c193-4b86-a761-5d34ccc0e57d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:55:41.186766 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.186742 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d439c95a-c193-4b86-a761-5d34ccc0e57d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d439c95a-c193-4b86-a761-5d34ccc0e57d" (UID: "d439c95a-c193-4b86-a761-5d34ccc0e57d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:55:41.277244 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.277207 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d439c95a-c193-4b86-a761-5d34ccc0e57d-trusted-ca\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:55:41.277244 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.277241 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d439c95a-c193-4b86-a761-5d34ccc0e57d-ca-trust-extracted\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:55:41.277444 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.277258 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 
17:55:41.277444 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.277273 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-bound-sa-token\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:55:41.277444 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.277288 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d439c95a-c193-4b86-a761-5d34ccc0e57d-image-registry-private-configuration\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:55:41.277444 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.277303 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d439c95a-c193-4b86-a761-5d34ccc0e57d-installation-pull-secrets\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:55:41.277444 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.277318 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2c792\" (UniqueName: \"kubernetes.io/projected/d439c95a-c193-4b86-a761-5d34ccc0e57d-kube-api-access-2c792\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:55:41.277444 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.277334 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d439c95a-c193-4b86-a761-5d34ccc0e57d-registry-certificates\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:55:41.522406 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.522365 2575 generic.go:358] "Generic (PLEG): container finished" podID="d439c95a-c193-4b86-a761-5d34ccc0e57d" containerID="706c9a57720e9cbd9a60914ae55b8c1c51723197c01153957b40e4cdba9d74f0" exitCode=0 Apr 23 17:55:41.522575 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.522449 
2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x" Apr 23 17:55:41.522575 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.522461 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x" event={"ID":"d439c95a-c193-4b86-a761-5d34ccc0e57d","Type":"ContainerDied","Data":"706c9a57720e9cbd9a60914ae55b8c1c51723197c01153957b40e4cdba9d74f0"} Apr 23 17:55:41.522575 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.522497 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-669c6dbffc-ntp4x" event={"ID":"d439c95a-c193-4b86-a761-5d34ccc0e57d","Type":"ContainerDied","Data":"7ea3230e5212acd692dbdaade9e7db0fe3dbfb3d2389220d4ab2e68cdc3d68d6"} Apr 23 17:55:41.522575 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.522517 2575 scope.go:117] "RemoveContainer" containerID="706c9a57720e9cbd9a60914ae55b8c1c51723197c01153957b40e4cdba9d74f0" Apr 23 17:55:41.524283 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.524254 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-zr7bt" event={"ID":"52c4d929-5543-48fc-9138-b1149888b4e1","Type":"ContainerStarted","Data":"e7d710f37958db16ac9ef2a9243e94a871c43810be1d868ef64785cae5f9a862"} Apr 23 17:55:41.532068 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.532048 2575 scope.go:117] "RemoveContainer" containerID="706c9a57720e9cbd9a60914ae55b8c1c51723197c01153957b40e4cdba9d74f0" Apr 23 17:55:41.532418 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:55:41.532389 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"706c9a57720e9cbd9a60914ae55b8c1c51723197c01153957b40e4cdba9d74f0\": container with ID starting with 706c9a57720e9cbd9a60914ae55b8c1c51723197c01153957b40e4cdba9d74f0 not found: ID does not exist" 
containerID="706c9a57720e9cbd9a60914ae55b8c1c51723197c01153957b40e4cdba9d74f0" Apr 23 17:55:41.532505 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.532430 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"706c9a57720e9cbd9a60914ae55b8c1c51723197c01153957b40e4cdba9d74f0"} err="failed to get container status \"706c9a57720e9cbd9a60914ae55b8c1c51723197c01153957b40e4cdba9d74f0\": rpc error: code = NotFound desc = could not find container \"706c9a57720e9cbd9a60914ae55b8c1c51723197c01153957b40e4cdba9d74f0\": container with ID starting with 706c9a57720e9cbd9a60914ae55b8c1c51723197c01153957b40e4cdba9d74f0 not found: ID does not exist" Apr 23 17:55:41.546896 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.546862 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-669c6dbffc-ntp4x"] Apr 23 17:55:41.550978 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:41.550958 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-669c6dbffc-ntp4x"] Apr 23 17:55:42.792080 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:42.792044 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs\") pod \"network-metrics-daemon-6lhps\" (UID: \"6201ae7f-dbb7-4347-a698-89a65766225e\") " pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:55:42.794966 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:42.794939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6201ae7f-dbb7-4347-a698-89a65766225e-metrics-certs\") pod \"network-metrics-daemon-6lhps\" (UID: \"6201ae7f-dbb7-4347-a698-89a65766225e\") " pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:55:42.872549 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:42.872510 2575 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x582n\"" Apr 23 17:55:42.880688 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:42.880661 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6lhps" Apr 23 17:55:43.036916 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:43.031144 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6lhps"] Apr 23 17:55:43.061651 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:43.061580 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d439c95a-c193-4b86-a761-5d34ccc0e57d" path="/var/lib/kubelet/pods/d439c95a-c193-4b86-a761-5d34ccc0e57d/volumes" Apr 23 17:55:43.533251 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:43.533211 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6lhps" event={"ID":"6201ae7f-dbb7-4347-a698-89a65766225e","Type":"ContainerStarted","Data":"7643d62c48425ed935b6e706e8cc4530db6226461eadb465a147f95dece86f49"} Apr 23 17:55:44.539690 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:44.539651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6lhps" event={"ID":"6201ae7f-dbb7-4347-a698-89a65766225e","Type":"ContainerStarted","Data":"0c66e6ba9132eb32c9d09d04c20b83340eb30e91914d1536a2e3c1880ae8c670"} Apr 23 17:55:44.539690 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:44.539695 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6lhps" event={"ID":"6201ae7f-dbb7-4347-a698-89a65766225e","Type":"ContainerStarted","Data":"822ac1824463494c4671a153f61472dab64d620a205b9a99e0e78df79fd5c680"} Apr 23 17:55:44.563034 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:44.562980 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-6lhps" podStartSLOduration=130.468722408 podStartE2EDuration="2m11.562964544s" podCreationTimestamp="2026-04-23 17:53:33 +0000 UTC" firstStartedPulling="2026-04-23 17:55:43.038725973 +0000 UTC m=+130.584701319" lastFinishedPulling="2026-04-23 17:55:44.132968103 +0000 UTC m=+131.678943455" observedRunningTime="2026-04-23 17:55:44.560284488 +0000 UTC m=+132.106259855" watchObservedRunningTime="2026-04-23 17:55:44.562964544 +0000 UTC m=+132.108939926" Apr 23 17:55:49.953083 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:49.953045 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b7bcc5b56-hgf74"] Apr 23 17:55:49.953661 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:49.953382 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d439c95a-c193-4b86-a761-5d34ccc0e57d" containerName="registry" Apr 23 17:55:49.953661 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:49.953394 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d439c95a-c193-4b86-a761-5d34ccc0e57d" containerName="registry" Apr 23 17:55:49.953661 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:49.953489 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d439c95a-c193-4b86-a761-5d34ccc0e57d" containerName="registry" Apr 23 17:55:49.959228 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:49.959202 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:49.962994 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:49.962969 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 17:55:49.963119 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:49.963028 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 17:55:49.963250 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:49.963235 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 17:55:49.963754 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:49.963731 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 17:55:49.963901 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:49.963780 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 17:55:49.963901 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:49.963827 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-t8hv9\"" Apr 23 17:55:49.973363 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:49.973329 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b7bcc5b56-hgf74"] Apr 23 17:55:50.059902 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.059847 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-oauth-config\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.060086 ip-10-0-143-131 kubenswrapper[2575]: 
I0423 17:55:50.059912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-service-ca\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.060086 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.059953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5zfq\" (UniqueName: \"kubernetes.io/projected/db946b68-d8c1-4023-9dce-0f5fd239c4fa-kube-api-access-r5zfq\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.060086 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.059984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-serving-cert\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.060086 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.060070 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-config\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.060293 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.060111 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-oauth-serving-cert\") pod \"console-b7bcc5b56-hgf74\" (UID: 
\"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.160786 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.160749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-serving-cert\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.161005 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.160827 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-config\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.161005 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.160851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-oauth-serving-cert\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.161005 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.160907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-oauth-config\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.161005 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.160930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-service-ca\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.161005 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.160963 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5zfq\" (UniqueName: \"kubernetes.io/projected/db946b68-d8c1-4023-9dce-0f5fd239c4fa-kube-api-access-r5zfq\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.161684 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.161656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-config\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.161801 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.161738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-service-ca\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.161870 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.161854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-oauth-serving-cert\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.163651 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.163631 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-oauth-config\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.163743 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.163671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-serving-cert\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.170871 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.170847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5zfq\" (UniqueName: \"kubernetes.io/projected/db946b68-d8c1-4023-9dce-0f5fd239c4fa-kube-api-access-r5zfq\") pod \"console-b7bcc5b56-hgf74\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:50.270747 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:50.270662 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:55:51.367074 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:51.367033 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-667bd85767-4v446" Apr 23 17:55:51.367527 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:51.367093 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-667bd85767-4v446" Apr 23 17:55:56.256801 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:56.256776 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b7bcc5b56-hgf74"] Apr 23 17:55:56.262440 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:55:56.262411 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb946b68_d8c1_4023_9dce_0f5fd239c4fa.slice/crio-33f3f2a5d7c64efe65daf959c7f194b02508d499ec2c2b27284dfea61d9f0e6d WatchSource:0}: Error finding container 33f3f2a5d7c64efe65daf959c7f194b02508d499ec2c2b27284dfea61d9f0e6d: Status 404 returned error can't find the container with id 33f3f2a5d7c64efe65daf959c7f194b02508d499ec2c2b27284dfea61d9f0e6d Apr 23 17:55:56.585547 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:56.585504 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-zr7bt" event={"ID":"52c4d929-5543-48fc-9138-b1149888b4e1","Type":"ContainerStarted","Data":"ea234aca10e8bad45e2d2eae968ecebc5df8fc1574425aba649a6d12a5a65253"} Apr 23 17:55:56.585818 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:56.585786 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-zr7bt" Apr 23 17:55:56.586820 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:56.586791 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b7bcc5b56-hgf74" 
event={"ID":"db946b68-d8c1-4023-9dce-0f5fd239c4fa","Type":"ContainerStarted","Data":"33f3f2a5d7c64efe65daf959c7f194b02508d499ec2c2b27284dfea61d9f0e6d"} Apr 23 17:55:56.606193 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:56.606157 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-zr7bt" Apr 23 17:55:56.608360 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:56.608310 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-zr7bt" podStartSLOduration=1.115311597 podStartE2EDuration="16.608294358s" podCreationTimestamp="2026-04-23 17:55:40 +0000 UTC" firstStartedPulling="2026-04-23 17:55:40.719833291 +0000 UTC m=+128.265808636" lastFinishedPulling="2026-04-23 17:55:56.212816044 +0000 UTC m=+143.758791397" observedRunningTime="2026-04-23 17:55:56.607468304 +0000 UTC m=+144.153443670" watchObservedRunningTime="2026-04-23 17:55:56.608294358 +0000 UTC m=+144.154269726" Apr 23 17:55:58.377410 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.375935 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-544bbb876d-5qkbv"] Apr 23 17:55:58.413525 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.413424 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-544bbb876d-5qkbv"] Apr 23 17:55:58.413712 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.413644 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.421924 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.421876 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 17:55:58.538390 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.538286 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-oauth-serving-cert\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.538390 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.538347 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90b19321-d185-4d33-bd04-38466e5f290e-console-oauth-config\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.538628 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.538491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-service-ca\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.538628 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.538534 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-trusted-ca-bundle\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " 
pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.538728 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.538707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-console-config\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.538773 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.538756 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90b19321-d185-4d33-bd04-38466e5f290e-console-serving-cert\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.538823 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.538781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2km2d\" (UniqueName: \"kubernetes.io/projected/90b19321-d185-4d33-bd04-38466e5f290e-kube-api-access-2km2d\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.596665 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.596618 2575 generic.go:358] "Generic (PLEG): container finished" podID="c4825f9f-c326-4675-901b-635a9bb75ddc" containerID="7f72b457ebcd9be371a2eefe5ff8ed9fcec84dfea8cea5d9ac19806dfb21cd01" exitCode=0 Apr 23 17:55:58.597009 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.596855 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc" 
event={"ID":"c4825f9f-c326-4675-901b-635a9bb75ddc","Type":"ContainerDied","Data":"7f72b457ebcd9be371a2eefe5ff8ed9fcec84dfea8cea5d9ac19806dfb21cd01"} Apr 23 17:55:58.598060 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.597721 2575 scope.go:117] "RemoveContainer" containerID="7f72b457ebcd9be371a2eefe5ff8ed9fcec84dfea8cea5d9ac19806dfb21cd01" Apr 23 17:55:58.642262 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.639978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-console-config\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.642262 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.640038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90b19321-d185-4d33-bd04-38466e5f290e-console-serving-cert\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.642262 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.640070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2km2d\" (UniqueName: \"kubernetes.io/projected/90b19321-d185-4d33-bd04-38466e5f290e-kube-api-access-2km2d\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.642262 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.640142 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-oauth-serving-cert\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " 
pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.642262 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.640170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90b19321-d185-4d33-bd04-38466e5f290e-console-oauth-config\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.642262 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.640204 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-service-ca\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.642262 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.640228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-trusted-ca-bundle\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.642262 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.641299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-trusted-ca-bundle\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.642262 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.641850 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-console-config\") pod \"console-544bbb876d-5qkbv\" (UID: 
\"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.643365 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.643322 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-oauth-serving-cert\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.644102 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.644075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-service-ca\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.646945 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.646922 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90b19321-d185-4d33-bd04-38466e5f290e-console-serving-cert\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.647751 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.647711 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90b19321-d185-4d33-bd04-38466e5f290e-console-oauth-config\") pod \"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.660200 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.660177 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2km2d\" (UniqueName: \"kubernetes.io/projected/90b19321-d185-4d33-bd04-38466e5f290e-kube-api-access-2km2d\") pod 
\"console-544bbb876d-5qkbv\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:58.729755 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:58.729715 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:55:59.601748 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:59.601714 2575 generic.go:358] "Generic (PLEG): container finished" podID="e7f277ff-c690-4837-8465-5d845f4c966b" containerID="69f0125e720380d608b09b18ab8eec2a2d64b98eb08fa2be011b332d003cdba1" exitCode=0 Apr 23 17:55:59.602164 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:59.601821 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs" event={"ID":"e7f277ff-c690-4837-8465-5d845f4c966b","Type":"ContainerDied","Data":"69f0125e720380d608b09b18ab8eec2a2d64b98eb08fa2be011b332d003cdba1"} Apr 23 17:55:59.602270 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:59.602250 2575 scope.go:117] "RemoveContainer" containerID="69f0125e720380d608b09b18ab8eec2a2d64b98eb08fa2be011b332d003cdba1" Apr 23 17:55:59.720284 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:55:59.720220 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-544bbb876d-5qkbv"] Apr 23 17:55:59.721664 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:55:59.721639 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90b19321_d185_4d33_bd04_38466e5f290e.slice/crio-3b15a35b690bfd5bf7ff1080a0445d111bba333d78edc0df6cd68dc91c04a3ea WatchSource:0}: Error finding container 3b15a35b690bfd5bf7ff1080a0445d111bba333d78edc0df6cd68dc91c04a3ea: Status 404 returned error can't find the container with id 3b15a35b690bfd5bf7ff1080a0445d111bba333d78edc0df6cd68dc91c04a3ea Apr 23 17:56:00.607692 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:00.607606 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k5xtc" event={"ID":"c4825f9f-c326-4675-901b-635a9bb75ddc","Type":"ContainerStarted","Data":"8a5db37d0fb75ddb985dd85a3de8415e1e50785b4e8ffb8096a29aec05e39ecd"} Apr 23 17:56:00.609658 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:00.609630 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-69wxs" event={"ID":"e7f277ff-c690-4837-8465-5d845f4c966b","Type":"ContainerStarted","Data":"70d87957832ae75fa2950102dbb7c936de47514175805d90d5518bd355a6ace7"} Apr 23 17:56:00.611504 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:00.611482 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b7bcc5b56-hgf74" event={"ID":"db946b68-d8c1-4023-9dce-0f5fd239c4fa","Type":"ContainerStarted","Data":"6420fb4a7f92044fd1f104b91b3d135e3371886a62d8af9859c4542ef2f8ce7d"} Apr 23 17:56:00.617906 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:00.617834 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-544bbb876d-5qkbv" event={"ID":"90b19321-d185-4d33-bd04-38466e5f290e","Type":"ContainerStarted","Data":"a69cd5477fae0fc4b4b9f2eece5ca8382f8a990563646cad47cff9ee8ac2fd48"} Apr 23 17:56:00.618015 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:00.617913 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-544bbb876d-5qkbv" event={"ID":"90b19321-d185-4d33-bd04-38466e5f290e","Type":"ContainerStarted","Data":"3b15a35b690bfd5bf7ff1080a0445d111bba333d78edc0df6cd68dc91c04a3ea"} Apr 23 17:56:00.699475 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:00.699414 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-544bbb876d-5qkbv" podStartSLOduration=2.114422165 podStartE2EDuration="2.699393984s" podCreationTimestamp="2026-04-23 17:55:58 
+0000 UTC" firstStartedPulling="2026-04-23 17:55:59.723802955 +0000 UTC m=+147.269778302" lastFinishedPulling="2026-04-23 17:56:00.308774769 +0000 UTC m=+147.854750121" observedRunningTime="2026-04-23 17:56:00.697538656 +0000 UTC m=+148.243514023" watchObservedRunningTime="2026-04-23 17:56:00.699393984 +0000 UTC m=+148.245369352" Apr 23 17:56:00.718412 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:00.718342 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b7bcc5b56-hgf74" podStartSLOduration=7.934586222 podStartE2EDuration="11.718307371s" podCreationTimestamp="2026-04-23 17:55:49 +0000 UTC" firstStartedPulling="2026-04-23 17:55:56.264408096 +0000 UTC m=+143.810383456" lastFinishedPulling="2026-04-23 17:56:00.04812926 +0000 UTC m=+147.594104605" observedRunningTime="2026-04-23 17:56:00.717820916 +0000 UTC m=+148.263796284" watchObservedRunningTime="2026-04-23 17:56:00.718307371 +0000 UTC m=+148.264282740" Apr 23 17:56:03.630762 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:03.630728 2575 generic.go:358] "Generic (PLEG): container finished" podID="ffa5d0d2-4b4f-4472-b414-6aefb709735c" containerID="8ef962fc19d5d548ce3d46afb727213815d89516597e09072244b331a60422ba" exitCode=0 Apr 23 17:56:03.631265 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:03.630811 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zmmnp" event={"ID":"ffa5d0d2-4b4f-4472-b414-6aefb709735c","Type":"ContainerDied","Data":"8ef962fc19d5d548ce3d46afb727213815d89516597e09072244b331a60422ba"} Apr 23 17:56:03.631332 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:03.631299 2575 scope.go:117] "RemoveContainer" containerID="8ef962fc19d5d548ce3d46afb727213815d89516597e09072244b331a60422ba" Apr 23 17:56:04.636832 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:04.636794 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zmmnp" 
event={"ID":"ffa5d0d2-4b4f-4472-b414-6aefb709735c","Type":"ContainerStarted","Data":"b1d498e6f167b56fe87891d770ed159202b87f4488b0ab6e33055271903c2437"} Apr 23 17:56:05.800844 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:05.800813 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-qrm2m_cbf092c0-733e-4d6b-a240-9d95ac93022a/cluster-monitoring-operator/0.log" Apr 23 17:56:06.598237 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:06.598169 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-667bd85767-4v446_eec30db5-c0af-4aa9-b807-89e196d3f094/metrics-server/0.log" Apr 23 17:56:07.598489 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:07.598431 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xnt4q_93846088-277d-4894-b6ef-cc87f01ad6fa/init-textfile/0.log" Apr 23 17:56:07.799594 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:07.799560 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xnt4q_93846088-277d-4894-b6ef-cc87f01ad6fa/node-exporter/0.log" Apr 23 17:56:07.998514 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:07.998486 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xnt4q_93846088-277d-4894-b6ef-cc87f01ad6fa/kube-rbac-proxy/0.log" Apr 23 17:56:08.730177 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:08.730138 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:56:08.730177 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:08.730180 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:56:08.734829 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:08.734807 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:56:09.655113 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:09.655087 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:56:09.705036 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:09.705007 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b7bcc5b56-hgf74"] Apr 23 17:56:10.271357 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:10.271323 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:56:10.799806 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:10.799772 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f55d95dbb-jfdvg_df4f2af2-d944-4a00-b116-126b71c159dc/telemeter-client/0.log" Apr 23 17:56:10.997900 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:10.997846 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f55d95dbb-jfdvg_df4f2af2-d944-4a00-b116-126b71c159dc/reload/0.log" Apr 23 17:56:11.198186 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:11.198146 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f55d95dbb-jfdvg_df4f2af2-d944-4a00-b116-126b71c159dc/kube-rbac-proxy/0.log" Apr 23 17:56:11.372392 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:11.372364 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-667bd85767-4v446" Apr 23 17:56:11.376489 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:11.376469 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-667bd85767-4v446" Apr 23 17:56:11.397520 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:11.397475 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-7556c7f998-x77kb_e108b2f0-3753-4733-bedd-f9e769bbb345/thanos-query/0.log" Apr 23 17:56:11.597668 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:11.597552 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7556c7f998-x77kb_e108b2f0-3753-4733-bedd-f9e769bbb345/kube-rbac-proxy-web/0.log" Apr 23 17:56:11.798692 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:11.798624 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7556c7f998-x77kb_e108b2f0-3753-4733-bedd-f9e769bbb345/kube-rbac-proxy/0.log" Apr 23 17:56:11.997674 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:11.997644 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7556c7f998-x77kb_e108b2f0-3753-4733-bedd-f9e769bbb345/prom-label-proxy/0.log" Apr 23 17:56:12.199151 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:12.199117 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7556c7f998-x77kb_e108b2f0-3753-4733-bedd-f9e769bbb345/kube-rbac-proxy-rules/0.log" Apr 23 17:56:12.398094 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:12.397984 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7556c7f998-x77kb_e108b2f0-3753-4733-bedd-f9e769bbb345/kube-rbac-proxy-metrics/0.log" Apr 23 17:56:12.797838 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:12.797809 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 17:56:13.003825 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:13.003796 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/3.log" Apr 23 
17:56:13.198937 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:13.198903 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-544bbb876d-5qkbv_90b19321-d185-4d33-bd04-38466e5f290e/console/0.log" Apr 23 17:56:13.400194 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:13.400160 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b7bcc5b56-hgf74_db946b68-d8c1-4023-9dce-0f5fd239c4fa/console/0.log" Apr 23 17:56:13.600691 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:13.600596 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-zr7bt_52c4d929-5543-48fc-9138-b1149888b4e1/download-server/0.log" Apr 23 17:56:34.725591 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:34.725527 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-b7bcc5b56-hgf74" podUID="db946b68-d8c1-4023-9dce-0f5fd239c4fa" containerName="console" containerID="cri-o://6420fb4a7f92044fd1f104b91b3d135e3371886a62d8af9859c4542ef2f8ce7d" gracePeriod=15 Apr 23 17:56:34.990290 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:34.990269 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b7bcc5b56-hgf74_db946b68-d8c1-4023-9dce-0f5fd239c4fa/console/0.log" Apr 23 17:56:34.990411 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:34.990326 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:56:35.059179 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.059149 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-config\") pod \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " Apr 23 17:56:35.059373 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.059210 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-oauth-serving-cert\") pod \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " Apr 23 17:56:35.059373 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.059239 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-oauth-config\") pod \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " Apr 23 17:56:35.059373 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.059255 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-serving-cert\") pod \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " Apr 23 17:56:35.059373 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.059289 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5zfq\" (UniqueName: \"kubernetes.io/projected/db946b68-d8c1-4023-9dce-0f5fd239c4fa-kube-api-access-r5zfq\") pod \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " Apr 23 
17:56:35.059373 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.059350 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-service-ca\") pod \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\" (UID: \"db946b68-d8c1-4023-9dce-0f5fd239c4fa\") " Apr 23 17:56:35.059615 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.059558 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-config" (OuterVolumeSpecName: "console-config") pod "db946b68-d8c1-4023-9dce-0f5fd239c4fa" (UID: "db946b68-d8c1-4023-9dce-0f5fd239c4fa"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:56:35.059682 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.059654 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "db946b68-d8c1-4023-9dce-0f5fd239c4fa" (UID: "db946b68-d8c1-4023-9dce-0f5fd239c4fa"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:56:35.059902 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.059851 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-service-ca" (OuterVolumeSpecName: "service-ca") pod "db946b68-d8c1-4023-9dce-0f5fd239c4fa" (UID: "db946b68-d8c1-4023-9dce-0f5fd239c4fa"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:56:35.062197 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.062171 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "db946b68-d8c1-4023-9dce-0f5fd239c4fa" (UID: "db946b68-d8c1-4023-9dce-0f5fd239c4fa"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:56:35.062197 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.062186 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db946b68-d8c1-4023-9dce-0f5fd239c4fa-kube-api-access-r5zfq" (OuterVolumeSpecName: "kube-api-access-r5zfq") pod "db946b68-d8c1-4023-9dce-0f5fd239c4fa" (UID: "db946b68-d8c1-4023-9dce-0f5fd239c4fa"). InnerVolumeSpecName "kube-api-access-r5zfq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:56:35.062328 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.062212 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "db946b68-d8c1-4023-9dce-0f5fd239c4fa" (UID: "db946b68-d8c1-4023-9dce-0f5fd239c4fa"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:56:35.160864 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.160833 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:56:35.160864 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.160857 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-oauth-serving-cert\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:56:35.160864 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.160870 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-oauth-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:56:35.161099 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.160913 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db946b68-d8c1-4023-9dce-0f5fd239c4fa-console-serving-cert\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:56:35.161099 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.160925 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r5zfq\" (UniqueName: \"kubernetes.io/projected/db946b68-d8c1-4023-9dce-0f5fd239c4fa-kube-api-access-r5zfq\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:56:35.161099 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.160934 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db946b68-d8c1-4023-9dce-0f5fd239c4fa-service-ca\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:56:35.737321 ip-10-0-143-131 
kubenswrapper[2575]: I0423 17:56:35.737294 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b7bcc5b56-hgf74_db946b68-d8c1-4023-9dce-0f5fd239c4fa/console/0.log" Apr 23 17:56:35.737710 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.737335 2575 generic.go:358] "Generic (PLEG): container finished" podID="db946b68-d8c1-4023-9dce-0f5fd239c4fa" containerID="6420fb4a7f92044fd1f104b91b3d135e3371886a62d8af9859c4542ef2f8ce7d" exitCode=2 Apr 23 17:56:35.737710 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.737384 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b7bcc5b56-hgf74" event={"ID":"db946b68-d8c1-4023-9dce-0f5fd239c4fa","Type":"ContainerDied","Data":"6420fb4a7f92044fd1f104b91b3d135e3371886a62d8af9859c4542ef2f8ce7d"} Apr 23 17:56:35.737710 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.737406 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b7bcc5b56-hgf74" Apr 23 17:56:35.737710 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.737412 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b7bcc5b56-hgf74" event={"ID":"db946b68-d8c1-4023-9dce-0f5fd239c4fa","Type":"ContainerDied","Data":"33f3f2a5d7c64efe65daf959c7f194b02508d499ec2c2b27284dfea61d9f0e6d"} Apr 23 17:56:35.737710 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.737434 2575 scope.go:117] "RemoveContainer" containerID="6420fb4a7f92044fd1f104b91b3d135e3371886a62d8af9859c4542ef2f8ce7d" Apr 23 17:56:35.746442 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.746373 2575 scope.go:117] "RemoveContainer" containerID="6420fb4a7f92044fd1f104b91b3d135e3371886a62d8af9859c4542ef2f8ce7d" Apr 23 17:56:35.747609 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:56:35.747585 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6420fb4a7f92044fd1f104b91b3d135e3371886a62d8af9859c4542ef2f8ce7d\": container with ID starting with 6420fb4a7f92044fd1f104b91b3d135e3371886a62d8af9859c4542ef2f8ce7d not found: ID does not exist" containerID="6420fb4a7f92044fd1f104b91b3d135e3371886a62d8af9859c4542ef2f8ce7d" Apr 23 17:56:35.747704 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.747617 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6420fb4a7f92044fd1f104b91b3d135e3371886a62d8af9859c4542ef2f8ce7d"} err="failed to get container status \"6420fb4a7f92044fd1f104b91b3d135e3371886a62d8af9859c4542ef2f8ce7d\": rpc error: code = NotFound desc = could not find container \"6420fb4a7f92044fd1f104b91b3d135e3371886a62d8af9859c4542ef2f8ce7d\": container with ID starting with 6420fb4a7f92044fd1f104b91b3d135e3371886a62d8af9859c4542ef2f8ce7d not found: ID does not exist" Apr 23 17:56:35.762207 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.762180 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b7bcc5b56-hgf74"] Apr 23 17:56:35.768761 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:35.768741 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-b7bcc5b56-hgf74"] Apr 23 17:56:37.063502 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:37.063467 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db946b68-d8c1-4023-9dce-0f5fd239c4fa" path="/var/lib/kubelet/pods/db946b68-d8c1-4023-9dce-0f5fd239c4fa/volumes" Apr 23 17:56:57.640034 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.639998 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84bf9f9f94-58fqq"] Apr 23 17:56:57.640541 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.640377 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db946b68-d8c1-4023-9dce-0f5fd239c4fa" containerName="console" Apr 23 17:56:57.640541 ip-10-0-143-131 
kubenswrapper[2575]: I0423 17:56:57.640389 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="db946b68-d8c1-4023-9dce-0f5fd239c4fa" containerName="console" Apr 23 17:56:57.640541 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.640460 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="db946b68-d8c1-4023-9dce-0f5fd239c4fa" containerName="console" Apr 23 17:56:57.644785 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.644763 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.668283 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.668213 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84bf9f9f94-58fqq"] Apr 23 17:56:57.754851 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.754817 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-trusted-ca-bundle\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.754851 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.754859 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlsdz\" (UniqueName: \"kubernetes.io/projected/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-kube-api-access-tlsdz\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.755084 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.754898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-console-config\") pod \"console-84bf9f9f94-58fqq\" (UID: 
\"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.755084 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.754929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-oauth-serving-cert\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.755084 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.754994 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-console-oauth-config\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.755084 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.755051 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-service-ca\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.755084 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.755081 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-console-serving-cert\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.856200 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.856163 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-service-ca\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.856376 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.856212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-console-serving-cert\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.856376 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.856261 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-trusted-ca-bundle\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.856376 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.856284 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlsdz\" (UniqueName: \"kubernetes.io/projected/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-kube-api-access-tlsdz\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.856376 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.856311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-console-config\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.856621 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.856480 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-oauth-serving-cert\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.856621 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.856556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-console-oauth-config\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.857127 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.857026 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-service-ca\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.857217 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.857152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-trusted-ca-bundle\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.857217 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.857204 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-oauth-serving-cert\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.857301 ip-10-0-143-131 kubenswrapper[2575]: I0423 
17:56:57.857102 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-console-config\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.858953 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.858935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-console-serving-cert\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.858993 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.858938 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-console-oauth-config\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.864385 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.864367 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlsdz\" (UniqueName: \"kubernetes.io/projected/69bbed3b-a24c-4cea-b6ad-f05dd97236eb-kube-api-access-tlsdz\") pod \"console-84bf9f9f94-58fqq\" (UID: \"69bbed3b-a24c-4cea-b6ad-f05dd97236eb\") " pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:57.957764 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:57.957731 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:56:58.101041 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:58.100958 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84bf9f9f94-58fqq"] Apr 23 17:56:58.103960 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:56:58.103928 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69bbed3b_a24c_4cea_b6ad_f05dd97236eb.slice/crio-f3024b07392532131925c65fa8fb8b41bc55bfa487950867a64baec814deedd1 WatchSource:0}: Error finding container f3024b07392532131925c65fa8fb8b41bc55bfa487950867a64baec814deedd1: Status 404 returned error can't find the container with id f3024b07392532131925c65fa8fb8b41bc55bfa487950867a64baec814deedd1 Apr 23 17:56:58.804181 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:58.804145 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84bf9f9f94-58fqq" event={"ID":"69bbed3b-a24c-4cea-b6ad-f05dd97236eb","Type":"ContainerStarted","Data":"95c1b5254a1762e6d95b6e06a224b62d03f0e37418be884a4c0ee86c83e75781"} Apr 23 17:56:58.804181 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:58.804188 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84bf9f9f94-58fqq" event={"ID":"69bbed3b-a24c-4cea-b6ad-f05dd97236eb","Type":"ContainerStarted","Data":"f3024b07392532131925c65fa8fb8b41bc55bfa487950867a64baec814deedd1"} Apr 23 17:56:58.822580 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:56:58.822524 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84bf9f9f94-58fqq" podStartSLOduration=1.8225052069999998 podStartE2EDuration="1.822505207s" podCreationTimestamp="2026-04-23 17:56:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:56:58.82110317 +0000 UTC 
m=+206.367078535" watchObservedRunningTime="2026-04-23 17:56:58.822505207 +0000 UTC m=+206.368480574" Apr 23 17:57:07.958688 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:07.958649 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:57:07.959128 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:07.958731 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:57:07.963633 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:07.963608 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:57:08.836212 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:08.836186 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84bf9f9f94-58fqq" Apr 23 17:57:08.878531 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:08.878499 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-544bbb876d-5qkbv"] Apr 23 17:57:13.003015 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.002981 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-j744d"] Apr 23 17:57:13.006340 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.006320 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-j744d" Apr 23 17:57:13.008402 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.008384 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 17:57:13.012841 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.012817 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-j744d"] Apr 23 17:57:13.090658 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.090629 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c5623e36-755b-417a-a6af-21d001c67630-kubelet-config\") pod \"global-pull-secret-syncer-j744d\" (UID: \"c5623e36-755b-417a-a6af-21d001c67630\") " pod="kube-system/global-pull-secret-syncer-j744d" Apr 23 17:57:13.090819 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.090671 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5623e36-755b-417a-a6af-21d001c67630-original-pull-secret\") pod \"global-pull-secret-syncer-j744d\" (UID: \"c5623e36-755b-417a-a6af-21d001c67630\") " pod="kube-system/global-pull-secret-syncer-j744d" Apr 23 17:57:13.090819 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.090748 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c5623e36-755b-417a-a6af-21d001c67630-dbus\") pod \"global-pull-secret-syncer-j744d\" (UID: \"c5623e36-755b-417a-a6af-21d001c67630\") " pod="kube-system/global-pull-secret-syncer-j744d" Apr 23 17:57:13.191478 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.191439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/c5623e36-755b-417a-a6af-21d001c67630-kubelet-config\") pod \"global-pull-secret-syncer-j744d\" (UID: \"c5623e36-755b-417a-a6af-21d001c67630\") " pod="kube-system/global-pull-secret-syncer-j744d" Apr 23 17:57:13.191646 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.191491 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5623e36-755b-417a-a6af-21d001c67630-original-pull-secret\") pod \"global-pull-secret-syncer-j744d\" (UID: \"c5623e36-755b-417a-a6af-21d001c67630\") " pod="kube-system/global-pull-secret-syncer-j744d" Apr 23 17:57:13.191646 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.191543 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c5623e36-755b-417a-a6af-21d001c67630-dbus\") pod \"global-pull-secret-syncer-j744d\" (UID: \"c5623e36-755b-417a-a6af-21d001c67630\") " pod="kube-system/global-pull-secret-syncer-j744d" Apr 23 17:57:13.191646 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.191561 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c5623e36-755b-417a-a6af-21d001c67630-kubelet-config\") pod \"global-pull-secret-syncer-j744d\" (UID: \"c5623e36-755b-417a-a6af-21d001c67630\") " pod="kube-system/global-pull-secret-syncer-j744d" Apr 23 17:57:13.191832 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.191707 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c5623e36-755b-417a-a6af-21d001c67630-dbus\") pod \"global-pull-secret-syncer-j744d\" (UID: \"c5623e36-755b-417a-a6af-21d001c67630\") " pod="kube-system/global-pull-secret-syncer-j744d" Apr 23 17:57:13.194005 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.193982 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5623e36-755b-417a-a6af-21d001c67630-original-pull-secret\") pod \"global-pull-secret-syncer-j744d\" (UID: \"c5623e36-755b-417a-a6af-21d001c67630\") " pod="kube-system/global-pull-secret-syncer-j744d" Apr 23 17:57:13.316058 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.315985 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-j744d" Apr 23 17:57:13.440354 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.440231 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-j744d"] Apr 23 17:57:13.443319 ip-10-0-143-131 kubenswrapper[2575]: W0423 17:57:13.443293 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5623e36_755b_417a_a6af_21d001c67630.slice/crio-03632c2dbc09204c18ebf24a54c114d724e73637feed5ca39ef471c3fc9e0078 WatchSource:0}: Error finding container 03632c2dbc09204c18ebf24a54c114d724e73637feed5ca39ef471c3fc9e0078: Status 404 returned error can't find the container with id 03632c2dbc09204c18ebf24a54c114d724e73637feed5ca39ef471c3fc9e0078 Apr 23 17:57:13.848211 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:13.848164 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-j744d" event={"ID":"c5623e36-755b-417a-a6af-21d001c67630","Type":"ContainerStarted","Data":"03632c2dbc09204c18ebf24a54c114d724e73637feed5ca39ef471c3fc9e0078"} Apr 23 17:57:17.861512 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:17.861472 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-j744d" event={"ID":"c5623e36-755b-417a-a6af-21d001c67630","Type":"ContainerStarted","Data":"8b23aa374aa7ba21cb73a53fefc4ad9d146b77d20019d2020d0b40a96a474a9b"} Apr 23 17:57:17.879961 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:17.879916 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-j744d" podStartSLOduration=2.432108233 podStartE2EDuration="5.879900521s" podCreationTimestamp="2026-04-23 17:57:12 +0000 UTC" firstStartedPulling="2026-04-23 17:57:13.445320124 +0000 UTC m=+220.991295468" lastFinishedPulling="2026-04-23 17:57:16.893112407 +0000 UTC m=+224.439087756" observedRunningTime="2026-04-23 17:57:17.878692817 +0000 UTC m=+225.424668184" watchObservedRunningTime="2026-04-23 17:57:17.879900521 +0000 UTC m=+225.425875885" Apr 23 17:57:33.898971 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:33.898910 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-544bbb876d-5qkbv" podUID="90b19321-d185-4d33-bd04-38466e5f290e" containerName="console" containerID="cri-o://a69cd5477fae0fc4b4b9f2eece5ca8382f8a990563646cad47cff9ee8ac2fd48" gracePeriod=15 Apr 23 17:57:34.152832 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.152779 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-544bbb876d-5qkbv_90b19321-d185-4d33-bd04-38466e5f290e/console/0.log" Apr 23 17:57:34.152974 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.152837 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:57:34.274780 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.274746 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-service-ca\") pod \"90b19321-d185-4d33-bd04-38466e5f290e\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " Apr 23 17:57:34.274780 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.274780 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2km2d\" (UniqueName: \"kubernetes.io/projected/90b19321-d185-4d33-bd04-38466e5f290e-kube-api-access-2km2d\") pod \"90b19321-d185-4d33-bd04-38466e5f290e\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " Apr 23 17:57:34.275023 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.274815 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90b19321-d185-4d33-bd04-38466e5f290e-console-oauth-config\") pod \"90b19321-d185-4d33-bd04-38466e5f290e\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " Apr 23 17:57:34.275023 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.274830 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90b19321-d185-4d33-bd04-38466e5f290e-console-serving-cert\") pod \"90b19321-d185-4d33-bd04-38466e5f290e\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " Apr 23 17:57:34.275023 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.274870 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-oauth-serving-cert\") pod \"90b19321-d185-4d33-bd04-38466e5f290e\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " Apr 23 17:57:34.275023 
ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.274941 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-trusted-ca-bundle\") pod \"90b19321-d185-4d33-bd04-38466e5f290e\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " Apr 23 17:57:34.275023 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.274956 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-console-config\") pod \"90b19321-d185-4d33-bd04-38466e5f290e\" (UID: \"90b19321-d185-4d33-bd04-38466e5f290e\") " Apr 23 17:57:34.275275 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.275225 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-service-ca" (OuterVolumeSpecName: "service-ca") pod "90b19321-d185-4d33-bd04-38466e5f290e" (UID: "90b19321-d185-4d33-bd04-38466e5f290e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:57:34.275481 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.275449 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-console-config" (OuterVolumeSpecName: "console-config") pod "90b19321-d185-4d33-bd04-38466e5f290e" (UID: "90b19321-d185-4d33-bd04-38466e5f290e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:57:34.275570 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.275488 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "90b19321-d185-4d33-bd04-38466e5f290e" (UID: "90b19321-d185-4d33-bd04-38466e5f290e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:57:34.275615 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.275586 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "90b19321-d185-4d33-bd04-38466e5f290e" (UID: "90b19321-d185-4d33-bd04-38466e5f290e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:57:34.277441 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.277420 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b19321-d185-4d33-bd04-38466e5f290e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "90b19321-d185-4d33-bd04-38466e5f290e" (UID: "90b19321-d185-4d33-bd04-38466e5f290e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:34.277654 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.277636 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b19321-d185-4d33-bd04-38466e5f290e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "90b19321-d185-4d33-bd04-38466e5f290e" (UID: "90b19321-d185-4d33-bd04-38466e5f290e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:34.277718 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.277643 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b19321-d185-4d33-bd04-38466e5f290e-kube-api-access-2km2d" (OuterVolumeSpecName: "kube-api-access-2km2d") pod "90b19321-d185-4d33-bd04-38466e5f290e" (UID: "90b19321-d185-4d33-bd04-38466e5f290e"). InnerVolumeSpecName "kube-api-access-2km2d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:57:34.376263 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.376230 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-service-ca\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:57:34.376263 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.376256 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2km2d\" (UniqueName: \"kubernetes.io/projected/90b19321-d185-4d33-bd04-38466e5f290e-kube-api-access-2km2d\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:57:34.376263 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.376266 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90b19321-d185-4d33-bd04-38466e5f290e-console-oauth-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:57:34.376458 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.376274 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90b19321-d185-4d33-bd04-38466e5f290e-console-serving-cert\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:57:34.376458 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.376283 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-oauth-serving-cert\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:57:34.376458 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.376292 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-trusted-ca-bundle\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:57:34.376458 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.376300 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90b19321-d185-4d33-bd04-38466e5f290e-console-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 17:57:34.913727 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.913702 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-544bbb876d-5qkbv_90b19321-d185-4d33-bd04-38466e5f290e/console/0.log" Apr 23 17:57:34.914146 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.913742 2575 generic.go:358] "Generic (PLEG): container finished" podID="90b19321-d185-4d33-bd04-38466e5f290e" containerID="a69cd5477fae0fc4b4b9f2eece5ca8382f8a990563646cad47cff9ee8ac2fd48" exitCode=2 Apr 23 17:57:34.914146 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.913806 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-544bbb876d-5qkbv" event={"ID":"90b19321-d185-4d33-bd04-38466e5f290e","Type":"ContainerDied","Data":"a69cd5477fae0fc4b4b9f2eece5ca8382f8a990563646cad47cff9ee8ac2fd48"} Apr 23 17:57:34.914146 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.913838 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-544bbb876d-5qkbv" Apr 23 17:57:34.914146 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.913853 2575 scope.go:117] "RemoveContainer" containerID="a69cd5477fae0fc4b4b9f2eece5ca8382f8a990563646cad47cff9ee8ac2fd48" Apr 23 17:57:34.914146 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.913841 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-544bbb876d-5qkbv" event={"ID":"90b19321-d185-4d33-bd04-38466e5f290e","Type":"ContainerDied","Data":"3b15a35b690bfd5bf7ff1080a0445d111bba333d78edc0df6cd68dc91c04a3ea"} Apr 23 17:57:34.922375 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.922358 2575 scope.go:117] "RemoveContainer" containerID="a69cd5477fae0fc4b4b9f2eece5ca8382f8a990563646cad47cff9ee8ac2fd48" Apr 23 17:57:34.922619 ip-10-0-143-131 kubenswrapper[2575]: E0423 17:57:34.922605 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69cd5477fae0fc4b4b9f2eece5ca8382f8a990563646cad47cff9ee8ac2fd48\": container with ID starting with a69cd5477fae0fc4b4b9f2eece5ca8382f8a990563646cad47cff9ee8ac2fd48 not found: ID does not exist" containerID="a69cd5477fae0fc4b4b9f2eece5ca8382f8a990563646cad47cff9ee8ac2fd48" Apr 23 17:57:34.922662 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.922627 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69cd5477fae0fc4b4b9f2eece5ca8382f8a990563646cad47cff9ee8ac2fd48"} err="failed to get container status \"a69cd5477fae0fc4b4b9f2eece5ca8382f8a990563646cad47cff9ee8ac2fd48\": rpc error: code = NotFound desc = could not find container \"a69cd5477fae0fc4b4b9f2eece5ca8382f8a990563646cad47cff9ee8ac2fd48\": container with ID starting with a69cd5477fae0fc4b4b9f2eece5ca8382f8a990563646cad47cff9ee8ac2fd48 not found: ID does not exist" Apr 23 17:57:34.933712 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.933690 2575 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-544bbb876d-5qkbv"] Apr 23 17:57:34.937506 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:34.937484 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-544bbb876d-5qkbv"] Apr 23 17:57:35.061356 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:57:35.061313 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90b19321-d185-4d33-bd04-38466e5f290e" path="/var/lib/kubelet/pods/90b19321-d185-4d33-bd04-38466e5f290e/volumes" Apr 23 17:58:32.964266 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:58:32.964218 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 17:58:32.964802 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:58:32.964352 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 17:58:32.974343 ip-10-0-143-131 kubenswrapper[2575]: I0423 17:58:32.974321 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 18:03:32.986358 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:03:32.986329 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 18:03:32.987042 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:03:32.987024 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 18:08:33.007452 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:08:33.007423 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 18:08:33.008825 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:08:33.008802 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 18:10:59.567359 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.567322 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6"] Apr 23 18:10:59.567910 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.567636 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90b19321-d185-4d33-bd04-38466e5f290e" containerName="console" Apr 23 18:10:59.567910 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.567648 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b19321-d185-4d33-bd04-38466e5f290e" containerName="console" Apr 23 18:10:59.567910 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.567708 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="90b19321-d185-4d33-bd04-38466e5f290e" containerName="console" Apr 23 18:10:59.570696 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.570678 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" Apr 23 18:10:59.572709 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.572690 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 18:10:59.572834 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.572818 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-be430-predictor-serving-cert\"" Apr 23 18:10:59.572957 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.572938 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-be430-kube-rbac-proxy-sar-config\"" Apr 23 18:10:59.573006 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.572960 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:10:59.573249 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.573235 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-b65vr\"" Apr 23 18:10:59.582196 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.582171 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6"] Apr 23 18:10:59.693568 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.693529 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9103716-7953-41cb-aa5a-fcd84af6c727-proxy-tls\") pod \"success-200-isvc-be430-predictor-7c764dd68b-m4lt6\" (UID: \"d9103716-7953-41cb-aa5a-fcd84af6c727\") " pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" Apr 23 18:10:59.693772 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.693588 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv9r7\" (UniqueName: \"kubernetes.io/projected/d9103716-7953-41cb-aa5a-fcd84af6c727-kube-api-access-rv9r7\") pod \"success-200-isvc-be430-predictor-7c764dd68b-m4lt6\" (UID: \"d9103716-7953-41cb-aa5a-fcd84af6c727\") " pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" Apr 23 18:10:59.693772 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.693687 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-be430-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9103716-7953-41cb-aa5a-fcd84af6c727-success-200-isvc-be430-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-be430-predictor-7c764dd68b-m4lt6\" (UID: \"d9103716-7953-41cb-aa5a-fcd84af6c727\") " pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" Apr 23 18:10:59.794848 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.794804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-be430-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9103716-7953-41cb-aa5a-fcd84af6c727-success-200-isvc-be430-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-be430-predictor-7c764dd68b-m4lt6\" (UID: \"d9103716-7953-41cb-aa5a-fcd84af6c727\") " pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" Apr 23 18:10:59.795050 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.794923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9103716-7953-41cb-aa5a-fcd84af6c727-proxy-tls\") pod \"success-200-isvc-be430-predictor-7c764dd68b-m4lt6\" (UID: \"d9103716-7953-41cb-aa5a-fcd84af6c727\") " pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" Apr 23 18:10:59.795050 ip-10-0-143-131 kubenswrapper[2575]: I0423 
18:10:59.794990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rv9r7\" (UniqueName: \"kubernetes.io/projected/d9103716-7953-41cb-aa5a-fcd84af6c727-kube-api-access-rv9r7\") pod \"success-200-isvc-be430-predictor-7c764dd68b-m4lt6\" (UID: \"d9103716-7953-41cb-aa5a-fcd84af6c727\") " pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" Apr 23 18:10:59.795155 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:10:59.795095 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-be430-predictor-serving-cert: secret "success-200-isvc-be430-predictor-serving-cert" not found Apr 23 18:10:59.795553 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:10:59.795530 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9103716-7953-41cb-aa5a-fcd84af6c727-proxy-tls podName:d9103716-7953-41cb-aa5a-fcd84af6c727 nodeName:}" failed. No retries permitted until 2026-04-23 18:11:00.29550266 +0000 UTC m=+1047.841478019 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d9103716-7953-41cb-aa5a-fcd84af6c727-proxy-tls") pod "success-200-isvc-be430-predictor-7c764dd68b-m4lt6" (UID: "d9103716-7953-41cb-aa5a-fcd84af6c727") : secret "success-200-isvc-be430-predictor-serving-cert" not found Apr 23 18:10:59.798917 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.795861 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-be430-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9103716-7953-41cb-aa5a-fcd84af6c727-success-200-isvc-be430-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-be430-predictor-7c764dd68b-m4lt6\" (UID: \"d9103716-7953-41cb-aa5a-fcd84af6c727\") " pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" Apr 23 18:10:59.806319 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.806293 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv9r7\" (UniqueName: \"kubernetes.io/projected/d9103716-7953-41cb-aa5a-fcd84af6c727-kube-api-access-rv9r7\") pod \"success-200-isvc-be430-predictor-7c764dd68b-m4lt6\" (UID: \"d9103716-7953-41cb-aa5a-fcd84af6c727\") " pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" Apr 23 18:10:59.847749 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.847670 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j"] Apr 23 18:10:59.851181 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.851164 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" Apr 23 18:10:59.853145 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.853123 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-be430-kube-rbac-proxy-sar-config\"" Apr 23 18:10:59.853243 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.853122 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-be430-predictor-serving-cert\"" Apr 23 18:10:59.859861 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.859835 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j"] Apr 23 18:10:59.996689 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.996647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-be430-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-error-404-isvc-be430-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-be430-predictor-6bcf86bc7-2wc2j\" (UID: \"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\") " pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" Apr 23 18:10:59.996863 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.996714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-proxy-tls\") pod \"error-404-isvc-be430-predictor-6bcf86bc7-2wc2j\" (UID: \"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\") " pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" Apr 23 18:10:59.996863 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:10:59.996792 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbj68\" (UniqueName: 
\"kubernetes.io/projected/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-kube-api-access-qbj68\") pod \"error-404-isvc-be430-predictor-6bcf86bc7-2wc2j\" (UID: \"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\") " pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" Apr 23 18:11:00.097460 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.097425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbj68\" (UniqueName: \"kubernetes.io/projected/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-kube-api-access-qbj68\") pod \"error-404-isvc-be430-predictor-6bcf86bc7-2wc2j\" (UID: \"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\") " pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" Apr 23 18:11:00.097676 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.097479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-be430-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-error-404-isvc-be430-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-be430-predictor-6bcf86bc7-2wc2j\" (UID: \"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\") " pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" Apr 23 18:11:00.097676 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.097542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-proxy-tls\") pod \"error-404-isvc-be430-predictor-6bcf86bc7-2wc2j\" (UID: \"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\") " pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" Apr 23 18:11:00.098161 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.098109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-be430-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-error-404-isvc-be430-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-be430-predictor-6bcf86bc7-2wc2j\" (UID: \"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\") " pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" Apr 23 18:11:00.100157 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.100135 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-proxy-tls\") pod \"error-404-isvc-be430-predictor-6bcf86bc7-2wc2j\" (UID: \"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\") " pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" Apr 23 18:11:00.106001 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.105975 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbj68\" (UniqueName: \"kubernetes.io/projected/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-kube-api-access-qbj68\") pod \"error-404-isvc-be430-predictor-6bcf86bc7-2wc2j\" (UID: \"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\") " pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" Apr 23 18:11:00.162574 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.162532 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" Apr 23 18:11:00.293803 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.293715 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j"] Apr 23 18:11:00.296162 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:11:00.296134 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod168fefc9_6f1d_42a3_8e99_e9ae9bee9760.slice/crio-3c4c50bf171028d940364bed5af040fc9b45d804d4747df277f9d62b14b3aad2 WatchSource:0}: Error finding container 3c4c50bf171028d940364bed5af040fc9b45d804d4747df277f9d62b14b3aad2: Status 404 returned error can't find the container with id 3c4c50bf171028d940364bed5af040fc9b45d804d4747df277f9d62b14b3aad2 Apr 23 18:11:00.297925 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.297908 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:11:00.299660 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.299627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9103716-7953-41cb-aa5a-fcd84af6c727-proxy-tls\") pod \"success-200-isvc-be430-predictor-7c764dd68b-m4lt6\" (UID: \"d9103716-7953-41cb-aa5a-fcd84af6c727\") " pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" Apr 23 18:11:00.302250 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.302231 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9103716-7953-41cb-aa5a-fcd84af6c727-proxy-tls\") pod \"success-200-isvc-be430-predictor-7c764dd68b-m4lt6\" (UID: \"d9103716-7953-41cb-aa5a-fcd84af6c727\") " pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" Apr 23 18:11:00.315543 ip-10-0-143-131 kubenswrapper[2575]: I0423 
18:11:00.315510 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" event={"ID":"168fefc9-6f1d-42a3-8e99-e9ae9bee9760","Type":"ContainerStarted","Data":"3c4c50bf171028d940364bed5af040fc9b45d804d4747df277f9d62b14b3aad2"} Apr 23 18:11:00.482091 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.482050 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" Apr 23 18:11:00.611424 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.611392 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9"] Apr 23 18:11:00.616282 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.616258 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:11:00.618451 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.618427 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-predictor-serving-cert\"" Apr 23 18:11:00.618585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.618496 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\"" Apr 23 18:11:00.619574 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.619554 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6"] Apr 23 18:11:00.622635 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:11:00.622614 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9103716_7953_41cb_aa5a_fcd84af6c727.slice/crio-7d5c29ae735ee6c8bd96fcba9998dc42e2239ba755310baad2a23eca4f00b726 WatchSource:0}: 
Error finding container 7d5c29ae735ee6c8bd96fcba9998dc42e2239ba755310baad2a23eca4f00b726: Status 404 returned error can't find the container with id 7d5c29ae735ee6c8bd96fcba9998dc42e2239ba755310baad2a23eca4f00b726 Apr 23 18:11:00.627788 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.627762 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9"] Apr 23 18:11:00.703610 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.703577 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlpk8\" (UniqueName: \"kubernetes.io/projected/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-kube-api-access-xlpk8\") pod \"isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:11:00.703787 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.703629 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:11:00.703787 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.703723 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:11:00.703787 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.703755 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:11:00.804662 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.804568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:11:00.804662 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.804625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:11:00.804923 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.804690 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlpk8\" (UniqueName: \"kubernetes.io/projected/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-kube-api-access-xlpk8\") pod \"isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:11:00.804923 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.804743 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:11:00.805241 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.805210 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:11:00.805402 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.805382 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:11:00.807354 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.807332 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:11:00.813057 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.813035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlpk8\" (UniqueName: 
\"kubernetes.io/projected/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-kube-api-access-xlpk8\") pod \"isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:11:00.929448 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:00.929412 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:11:01.079671 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:01.079499 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9"] Apr 23 18:11:01.092447 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:11:01.092409 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7b621e1_9f6f_4cdf_99ad_766c2ff177fa.slice/crio-42169bc5f5765a283094afe7cf2f4a3ff424e3b2a37af39750c56e78ddc4cb31 WatchSource:0}: Error finding container 42169bc5f5765a283094afe7cf2f4a3ff424e3b2a37af39750c56e78ddc4cb31: Status 404 returned error can't find the container with id 42169bc5f5765a283094afe7cf2f4a3ff424e3b2a37af39750c56e78ddc4cb31 Apr 23 18:11:01.326494 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:01.326451 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" event={"ID":"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa","Type":"ContainerStarted","Data":"42169bc5f5765a283094afe7cf2f4a3ff424e3b2a37af39750c56e78ddc4cb31"} Apr 23 18:11:01.330278 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:01.330121 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" event={"ID":"d9103716-7953-41cb-aa5a-fcd84af6c727","Type":"ContainerStarted","Data":"7d5c29ae735ee6c8bd96fcba9998dc42e2239ba755310baad2a23eca4f00b726"} Apr 23 
18:11:19.420441 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:19.420396 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" event={"ID":"168fefc9-6f1d-42a3-8e99-e9ae9bee9760","Type":"ContainerStarted","Data":"11d928aa506fd40c9ed81d555af25c9b33497840093cab13e2f3cde75391db2c"} Apr 23 18:11:19.422427 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:19.422394 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" event={"ID":"d9103716-7953-41cb-aa5a-fcd84af6c727","Type":"ContainerStarted","Data":"622bc23def508707fa6b9fd31e4b5ecb040cf4d178151343ff8c2263c2c61ea1"} Apr 23 18:11:19.424547 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:19.424518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" event={"ID":"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa","Type":"ContainerStarted","Data":"c0e7a76ceaa084cd28b33c44e94b0a92095c053a4f5859666d57ac510f145a64"} Apr 23 18:11:21.433873 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:21.433802 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" event={"ID":"168fefc9-6f1d-42a3-8e99-e9ae9bee9760","Type":"ContainerStarted","Data":"a9bea182518c1031ac41503ef37d1e60abe5e277ab10c978fb2c47e793e6629d"} Apr 23 18:11:21.434380 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:21.433943 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" Apr 23 18:11:21.435433 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:21.435412 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" 
event={"ID":"d9103716-7953-41cb-aa5a-fcd84af6c727","Type":"ContainerStarted","Data":"dc64e1171f3325d5a2edd1daca4c5a8ee9fcf3e66ba227e019df6b639eb1fc72"} Apr 23 18:11:21.435544 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:21.435506 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" Apr 23 18:11:21.453902 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:21.453807 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" podStartSLOduration=1.512365552 podStartE2EDuration="22.453792335s" podCreationTimestamp="2026-04-23 18:10:59 +0000 UTC" firstStartedPulling="2026-04-23 18:11:00.298037367 +0000 UTC m=+1047.844012712" lastFinishedPulling="2026-04-23 18:11:21.239464149 +0000 UTC m=+1068.785439495" observedRunningTime="2026-04-23 18:11:21.451849937 +0000 UTC m=+1068.997825298" watchObservedRunningTime="2026-04-23 18:11:21.453792335 +0000 UTC m=+1068.999767702" Apr 23 18:11:21.470936 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:21.470416 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" podStartSLOduration=1.86740439 podStartE2EDuration="22.470396006s" podCreationTimestamp="2026-04-23 18:10:59 +0000 UTC" firstStartedPulling="2026-04-23 18:11:00.62533448 +0000 UTC m=+1048.171309840" lastFinishedPulling="2026-04-23 18:11:21.228326112 +0000 UTC m=+1068.774301456" observedRunningTime="2026-04-23 18:11:21.470132762 +0000 UTC m=+1069.016108131" watchObservedRunningTime="2026-04-23 18:11:21.470396006 +0000 UTC m=+1069.016371373" Apr 23 18:11:22.439304 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:22.439258 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" Apr 23 18:11:22.439304 
ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:22.439311 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6"
Apr 23 18:11:22.440413 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:22.440380 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 18:11:22.440545 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:22.440451 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:11:23.442750 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:23.442712 2575 generic.go:358] "Generic (PLEG): container finished" podID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerID="c0e7a76ceaa084cd28b33c44e94b0a92095c053a4f5859666d57ac510f145a64" exitCode=0
Apr 23 18:11:23.443176 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:23.442793 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" event={"ID":"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa","Type":"ContainerDied","Data":"c0e7a76ceaa084cd28b33c44e94b0a92095c053a4f5859666d57ac510f145a64"}
Apr 23 18:11:23.443292 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:23.443265 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:11:23.443378 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:23.443265 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 18:11:28.448971 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:28.448169 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j"
Apr 23 18:11:28.448971 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:28.448804 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6"
Apr 23 18:11:28.448971 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:28.448807 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 18:11:28.449724 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:28.449475 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:11:30.470204 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:30.470170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" event={"ID":"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa","Type":"ContainerStarted","Data":"c72d6fd949de22dd3a2ae6b0742c8eee080107690c07fa8bb3b8c8f2f227f349"}
Apr 23 18:11:30.470204 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:30.470211 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" event={"ID":"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa","Type":"ContainerStarted","Data":"c9fce760f15f9cbbbf9ee5c66a720ab7c0b332c31872e23161c28b10062d2f17"}
Apr 23 18:11:31.473320 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:31.473291 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9"
Apr 23 18:11:31.494775 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:31.494720 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" podStartSLOduration=2.354177916 podStartE2EDuration="31.494702342s" podCreationTimestamp="2026-04-23 18:11:00 +0000 UTC" firstStartedPulling="2026-04-23 18:11:01.093785393 +0000 UTC m=+1048.639760739" lastFinishedPulling="2026-04-23 18:11:30.234309814 +0000 UTC m=+1077.780285165" observedRunningTime="2026-04-23 18:11:31.493562841 +0000 UTC m=+1079.039538208" watchObservedRunningTime="2026-04-23 18:11:31.494702342 +0000 UTC m=+1079.040677709"
Apr 23 18:11:32.476398 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:32.476366 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9"
Apr 23 18:11:32.477716 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:32.477685 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 23 18:11:33.479951 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:33.479913 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 23 18:11:38.448765 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:38.448719 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 18:11:38.449934 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:38.449907 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:11:38.484229 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:38.484196 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9"
Apr 23 18:11:38.484843 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:38.484815 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 23 18:11:48.448855 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:48.448814 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 18:11:48.450053 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:48.450029 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:11:48.484778 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:48.484730 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 23 18:11:58.449128 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:58.449088 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 18:11:58.449564 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:58.449538 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:11:58.485756 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:11:58.485712 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 23 18:12:08.449136 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:08.449096 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 18:12:08.450897 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:08.450861 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6"
Apr 23 18:12:08.485124 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:08.485079 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 23 18:12:18.449044 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:18.449013 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j"
Apr 23 18:12:18.485053 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:18.485004 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 23 18:12:28.485794 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:28.485753 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 23 18:12:33.650390 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.650351 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6"]
Apr 23 18:12:33.650848 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.650757 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerName="kserve-container" containerID="cri-o://622bc23def508707fa6b9fd31e4b5ecb040cf4d178151343ff8c2263c2c61ea1" gracePeriod=30
Apr 23 18:12:33.650848 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.650795 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerName="kube-rbac-proxy" containerID="cri-o://dc64e1171f3325d5a2edd1daca4c5a8ee9fcf3e66ba227e019df6b639eb1fc72" gracePeriod=30
Apr 23 18:12:33.692573 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.692538 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j"]
Apr 23 18:12:33.692966 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.692923 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kserve-container" containerID="cri-o://11d928aa506fd40c9ed81d555af25c9b33497840093cab13e2f3cde75391db2c" gracePeriod=30
Apr 23 18:12:33.693078 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.692959 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kube-rbac-proxy" containerID="cri-o://a9bea182518c1031ac41503ef37d1e60abe5e277ab10c978fb2c47e793e6629d" gracePeriod=30
Apr 23 18:12:33.730060 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.730026 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"]
Apr 23 18:12:33.733701 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.733679 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:12:33.735640 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.735611 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-03167-predictor-serving-cert\""
Apr 23 18:12:33.735785 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.735646 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-03167-kube-rbac-proxy-sar-config\""
Apr 23 18:12:33.742187 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.742162 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"]
Apr 23 18:12:33.785047 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.785004 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"]
Apr 23 18:12:33.788480 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.788455 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:12:33.790219 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.790196 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-03167-predictor-serving-cert\""
Apr 23 18:12:33.790342 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.790223 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-03167-kube-rbac-proxy-sar-config\""
Apr 23 18:12:33.797541 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.797501 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"]
Apr 23 18:12:33.866689 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.866646 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-03167-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/459bf2e3-dad1-46f5-bc0c-8705379a964c-success-200-isvc-03167-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-03167-predictor-79d9b874d8-7mgnd\" (UID: \"459bf2e3-dad1-46f5-bc0c-8705379a964c\") " pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:12:33.866875 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.866754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-03167-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cd83c524-2cee-4a78-b741-74d2bed83cce-error-404-isvc-03167-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-03167-predictor-659dbf79ff-tgwf9\" (UID: \"cd83c524-2cee-4a78-b741-74d2bed83cce\") " pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:12:33.866875 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.866784 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmjr6\" (UniqueName: \"kubernetes.io/projected/459bf2e3-dad1-46f5-bc0c-8705379a964c-kube-api-access-dmjr6\") pod \"success-200-isvc-03167-predictor-79d9b874d8-7mgnd\" (UID: \"459bf2e3-dad1-46f5-bc0c-8705379a964c\") " pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:12:33.867001 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.866876 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88c4x\" (UniqueName: \"kubernetes.io/projected/cd83c524-2cee-4a78-b741-74d2bed83cce-kube-api-access-88c4x\") pod \"error-404-isvc-03167-predictor-659dbf79ff-tgwf9\" (UID: \"cd83c524-2cee-4a78-b741-74d2bed83cce\") " pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:12:33.867001 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.866929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/459bf2e3-dad1-46f5-bc0c-8705379a964c-proxy-tls\") pod \"success-200-isvc-03167-predictor-79d9b874d8-7mgnd\" (UID: \"459bf2e3-dad1-46f5-bc0c-8705379a964c\") " pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:12:33.867001 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.866990 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd83c524-2cee-4a78-b741-74d2bed83cce-proxy-tls\") pod \"error-404-isvc-03167-predictor-659dbf79ff-tgwf9\" (UID: \"cd83c524-2cee-4a78-b741-74d2bed83cce\") " pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:12:33.968202 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.968165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88c4x\" (UniqueName: \"kubernetes.io/projected/cd83c524-2cee-4a78-b741-74d2bed83cce-kube-api-access-88c4x\") pod \"error-404-isvc-03167-predictor-659dbf79ff-tgwf9\" (UID: \"cd83c524-2cee-4a78-b741-74d2bed83cce\") " pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:12:33.968390 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.968210 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/459bf2e3-dad1-46f5-bc0c-8705379a964c-proxy-tls\") pod \"success-200-isvc-03167-predictor-79d9b874d8-7mgnd\" (UID: \"459bf2e3-dad1-46f5-bc0c-8705379a964c\") " pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:12:33.968390 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.968253 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd83c524-2cee-4a78-b741-74d2bed83cce-proxy-tls\") pod \"error-404-isvc-03167-predictor-659dbf79ff-tgwf9\" (UID: \"cd83c524-2cee-4a78-b741-74d2bed83cce\") " pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:12:33.968390 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.968321 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-03167-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/459bf2e3-dad1-46f5-bc0c-8705379a964c-success-200-isvc-03167-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-03167-predictor-79d9b874d8-7mgnd\" (UID: \"459bf2e3-dad1-46f5-bc0c-8705379a964c\") " pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:12:33.968390 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.968378 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-03167-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cd83c524-2cee-4a78-b741-74d2bed83cce-error-404-isvc-03167-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-03167-predictor-659dbf79ff-tgwf9\" (UID: \"cd83c524-2cee-4a78-b741-74d2bed83cce\") " pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:12:33.968599 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:12:33.968399 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-03167-predictor-serving-cert: secret "error-404-isvc-03167-predictor-serving-cert" not found
Apr 23 18:12:33.968599 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:12:33.968466 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd83c524-2cee-4a78-b741-74d2bed83cce-proxy-tls podName:cd83c524-2cee-4a78-b741-74d2bed83cce nodeName:}" failed. No retries permitted until 2026-04-23 18:12:34.468449022 +0000 UTC m=+1142.014424370 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cd83c524-2cee-4a78-b741-74d2bed83cce-proxy-tls") pod "error-404-isvc-03167-predictor-659dbf79ff-tgwf9" (UID: "cd83c524-2cee-4a78-b741-74d2bed83cce") : secret "error-404-isvc-03167-predictor-serving-cert" not found
Apr 23 18:12:33.968599 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.968404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmjr6\" (UniqueName: \"kubernetes.io/projected/459bf2e3-dad1-46f5-bc0c-8705379a964c-kube-api-access-dmjr6\") pod \"success-200-isvc-03167-predictor-79d9b874d8-7mgnd\" (UID: \"459bf2e3-dad1-46f5-bc0c-8705379a964c\") " pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:12:33.969100 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.969081 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-03167-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cd83c524-2cee-4a78-b741-74d2bed83cce-error-404-isvc-03167-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-03167-predictor-659dbf79ff-tgwf9\" (UID: \"cd83c524-2cee-4a78-b741-74d2bed83cce\") " pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:12:33.969250 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.969228 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-03167-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/459bf2e3-dad1-46f5-bc0c-8705379a964c-success-200-isvc-03167-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-03167-predictor-79d9b874d8-7mgnd\" (UID: \"459bf2e3-dad1-46f5-bc0c-8705379a964c\") " pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:12:33.971006 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.970984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/459bf2e3-dad1-46f5-bc0c-8705379a964c-proxy-tls\") pod \"success-200-isvc-03167-predictor-79d9b874d8-7mgnd\" (UID: \"459bf2e3-dad1-46f5-bc0c-8705379a964c\") " pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:12:33.977156 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.977128 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmjr6\" (UniqueName: \"kubernetes.io/projected/459bf2e3-dad1-46f5-bc0c-8705379a964c-kube-api-access-dmjr6\") pod \"success-200-isvc-03167-predictor-79d9b874d8-7mgnd\" (UID: \"459bf2e3-dad1-46f5-bc0c-8705379a964c\") " pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:12:33.977317 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:33.977294 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88c4x\" (UniqueName: \"kubernetes.io/projected/cd83c524-2cee-4a78-b741-74d2bed83cce-kube-api-access-88c4x\") pod \"error-404-isvc-03167-predictor-659dbf79ff-tgwf9\" (UID: \"cd83c524-2cee-4a78-b741-74d2bed83cce\") " pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:12:34.045503 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:34.045455 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:12:34.187076 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:34.187050 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"]
Apr 23 18:12:34.189352 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:12:34.189319 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod459bf2e3_dad1_46f5_bc0c_8705379a964c.slice/crio-705fdcfaed449de322878b92dff5e7a4b9847a7aad929041e93e85b782d31cbc WatchSource:0}: Error finding container 705fdcfaed449de322878b92dff5e7a4b9847a7aad929041e93e85b782d31cbc: Status 404 returned error can't find the container with id 705fdcfaed449de322878b92dff5e7a4b9847a7aad929041e93e85b782d31cbc
Apr 23 18:12:34.474918 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:34.474793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd83c524-2cee-4a78-b741-74d2bed83cce-proxy-tls\") pod \"error-404-isvc-03167-predictor-659dbf79ff-tgwf9\" (UID: \"cd83c524-2cee-4a78-b741-74d2bed83cce\") " pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:12:34.475085 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:12:34.474998 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-03167-predictor-serving-cert: secret "error-404-isvc-03167-predictor-serving-cert" not found
Apr 23 18:12:34.475085 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:12:34.475075 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd83c524-2cee-4a78-b741-74d2bed83cce-proxy-tls podName:cd83c524-2cee-4a78-b741-74d2bed83cce nodeName:}" failed. No retries permitted until 2026-04-23 18:12:35.475052476 +0000 UTC m=+1143.021027825 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cd83c524-2cee-4a78-b741-74d2bed83cce-proxy-tls") pod "error-404-isvc-03167-predictor-659dbf79ff-tgwf9" (UID: "cd83c524-2cee-4a78-b741-74d2bed83cce") : secret "error-404-isvc-03167-predictor-serving-cert" not found
Apr 23 18:12:34.676315 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:34.676282 2575 generic.go:358] "Generic (PLEG): container finished" podID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerID="dc64e1171f3325d5a2edd1daca4c5a8ee9fcf3e66ba227e019df6b639eb1fc72" exitCode=2
Apr 23 18:12:34.676725 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:34.676355 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" event={"ID":"d9103716-7953-41cb-aa5a-fcd84af6c727","Type":"ContainerDied","Data":"dc64e1171f3325d5a2edd1daca4c5a8ee9fcf3e66ba227e019df6b639eb1fc72"}
Apr 23 18:12:34.677921 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:34.677895 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" event={"ID":"459bf2e3-dad1-46f5-bc0c-8705379a964c","Type":"ContainerStarted","Data":"a40e2290424b48aa3c0ec135891475d0fec06379700e8526b0ca12eef014641b"}
Apr 23 18:12:34.678047 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:34.677927 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" event={"ID":"459bf2e3-dad1-46f5-bc0c-8705379a964c","Type":"ContainerStarted","Data":"4baab0c761975d02c458112dc05c69dd1db71b12ecad999fcd34d352f24529c5"}
Apr 23 18:12:34.678047 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:34.677937 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" event={"ID":"459bf2e3-dad1-46f5-bc0c-8705379a964c","Type":"ContainerStarted","Data":"705fdcfaed449de322878b92dff5e7a4b9847a7aad929041e93e85b782d31cbc"}
Apr 23 18:12:34.678047 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:34.677989 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:12:34.679438 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:34.679411 2575 generic.go:358] "Generic (PLEG): container finished" podID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerID="a9bea182518c1031ac41503ef37d1e60abe5e277ab10c978fb2c47e793e6629d" exitCode=2
Apr 23 18:12:34.679517 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:34.679445 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" event={"ID":"168fefc9-6f1d-42a3-8e99-e9ae9bee9760","Type":"ContainerDied","Data":"a9bea182518c1031ac41503ef37d1e60abe5e277ab10c978fb2c47e793e6629d"}
Apr 23 18:12:34.695282 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:34.695236 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" podStartSLOduration=1.6952161590000001 podStartE2EDuration="1.695216159s" podCreationTimestamp="2026-04-23 18:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:12:34.693518606 +0000 UTC m=+1142.239493985" watchObservedRunningTime="2026-04-23 18:12:34.695216159 +0000 UTC m=+1142.241191530"
Apr 23 18:12:35.483836 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:35.483786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd83c524-2cee-4a78-b741-74d2bed83cce-proxy-tls\") pod \"error-404-isvc-03167-predictor-659dbf79ff-tgwf9\" (UID: \"cd83c524-2cee-4a78-b741-74d2bed83cce\") " pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:12:35.486525 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:35.486493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd83c524-2cee-4a78-b741-74d2bed83cce-proxy-tls\") pod \"error-404-isvc-03167-predictor-659dbf79ff-tgwf9\" (UID: \"cd83c524-2cee-4a78-b741-74d2bed83cce\") " pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:12:35.600901 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:35.600819 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:12:35.683751 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:35.683712 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:12:35.685597 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:35.685232 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 23 18:12:35.738266 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:35.738237 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"]
Apr 23 18:12:35.740509 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:12:35.740478 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd83c524_2cee_4a78_b741_74d2bed83cce.slice/crio-18bc44c8b6a359e41898ff130174e64f5f4e2a26bdd9a8d988ac114f63785ea3 WatchSource:0}: Error finding container 18bc44c8b6a359e41898ff130174e64f5f4e2a26bdd9a8d988ac114f63785ea3: Status 404 returned error can't find the container with id 18bc44c8b6a359e41898ff130174e64f5f4e2a26bdd9a8d988ac114f63785ea3
Apr 23 18:12:36.688468 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:36.688428 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" event={"ID":"cd83c524-2cee-4a78-b741-74d2bed83cce","Type":"ContainerStarted","Data":"bd57a190e543382f913f73225a80187c5d8800ae2cef2cc5520a846a33222430"}
Apr 23 18:12:36.688468 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:36.688474 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" event={"ID":"cd83c524-2cee-4a78-b741-74d2bed83cce","Type":"ContainerStarted","Data":"7d4d8fa2be6f82456776d4d2e86ca70f2691b07e23c42289423a50345da94be9"}
Apr 23 18:12:36.688951 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:36.688486 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" event={"ID":"cd83c524-2cee-4a78-b741-74d2bed83cce","Type":"ContainerStarted","Data":"18bc44c8b6a359e41898ff130174e64f5f4e2a26bdd9a8d988ac114f63785ea3"}
Apr 23 18:12:36.688951 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:36.688652 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:12:36.688951 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:36.688848 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 23 18:12:36.707091 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:36.707032 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" podStartSLOduration=3.70701672 podStartE2EDuration="3.70701672s" podCreationTimestamp="2026-04-23 18:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:12:36.70375949 +0000 UTC m=+1144.249734871" watchObservedRunningTime="2026-04-23 18:12:36.70701672 +0000 UTC m=+1144.252992087"
Apr 23 18:12:37.315159 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.315095 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6"
Apr 23 18:12:37.402307 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.402271 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9103716-7953-41cb-aa5a-fcd84af6c727-proxy-tls\") pod \"d9103716-7953-41cb-aa5a-fcd84af6c727\" (UID: \"d9103716-7953-41cb-aa5a-fcd84af6c727\") "
Apr 23 18:12:37.402307 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.402325 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-be430-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9103716-7953-41cb-aa5a-fcd84af6c727-success-200-isvc-be430-kube-rbac-proxy-sar-config\") pod \"d9103716-7953-41cb-aa5a-fcd84af6c727\" (UID: \"d9103716-7953-41cb-aa5a-fcd84af6c727\") "
Apr 23 18:12:37.402571 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.402424 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv9r7\" (UniqueName: \"kubernetes.io/projected/d9103716-7953-41cb-aa5a-fcd84af6c727-kube-api-access-rv9r7\") pod \"d9103716-7953-41cb-aa5a-fcd84af6c727\" (UID: \"d9103716-7953-41cb-aa5a-fcd84af6c727\") "
Apr 23 18:12:37.402695 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.402667 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9103716-7953-41cb-aa5a-fcd84af6c727-success-200-isvc-be430-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-be430-kube-rbac-proxy-sar-config") pod "d9103716-7953-41cb-aa5a-fcd84af6c727" (UID: "d9103716-7953-41cb-aa5a-fcd84af6c727"). InnerVolumeSpecName "success-200-isvc-be430-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:12:37.404741 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.404712 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9103716-7953-41cb-aa5a-fcd84af6c727-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d9103716-7953-41cb-aa5a-fcd84af6c727" (UID: "d9103716-7953-41cb-aa5a-fcd84af6c727"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:12:37.404832 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.404782 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9103716-7953-41cb-aa5a-fcd84af6c727-kube-api-access-rv9r7" (OuterVolumeSpecName: "kube-api-access-rv9r7") pod "d9103716-7953-41cb-aa5a-fcd84af6c727" (UID: "d9103716-7953-41cb-aa5a-fcd84af6c727"). InnerVolumeSpecName "kube-api-access-rv9r7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:12:37.503973 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.503940 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rv9r7\" (UniqueName: \"kubernetes.io/projected/d9103716-7953-41cb-aa5a-fcd84af6c727-kube-api-access-rv9r7\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\""
Apr 23 18:12:37.503973 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.503971 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9103716-7953-41cb-aa5a-fcd84af6c727-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\""
Apr 23 18:12:37.503973 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.503982 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-be430-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9103716-7953-41cb-aa5a-fcd84af6c727-success-200-isvc-be430-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\""
Apr 23 18:12:37.632543 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.632518 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j"
Apr 23 18:12:37.693324 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.693285 2575 generic.go:358] "Generic (PLEG): container finished" podID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerID="11d928aa506fd40c9ed81d555af25c9b33497840093cab13e2f3cde75391db2c" exitCode=0
Apr 23 18:12:37.693712 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.693364 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" Apr 23 18:12:37.693712 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.693373 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" event={"ID":"168fefc9-6f1d-42a3-8e99-e9ae9bee9760","Type":"ContainerDied","Data":"11d928aa506fd40c9ed81d555af25c9b33497840093cab13e2f3cde75391db2c"} Apr 23 18:12:37.693712 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.693410 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j" event={"ID":"168fefc9-6f1d-42a3-8e99-e9ae9bee9760","Type":"ContainerDied","Data":"3c4c50bf171028d940364bed5af040fc9b45d804d4747df277f9d62b14b3aad2"} Apr 23 18:12:37.693712 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.693435 2575 scope.go:117] "RemoveContainer" containerID="a9bea182518c1031ac41503ef37d1e60abe5e277ab10c978fb2c47e793e6629d" Apr 23 18:12:37.694907 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.694863 2575 generic.go:358] "Generic (PLEG): container finished" podID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerID="622bc23def508707fa6b9fd31e4b5ecb040cf4d178151343ff8c2263c2c61ea1" exitCode=0 Apr 23 18:12:37.695037 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.694956 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" Apr 23 18:12:37.695100 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.694954 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" event={"ID":"d9103716-7953-41cb-aa5a-fcd84af6c727","Type":"ContainerDied","Data":"622bc23def508707fa6b9fd31e4b5ecb040cf4d178151343ff8c2263c2c61ea1"} Apr 23 18:12:37.695100 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.695057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6" event={"ID":"d9103716-7953-41cb-aa5a-fcd84af6c727","Type":"ContainerDied","Data":"7d5c29ae735ee6c8bd96fcba9998dc42e2239ba755310baad2a23eca4f00b726"} Apr 23 18:12:37.695763 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.695399 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" Apr 23 18:12:37.696861 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.696833 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:12:37.703057 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.703032 2575 scope.go:117] "RemoveContainer" containerID="11d928aa506fd40c9ed81d555af25c9b33497840093cab13e2f3cde75391db2c" Apr 23 18:12:37.711240 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.711219 2575 scope.go:117] "RemoveContainer" containerID="a9bea182518c1031ac41503ef37d1e60abe5e277ab10c978fb2c47e793e6629d" Apr 23 18:12:37.711506 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:12:37.711486 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"a9bea182518c1031ac41503ef37d1e60abe5e277ab10c978fb2c47e793e6629d\": container with ID starting with a9bea182518c1031ac41503ef37d1e60abe5e277ab10c978fb2c47e793e6629d not found: ID does not exist" containerID="a9bea182518c1031ac41503ef37d1e60abe5e277ab10c978fb2c47e793e6629d" Apr 23 18:12:37.711572 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.711520 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bea182518c1031ac41503ef37d1e60abe5e277ab10c978fb2c47e793e6629d"} err="failed to get container status \"a9bea182518c1031ac41503ef37d1e60abe5e277ab10c978fb2c47e793e6629d\": rpc error: code = NotFound desc = could not find container \"a9bea182518c1031ac41503ef37d1e60abe5e277ab10c978fb2c47e793e6629d\": container with ID starting with a9bea182518c1031ac41503ef37d1e60abe5e277ab10c978fb2c47e793e6629d not found: ID does not exist" Apr 23 18:12:37.711572 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.711547 2575 scope.go:117] "RemoveContainer" containerID="11d928aa506fd40c9ed81d555af25c9b33497840093cab13e2f3cde75391db2c" Apr 23 18:12:37.711829 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:12:37.711792 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11d928aa506fd40c9ed81d555af25c9b33497840093cab13e2f3cde75391db2c\": container with ID starting with 11d928aa506fd40c9ed81d555af25c9b33497840093cab13e2f3cde75391db2c not found: ID does not exist" containerID="11d928aa506fd40c9ed81d555af25c9b33497840093cab13e2f3cde75391db2c" Apr 23 18:12:37.711872 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.711835 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11d928aa506fd40c9ed81d555af25c9b33497840093cab13e2f3cde75391db2c"} err="failed to get container status \"11d928aa506fd40c9ed81d555af25c9b33497840093cab13e2f3cde75391db2c\": rpc error: code = NotFound desc 
= could not find container \"11d928aa506fd40c9ed81d555af25c9b33497840093cab13e2f3cde75391db2c\": container with ID starting with 11d928aa506fd40c9ed81d555af25c9b33497840093cab13e2f3cde75391db2c not found: ID does not exist" Apr 23 18:12:37.711872 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.711858 2575 scope.go:117] "RemoveContainer" containerID="dc64e1171f3325d5a2edd1daca4c5a8ee9fcf3e66ba227e019df6b639eb1fc72" Apr 23 18:12:37.717066 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.717046 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6"] Apr 23 18:12:37.721384 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.721358 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be430-predictor-7c764dd68b-m4lt6"] Apr 23 18:12:37.721634 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.721615 2575 scope.go:117] "RemoveContainer" containerID="622bc23def508707fa6b9fd31e4b5ecb040cf4d178151343ff8c2263c2c61ea1" Apr 23 18:12:37.729631 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.729603 2575 scope.go:117] "RemoveContainer" containerID="dc64e1171f3325d5a2edd1daca4c5a8ee9fcf3e66ba227e019df6b639eb1fc72" Apr 23 18:12:37.729930 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:12:37.729912 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc64e1171f3325d5a2edd1daca4c5a8ee9fcf3e66ba227e019df6b639eb1fc72\": container with ID starting with dc64e1171f3325d5a2edd1daca4c5a8ee9fcf3e66ba227e019df6b639eb1fc72 not found: ID does not exist" containerID="dc64e1171f3325d5a2edd1daca4c5a8ee9fcf3e66ba227e019df6b639eb1fc72" Apr 23 18:12:37.729993 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.729939 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc64e1171f3325d5a2edd1daca4c5a8ee9fcf3e66ba227e019df6b639eb1fc72"} err="failed to get 
container status \"dc64e1171f3325d5a2edd1daca4c5a8ee9fcf3e66ba227e019df6b639eb1fc72\": rpc error: code = NotFound desc = could not find container \"dc64e1171f3325d5a2edd1daca4c5a8ee9fcf3e66ba227e019df6b639eb1fc72\": container with ID starting with dc64e1171f3325d5a2edd1daca4c5a8ee9fcf3e66ba227e019df6b639eb1fc72 not found: ID does not exist" Apr 23 18:12:37.729993 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.729959 2575 scope.go:117] "RemoveContainer" containerID="622bc23def508707fa6b9fd31e4b5ecb040cf4d178151343ff8c2263c2c61ea1" Apr 23 18:12:37.730178 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:12:37.730166 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"622bc23def508707fa6b9fd31e4b5ecb040cf4d178151343ff8c2263c2c61ea1\": container with ID starting with 622bc23def508707fa6b9fd31e4b5ecb040cf4d178151343ff8c2263c2c61ea1 not found: ID does not exist" containerID="622bc23def508707fa6b9fd31e4b5ecb040cf4d178151343ff8c2263c2c61ea1" Apr 23 18:12:37.730220 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.730181 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622bc23def508707fa6b9fd31e4b5ecb040cf4d178151343ff8c2263c2c61ea1"} err="failed to get container status \"622bc23def508707fa6b9fd31e4b5ecb040cf4d178151343ff8c2263c2c61ea1\": rpc error: code = NotFound desc = could not find container \"622bc23def508707fa6b9fd31e4b5ecb040cf4d178151343ff8c2263c2c61ea1\": container with ID starting with 622bc23def508707fa6b9fd31e4b5ecb040cf4d178151343ff8c2263c2c61ea1 not found: ID does not exist" Apr 23 18:12:37.805651 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.805617 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbj68\" (UniqueName: \"kubernetes.io/projected/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-kube-api-access-qbj68\") pod \"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\" (UID: 
\"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\") " Apr 23 18:12:37.805829 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.805687 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-be430-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-error-404-isvc-be430-kube-rbac-proxy-sar-config\") pod \"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\" (UID: \"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\") " Apr 23 18:12:37.805829 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.805731 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-proxy-tls\") pod \"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\" (UID: \"168fefc9-6f1d-42a3-8e99-e9ae9bee9760\") " Apr 23 18:12:37.806184 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.806149 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-error-404-isvc-be430-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-be430-kube-rbac-proxy-sar-config") pod "168fefc9-6f1d-42a3-8e99-e9ae9bee9760" (UID: "168fefc9-6f1d-42a3-8e99-e9ae9bee9760"). InnerVolumeSpecName "error-404-isvc-be430-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:12:37.808086 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.808063 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "168fefc9-6f1d-42a3-8e99-e9ae9bee9760" (UID: "168fefc9-6f1d-42a3-8e99-e9ae9bee9760"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:12:37.808171 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.808104 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-kube-api-access-qbj68" (OuterVolumeSpecName: "kube-api-access-qbj68") pod "168fefc9-6f1d-42a3-8e99-e9ae9bee9760" (UID: "168fefc9-6f1d-42a3-8e99-e9ae9bee9760"). InnerVolumeSpecName "kube-api-access-qbj68". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:12:37.907460 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.907354 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:12:37.907460 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.907396 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qbj68\" (UniqueName: \"kubernetes.io/projected/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-kube-api-access-qbj68\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:12:37.907460 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:37.907413 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-be430-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/168fefc9-6f1d-42a3-8e99-e9ae9bee9760-error-404-isvc-be430-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:12:38.015857 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:38.015819 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j"] Apr 23 18:12:38.018436 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:38.018404 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be430-predictor-6bcf86bc7-2wc2j"] Apr 23 18:12:38.485510 ip-10-0-143-131 
kubenswrapper[2575]: I0423 18:12:38.485473 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 23 18:12:38.700419 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:38.700381 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:12:39.064136 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:39.064091 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" path="/var/lib/kubelet/pods/168fefc9-6f1d-42a3-8e99-e9ae9bee9760/volumes" Apr 23 18:12:39.064528 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:39.064513 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" path="/var/lib/kubelet/pods/d9103716-7953-41cb-aa5a-fcd84af6c727/volumes" Apr 23 18:12:41.693561 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:41.693526 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" Apr 23 18:12:41.694121 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:41.694091 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:12:43.705237 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:43.705203 2575 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" Apr 23 18:12:43.705759 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:43.705731 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:12:48.485606 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:48.485572 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" Apr 23 18:12:51.694369 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:51.694329 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:12:53.705982 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:12:53.705937 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:13:01.694374 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:01.694334 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:13:03.706216 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:03.706164 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:13:11.694301 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:11.694253 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:13:13.706115 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:13.706072 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:13:19.610161 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.610126 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"] Apr 23 18:13:19.610560 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.610497 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerName="kube-rbac-proxy" Apr 23 18:13:19.610560 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.610513 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerName="kube-rbac-proxy" Apr 23 18:13:19.610560 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.610524 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kserve-container" Apr 23 18:13:19.610560 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.610530 2575 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kserve-container" Apr 23 18:13:19.610560 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.610540 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kube-rbac-proxy" Apr 23 18:13:19.610560 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.610545 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kube-rbac-proxy" Apr 23 18:13:19.610560 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.610557 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerName="kserve-container" Apr 23 18:13:19.610560 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.610563 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerName="kserve-container" Apr 23 18:13:19.610796 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.610630 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kube-rbac-proxy" Apr 23 18:13:19.610796 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.610642 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="168fefc9-6f1d-42a3-8e99-e9ae9bee9760" containerName="kserve-container" Apr 23 18:13:19.610796 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.610648 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerName="kserve-container" Apr 23 18:13:19.610796 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.610657 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9103716-7953-41cb-aa5a-fcd84af6c727" containerName="kube-rbac-proxy" Apr 23 18:13:19.613058 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.613035 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" Apr 23 18:13:19.615167 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.615148 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-89c6b-predictor-serving-cert\"" Apr 23 18:13:19.615282 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.615182 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-89c6b-kube-rbac-proxy-sar-config\"" Apr 23 18:13:19.625685 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.625659 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"] Apr 23 18:13:19.680748 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.680714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-89c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/63aab03a-546c-4970-b61d-5e334d73e842-success-200-isvc-89c6b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-89c6b-predictor-9f5f766-8s6w4\" (UID: \"63aab03a-546c-4970-b61d-5e334d73e842\") " pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" Apr 23 18:13:19.680959 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.680755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5w2j\" (UniqueName: \"kubernetes.io/projected/63aab03a-546c-4970-b61d-5e334d73e842-kube-api-access-q5w2j\") pod \"success-200-isvc-89c6b-predictor-9f5f766-8s6w4\" (UID: \"63aab03a-546c-4970-b61d-5e334d73e842\") " pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" Apr 23 18:13:19.680959 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.680785 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63aab03a-546c-4970-b61d-5e334d73e842-proxy-tls\") pod \"success-200-isvc-89c6b-predictor-9f5f766-8s6w4\" (UID: \"63aab03a-546c-4970-b61d-5e334d73e842\") " pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" Apr 23 18:13:19.704027 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.703991 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"] Apr 23 18:13:19.706612 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.706596 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" Apr 23 18:13:19.708536 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.708511 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-89c6b-predictor-serving-cert\"" Apr 23 18:13:19.709140 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.709117 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-89c6b-kube-rbac-proxy-sar-config\"" Apr 23 18:13:19.710279 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.710257 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9"] Apr 23 18:13:19.710646 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.710606 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kserve-container" containerID="cri-o://c9fce760f15f9cbbbf9ee5c66a720ab7c0b332c31872e23161c28b10062d2f17" gracePeriod=30 Apr 23 18:13:19.710739 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.710643 2575 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kube-rbac-proxy" containerID="cri-o://c72d6fd949de22dd3a2ae6b0742c8eee080107690c07fa8bb3b8c8f2f227f349" gracePeriod=30
Apr 23 18:13:19.717397 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.717367 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"]
Apr 23 18:13:19.781444 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.781410 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5w2j\" (UniqueName: \"kubernetes.io/projected/63aab03a-546c-4970-b61d-5e334d73e842-kube-api-access-q5w2j\") pod \"success-200-isvc-89c6b-predictor-9f5f766-8s6w4\" (UID: \"63aab03a-546c-4970-b61d-5e334d73e842\") " pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"
Apr 23 18:13:19.781631 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.781453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63aab03a-546c-4970-b61d-5e334d73e842-proxy-tls\") pod \"success-200-isvc-89c6b-predictor-9f5f766-8s6w4\" (UID: \"63aab03a-546c-4970-b61d-5e334d73e842\") " pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"
Apr 23 18:13:19.781631 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.781507 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-89c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-error-404-isvc-89c6b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8\" (UID: \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\") " pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"
Apr 23 18:13:19.781631 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.781563 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-proxy-tls\") pod \"error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8\" (UID: \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\") " pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"
Apr 23 18:13:19.781631 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.781596 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcrgq\" (UniqueName: \"kubernetes.io/projected/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-kube-api-access-gcrgq\") pod \"error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8\" (UID: \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\") " pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"
Apr 23 18:13:19.781802 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.781771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-89c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/63aab03a-546c-4970-b61d-5e334d73e842-success-200-isvc-89c6b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-89c6b-predictor-9f5f766-8s6w4\" (UID: \"63aab03a-546c-4970-b61d-5e334d73e842\") " pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"
Apr 23 18:13:19.782400 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.782375 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-89c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/63aab03a-546c-4970-b61d-5e334d73e842-success-200-isvc-89c6b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-89c6b-predictor-9f5f766-8s6w4\" (UID: \"63aab03a-546c-4970-b61d-5e334d73e842\") " pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"
Apr 23 18:13:19.784117 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.784097 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63aab03a-546c-4970-b61d-5e334d73e842-proxy-tls\") pod \"success-200-isvc-89c6b-predictor-9f5f766-8s6w4\" (UID: \"63aab03a-546c-4970-b61d-5e334d73e842\") " pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"
Apr 23 18:13:19.789727 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.789689 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5w2j\" (UniqueName: \"kubernetes.io/projected/63aab03a-546c-4970-b61d-5e334d73e842-kube-api-access-q5w2j\") pod \"success-200-isvc-89c6b-predictor-9f5f766-8s6w4\" (UID: \"63aab03a-546c-4970-b61d-5e334d73e842\") " pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"
Apr 23 18:13:19.848399 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.848365 2575 generic.go:358] "Generic (PLEG): container finished" podID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerID="c72d6fd949de22dd3a2ae6b0742c8eee080107690c07fa8bb3b8c8f2f227f349" exitCode=2
Apr 23 18:13:19.848567 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.848446 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" event={"ID":"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa","Type":"ContainerDied","Data":"c72d6fd949de22dd3a2ae6b0742c8eee080107690c07fa8bb3b8c8f2f227f349"}
Apr 23 18:13:19.882867 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.882767 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcrgq\" (UniqueName: \"kubernetes.io/projected/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-kube-api-access-gcrgq\") pod \"error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8\" (UID: \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\") " pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"
Apr 23 18:13:19.883253 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.882914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-89c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-error-404-isvc-89c6b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8\" (UID: \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\") " pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"
Apr 23 18:13:19.883253 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.882979 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-proxy-tls\") pod \"error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8\" (UID: \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\") " pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"
Apr 23 18:13:19.883605 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.883572 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-89c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-error-404-isvc-89c6b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8\" (UID: \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\") " pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"
Apr 23 18:13:19.885582 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.885556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-proxy-tls\") pod \"error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8\" (UID: \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\") " pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"
Apr 23 18:13:19.890807 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.890782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcrgq\" (UniqueName: \"kubernetes.io/projected/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-kube-api-access-gcrgq\") pod \"error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8\" (UID: \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\") " pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"
Apr 23 18:13:19.925037 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:19.925002 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"
Apr 23 18:13:20.019409 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:20.019365 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"
Apr 23 18:13:20.059979 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:20.059156 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"]
Apr 23 18:13:20.062603 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:13:20.062560 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63aab03a_546c_4970_b61d_5e334d73e842.slice/crio-a4ee6d406bee5c44ad3217c19ed3fc23f8edc67b6a0edb51a2d53e602dfc30cc WatchSource:0}: Error finding container a4ee6d406bee5c44ad3217c19ed3fc23f8edc67b6a0edb51a2d53e602dfc30cc: Status 404 returned error can't find the container with id a4ee6d406bee5c44ad3217c19ed3fc23f8edc67b6a0edb51a2d53e602dfc30cc
Apr 23 18:13:20.154301 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:20.154272 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"]
Apr 23 18:13:20.159749 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:13:20.159708 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod749e09bb_9d8f_4c6f_bf1c_12b7d3f4515b.slice/crio-b292545803b860f6e0de3043cfdbca8bdbe543544602576b167f7a0484b5572d WatchSource:0}: Error finding container b292545803b860f6e0de3043cfdbca8bdbe543544602576b167f7a0484b5572d: Status 404 returned error can't find the container with id b292545803b860f6e0de3043cfdbca8bdbe543544602576b167f7a0484b5572d
Apr 23 18:13:20.852780 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:20.852747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" event={"ID":"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b","Type":"ContainerStarted","Data":"4d9d4054ac0ff3604aeeb5daeb00f36e1de9c7c188e07790030f6c628b11fd96"}
Apr 23 18:13:20.853286 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:20.852787 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" event={"ID":"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b","Type":"ContainerStarted","Data":"560a87529d9d52048a02a37039400c027138ee891533170669b685a46f9ef00b"}
Apr 23 18:13:20.853286 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:20.852802 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" event={"ID":"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b","Type":"ContainerStarted","Data":"b292545803b860f6e0de3043cfdbca8bdbe543544602576b167f7a0484b5572d"}
Apr 23 18:13:20.853286 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:20.852922 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"
Apr 23 18:13:20.854409 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:20.854387 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" event={"ID":"63aab03a-546c-4970-b61d-5e334d73e842","Type":"ContainerStarted","Data":"894c6900ea49d343e5690bf87548adc41e6bead81132b536eb9d719a40c92f68"}
Apr 23 18:13:20.854524 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:20.854414 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" event={"ID":"63aab03a-546c-4970-b61d-5e334d73e842","Type":"ContainerStarted","Data":"b7b6aeffe4054d45b01cd7e7aeee66196121c9d253065c2ad7335a83075065be"}
Apr 23 18:13:20.854524 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:20.854424 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" event={"ID":"63aab03a-546c-4970-b61d-5e334d73e842","Type":"ContainerStarted","Data":"a4ee6d406bee5c44ad3217c19ed3fc23f8edc67b6a0edb51a2d53e602dfc30cc"}
Apr 23 18:13:20.854524 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:20.854508 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"
Apr 23 18:13:20.870940 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:20.870864 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" podStartSLOduration=1.8708479759999999 podStartE2EDuration="1.870847976s" podCreationTimestamp="2026-04-23 18:13:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:13:20.869503567 +0000 UTC m=+1188.415478934" watchObservedRunningTime="2026-04-23 18:13:20.870847976 +0000 UTC m=+1188.416823343"
Apr 23 18:13:20.886022 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:20.885968 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" podStartSLOduration=1.885952917 podStartE2EDuration="1.885952917s" podCreationTimestamp="2026-04-23 18:13:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:13:20.884774805 +0000 UTC m=+1188.430750172" watchObservedRunningTime="2026-04-23 18:13:20.885952917 +0000 UTC m=+1188.431928285"
Apr 23 18:13:21.694527 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:21.694486 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 23 18:13:21.858871 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:21.858838 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"
Apr 23 18:13:21.859438 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:21.859408 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"
Apr 23 18:13:21.860051 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:21.860017 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 23 18:13:21.860957 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:21.860923 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:13:22.862858 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:22.862815 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:13:22.863464 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:22.862844 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 23 18:13:23.480456 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:23.480409 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.29:8643/healthz\": dial tcp 10.133.0.29:8643: connect: connection refused"
Apr 23 18:13:23.707061 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:23.707031 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:13:24.653662 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.653634 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9"
Apr 23 18:13:24.836030 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.835929 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-kserve-provision-location\") pod \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") "
Apr 23 18:13:24.836030 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.835980 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") "
Apr 23 18:13:24.836030 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.836031 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlpk8\" (UniqueName: \"kubernetes.io/projected/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-kube-api-access-xlpk8\") pod \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") "
Apr 23 18:13:24.836310 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.836084 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-proxy-tls\") pod \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\" (UID: \"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa\") "
Apr 23 18:13:24.836310 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.836278 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" (UID: "c7b621e1-9f6f-4cdf-99ad-766c2ff177fa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:13:24.836604 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.836317 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config") pod "c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" (UID: "c7b621e1-9f6f-4cdf-99ad-766c2ff177fa"). InnerVolumeSpecName "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:13:24.836604 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.836405 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-kserve-provision-location\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\""
Apr 23 18:13:24.836604 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.836418 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\""
Apr 23 18:13:24.838293 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.838269 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-kube-api-access-xlpk8" (OuterVolumeSpecName: "kube-api-access-xlpk8") pod "c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" (UID: "c7b621e1-9f6f-4cdf-99ad-766c2ff177fa"). InnerVolumeSpecName "kube-api-access-xlpk8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:13:24.838354 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.838269 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" (UID: "c7b621e1-9f6f-4cdf-99ad-766c2ff177fa"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:13:24.871390 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.871348 2575 generic.go:358] "Generic (PLEG): container finished" podID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerID="c9fce760f15f9cbbbf9ee5c66a720ab7c0b332c31872e23161c28b10062d2f17" exitCode=0
Apr 23 18:13:24.871531 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.871436 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9"
Apr 23 18:13:24.871531 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.871476 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" event={"ID":"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa","Type":"ContainerDied","Data":"c9fce760f15f9cbbbf9ee5c66a720ab7c0b332c31872e23161c28b10062d2f17"}
Apr 23 18:13:24.871531 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.871515 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9" event={"ID":"c7b621e1-9f6f-4cdf-99ad-766c2ff177fa","Type":"ContainerDied","Data":"42169bc5f5765a283094afe7cf2f4a3ff424e3b2a37af39750c56e78ddc4cb31"}
Apr 23 18:13:24.871531 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.871530 2575 scope.go:117] "RemoveContainer" containerID="c72d6fd949de22dd3a2ae6b0742c8eee080107690c07fa8bb3b8c8f2f227f349"
Apr 23 18:13:24.880820 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.880798 2575 scope.go:117] "RemoveContainer" containerID="c9fce760f15f9cbbbf9ee5c66a720ab7c0b332c31872e23161c28b10062d2f17"
Apr 23 18:13:24.889014 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.888991 2575 scope.go:117] "RemoveContainer" containerID="c0e7a76ceaa084cd28b33c44e94b0a92095c053a4f5859666d57ac510f145a64"
Apr 23 18:13:24.894677 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.894652 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9"]
Apr 23 18:13:24.896808 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.896792 2575 scope.go:117] "RemoveContainer" containerID="c72d6fd949de22dd3a2ae6b0742c8eee080107690c07fa8bb3b8c8f2f227f349"
Apr 23 18:13:24.897133 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:13:24.897107 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c72d6fd949de22dd3a2ae6b0742c8eee080107690c07fa8bb3b8c8f2f227f349\": container with ID starting with c72d6fd949de22dd3a2ae6b0742c8eee080107690c07fa8bb3b8c8f2f227f349 not found: ID does not exist" containerID="c72d6fd949de22dd3a2ae6b0742c8eee080107690c07fa8bb3b8c8f2f227f349"
Apr 23 18:13:24.897232 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.897140 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72d6fd949de22dd3a2ae6b0742c8eee080107690c07fa8bb3b8c8f2f227f349"} err="failed to get container status \"c72d6fd949de22dd3a2ae6b0742c8eee080107690c07fa8bb3b8c8f2f227f349\": rpc error: code = NotFound desc = could not find container \"c72d6fd949de22dd3a2ae6b0742c8eee080107690c07fa8bb3b8c8f2f227f349\": container with ID starting with c72d6fd949de22dd3a2ae6b0742c8eee080107690c07fa8bb3b8c8f2f227f349 not found: ID does not exist"
Apr 23 18:13:24.897232 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.897160 2575 scope.go:117] "RemoveContainer" containerID="c9fce760f15f9cbbbf9ee5c66a720ab7c0b332c31872e23161c28b10062d2f17"
Apr 23 18:13:24.897419 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:13:24.897401 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9fce760f15f9cbbbf9ee5c66a720ab7c0b332c31872e23161c28b10062d2f17\": container with ID starting with c9fce760f15f9cbbbf9ee5c66a720ab7c0b332c31872e23161c28b10062d2f17 not found: ID does not exist" containerID="c9fce760f15f9cbbbf9ee5c66a720ab7c0b332c31872e23161c28b10062d2f17"
Apr 23 18:13:24.897458 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.897426 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9fce760f15f9cbbbf9ee5c66a720ab7c0b332c31872e23161c28b10062d2f17"} err="failed to get container status \"c9fce760f15f9cbbbf9ee5c66a720ab7c0b332c31872e23161c28b10062d2f17\": rpc error: code = NotFound desc = could not find container \"c9fce760f15f9cbbbf9ee5c66a720ab7c0b332c31872e23161c28b10062d2f17\": container with ID starting with c9fce760f15f9cbbbf9ee5c66a720ab7c0b332c31872e23161c28b10062d2f17 not found: ID does not exist"
Apr 23 18:13:24.897458 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.897442 2575 scope.go:117] "RemoveContainer" containerID="c0e7a76ceaa084cd28b33c44e94b0a92095c053a4f5859666d57ac510f145a64"
Apr 23 18:13:24.897683 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:13:24.897662 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e7a76ceaa084cd28b33c44e94b0a92095c053a4f5859666d57ac510f145a64\": container with ID starting with c0e7a76ceaa084cd28b33c44e94b0a92095c053a4f5859666d57ac510f145a64 not found: ID does not exist" containerID="c0e7a76ceaa084cd28b33c44e94b0a92095c053a4f5859666d57ac510f145a64"
Apr 23 18:13:24.897779 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.897684 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e7a76ceaa084cd28b33c44e94b0a92095c053a4f5859666d57ac510f145a64"} err="failed to get container status \"c0e7a76ceaa084cd28b33c44e94b0a92095c053a4f5859666d57ac510f145a64\": rpc error: code = NotFound desc = could not find container \"c0e7a76ceaa084cd28b33c44e94b0a92095c053a4f5859666d57ac510f145a64\": container with ID starting with c0e7a76ceaa084cd28b33c44e94b0a92095c053a4f5859666d57ac510f145a64 not found: ID does not exist"
Apr 23 18:13:24.901768 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.901743 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7d9dcd4478-xpgv9"]
Apr 23 18:13:24.937548 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.937511 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xlpk8\" (UniqueName: \"kubernetes.io/projected/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-kube-api-access-xlpk8\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\""
Apr 23 18:13:24.937548 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:24.937541 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\""
Apr 23 18:13:25.064109 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:25.064070 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" path="/var/lib/kubelet/pods/c7b621e1-9f6f-4cdf-99ad-766c2ff177fa/volumes"
Apr 23 18:13:27.867467 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:27.867422 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"
Apr 23 18:13:27.867833 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:27.867799 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"
Apr 23 18:13:27.867944 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:27.867917 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:13:27.868292 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:27.868272 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 23 18:13:31.695306 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:31.695269 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:13:33.031301 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:33.031272 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log"
Apr 23 18:13:33.033477 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:33.033444 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log"
Apr 23 18:13:37.867931 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:37.867866 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:13:37.868393 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:37.868277 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 23 18:13:47.868629 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:47.868578 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 23 18:13:47.869171 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:47.868578 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:13:57.867983 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:57.867926 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:13:57.868430 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:13:57.868238 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 23 18:14:07.868709 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:14:07.868669 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 23 18:14:07.869094 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:14:07.868679 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:14:17.869100 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:14:17.869007 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"
Apr 23 18:14:17.869100 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:14:17.869066 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"
Apr 23 18:18:33.055165 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:18:33.055129 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log"
Apr 23 18:18:33.059131 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:18:33.059111 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log"
Apr 23 18:21:58.594795 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.594766 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"]
Apr 23 18:21:58.596397 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.595070 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kserve-container" containerID="cri-o://4baab0c761975d02c458112dc05c69dd1db71b12ecad999fcd34d352f24529c5" gracePeriod=30
Apr 23 18:21:58.596397 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.595097 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kube-rbac-proxy" containerID="cri-o://a40e2290424b48aa3c0ec135891475d0fec06379700e8526b0ca12eef014641b" gracePeriod=30
Apr 23 18:21:58.655109 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.655079 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"]
Apr 23 18:21:58.655405 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.655370 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kserve-container" containerID="cri-o://7d4d8fa2be6f82456776d4d2e86ca70f2691b07e23c42289423a50345da94be9" gracePeriod=30
Apr 23 18:21:58.655497 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.655414 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kube-rbac-proxy" containerID="cri-o://bd57a190e543382f913f73225a80187c5d8800ae2cef2cc5520a846a33222430" gracePeriod=30
Apr 23 18:21:58.700748 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.700718 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused"
Apr 23 18:21:58.725592 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.725566 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f"]
Apr 23 18:21:58.725970 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.725955 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kube-rbac-proxy"
Apr 23 18:21:58.725970 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.725971 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kube-rbac-proxy"
Apr 23 18:21:58.726089 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.725983 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kserve-container"
Apr 23 18:21:58.726089 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.725988 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kserve-container"
Apr 23 18:21:58.726089 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.725999 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="storage-initializer"
Apr 23 18:21:58.726089 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.726006 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="storage-initializer"
Apr 23 18:21:58.726089 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.726072 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kserve-container"
Apr 23 18:21:58.726089 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.726080
2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7b621e1-9f6f-4cdf-99ad-766c2ff177fa" containerName="kube-rbac-proxy" Apr 23 18:21:58.728179 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.728163 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:21:58.730248 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.730225 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-5d865-predictor-serving-cert\"" Apr 23 18:21:58.730345 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.730226 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-5d865-kube-rbac-proxy-sar-config\"" Apr 23 18:21:58.752165 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.752138 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f"] Apr 23 18:21:58.758317 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.758291 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f904c9ba-5277-42f1-8335-104599745997-proxy-tls\") pod \"success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f\" (UID: \"f904c9ba-5277-42f1-8335-104599745997\") " pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:21:58.758436 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.758337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql5xr\" (UniqueName: \"kubernetes.io/projected/f904c9ba-5277-42f1-8335-104599745997-kube-api-access-ql5xr\") pod \"success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f\" (UID: \"f904c9ba-5277-42f1-8335-104599745997\") " 
pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:21:58.758436 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.758368 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-5d865-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f904c9ba-5277-42f1-8335-104599745997-success-200-isvc-5d865-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f\" (UID: \"f904c9ba-5277-42f1-8335-104599745997\") " pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:21:58.777283 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.777257 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj"] Apr 23 18:21:58.779953 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.779939 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:21:58.781827 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.781805 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-5d865-predictor-serving-cert\"" Apr 23 18:21:58.781932 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.781849 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-5d865-kube-rbac-proxy-sar-config\"" Apr 23 18:21:58.791481 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.791459 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj"] Apr 23 18:21:58.859914 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.859816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/f904c9ba-5277-42f1-8335-104599745997-proxy-tls\") pod \"success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f\" (UID: \"f904c9ba-5277-42f1-8335-104599745997\") " pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:21:58.859914 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.859898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ql5xr\" (UniqueName: \"kubernetes.io/projected/f904c9ba-5277-42f1-8335-104599745997-kube-api-access-ql5xr\") pod \"success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f\" (UID: \"f904c9ba-5277-42f1-8335-104599745997\") " pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:21:58.860109 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.859940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-5d865-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f904c9ba-5277-42f1-8335-104599745997-success-200-isvc-5d865-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f\" (UID: \"f904c9ba-5277-42f1-8335-104599745997\") " pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:21:58.860109 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:21:58.859961 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-5d865-predictor-serving-cert: secret "success-200-isvc-5d865-predictor-serving-cert" not found Apr 23 18:21:58.860109 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.859978 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93f60042-dc14-4c57-a360-01d9ac01b871-proxy-tls\") pod \"error-404-isvc-5d865-predictor-5bf4578b77-phhwj\" (UID: \"93f60042-dc14-4c57-a360-01d9ac01b871\") " pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:21:58.860109 
ip-10-0-143-131 kubenswrapper[2575]: E0423 18:21:58.860027 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904c9ba-5277-42f1-8335-104599745997-proxy-tls podName:f904c9ba-5277-42f1-8335-104599745997 nodeName:}" failed. No retries permitted until 2026-04-23 18:21:59.360004941 +0000 UTC m=+1706.905980287 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f904c9ba-5277-42f1-8335-104599745997-proxy-tls") pod "success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" (UID: "f904c9ba-5277-42f1-8335-104599745997") : secret "success-200-isvc-5d865-predictor-serving-cert" not found Apr 23 18:21:58.860109 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.860044 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgxfv\" (UniqueName: \"kubernetes.io/projected/93f60042-dc14-4c57-a360-01d9ac01b871-kube-api-access-zgxfv\") pod \"error-404-isvc-5d865-predictor-5bf4578b77-phhwj\" (UID: \"93f60042-dc14-4c57-a360-01d9ac01b871\") " pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:21:58.860329 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.860114 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-5d865-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/93f60042-dc14-4c57-a360-01d9ac01b871-error-404-isvc-5d865-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-5d865-predictor-5bf4578b77-phhwj\" (UID: \"93f60042-dc14-4c57-a360-01d9ac01b871\") " pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:21:58.860709 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.860682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-5d865-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/f904c9ba-5277-42f1-8335-104599745997-success-200-isvc-5d865-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f\" (UID: \"f904c9ba-5277-42f1-8335-104599745997\") " pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:21:58.868604 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.868575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql5xr\" (UniqueName: \"kubernetes.io/projected/f904c9ba-5277-42f1-8335-104599745997-kube-api-access-ql5xr\") pod \"success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f\" (UID: \"f904c9ba-5277-42f1-8335-104599745997\") " pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:21:58.961646 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.961602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93f60042-dc14-4c57-a360-01d9ac01b871-proxy-tls\") pod \"error-404-isvc-5d865-predictor-5bf4578b77-phhwj\" (UID: \"93f60042-dc14-4c57-a360-01d9ac01b871\") " pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:21:58.961646 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.961648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgxfv\" (UniqueName: \"kubernetes.io/projected/93f60042-dc14-4c57-a360-01d9ac01b871-kube-api-access-zgxfv\") pod \"error-404-isvc-5d865-predictor-5bf4578b77-phhwj\" (UID: \"93f60042-dc14-4c57-a360-01d9ac01b871\") " pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:21:58.961928 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.961688 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-5d865-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/93f60042-dc14-4c57-a360-01d9ac01b871-error-404-isvc-5d865-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-5d865-predictor-5bf4578b77-phhwj\" (UID: \"93f60042-dc14-4c57-a360-01d9ac01b871\") " pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:21:58.962415 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.962387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-5d865-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/93f60042-dc14-4c57-a360-01d9ac01b871-error-404-isvc-5d865-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-5d865-predictor-5bf4578b77-phhwj\" (UID: \"93f60042-dc14-4c57-a360-01d9ac01b871\") " pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:21:58.964347 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.964317 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93f60042-dc14-4c57-a360-01d9ac01b871-proxy-tls\") pod \"error-404-isvc-5d865-predictor-5bf4578b77-phhwj\" (UID: \"93f60042-dc14-4c57-a360-01d9ac01b871\") " pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:21:58.970717 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:58.970698 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgxfv\" (UniqueName: \"kubernetes.io/projected/93f60042-dc14-4c57-a360-01d9ac01b871-kube-api-access-zgxfv\") pod \"error-404-isvc-5d865-predictor-5bf4578b77-phhwj\" (UID: \"93f60042-dc14-4c57-a360-01d9ac01b871\") " pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:21:59.090352 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.090321 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:21:59.218346 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.218289 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj"] Apr 23 18:21:59.220550 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:21:59.220526 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93f60042_dc14_4c57_a360_01d9ac01b871.slice/crio-dae771d89753707d38f1384a56d08a9803d677747008166ad3fc8c049dd145fb WatchSource:0}: Error finding container dae771d89753707d38f1384a56d08a9803d677747008166ad3fc8c049dd145fb: Status 404 returned error can't find the container with id dae771d89753707d38f1384a56d08a9803d677747008166ad3fc8c049dd145fb Apr 23 18:21:59.222335 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.222319 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:21:59.365914 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.365842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f904c9ba-5277-42f1-8335-104599745997-proxy-tls\") pod \"success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f\" (UID: \"f904c9ba-5277-42f1-8335-104599745997\") " pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:21:59.368412 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.368389 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f904c9ba-5277-42f1-8335-104599745997-proxy-tls\") pod \"success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f\" (UID: \"f904c9ba-5277-42f1-8335-104599745997\") " pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:21:59.596275 ip-10-0-143-131 kubenswrapper[2575]: I0423 
18:21:59.596242 2575 generic.go:358] "Generic (PLEG): container finished" podID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerID="a40e2290424b48aa3c0ec135891475d0fec06379700e8526b0ca12eef014641b" exitCode=2 Apr 23 18:21:59.596668 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.596316 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" event={"ID":"459bf2e3-dad1-46f5-bc0c-8705379a964c","Type":"ContainerDied","Data":"a40e2290424b48aa3c0ec135891475d0fec06379700e8526b0ca12eef014641b"} Apr 23 18:21:59.597735 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.597712 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" event={"ID":"93f60042-dc14-4c57-a360-01d9ac01b871","Type":"ContainerStarted","Data":"ae356d408f369ab47a4bf11ad103f6d59665b22b8b79e429d5cb329157e5b2a1"} Apr 23 18:21:59.597851 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.597742 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" event={"ID":"93f60042-dc14-4c57-a360-01d9ac01b871","Type":"ContainerStarted","Data":"01a33f8ff16e50e646b66acd790c63b3d75b0797636d50fcbc4b18048c56df34"} Apr 23 18:21:59.597851 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.597755 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" event={"ID":"93f60042-dc14-4c57-a360-01d9ac01b871","Type":"ContainerStarted","Data":"dae771d89753707d38f1384a56d08a9803d677747008166ad3fc8c049dd145fb"} Apr 23 18:21:59.597851 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.597834 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:21:59.597851 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.597852 2575 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:21:59.599332 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.599302 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:21:59.599430 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.599388 2575 generic.go:358] "Generic (PLEG): container finished" podID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerID="bd57a190e543382f913f73225a80187c5d8800ae2cef2cc5520a846a33222430" exitCode=2 Apr 23 18:21:59.599482 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.599440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" event={"ID":"cd83c524-2cee-4a78-b741-74d2bed83cce","Type":"ContainerDied","Data":"bd57a190e543382f913f73225a80187c5d8800ae2cef2cc5520a846a33222430"} Apr 23 18:21:59.619290 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.619240 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" podStartSLOduration=1.6192258210000001 podStartE2EDuration="1.619225821s" podCreationTimestamp="2026-04-23 18:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:21:59.6185856 +0000 UTC m=+1707.164560969" watchObservedRunningTime="2026-04-23 18:21:59.619225821 +0000 UTC m=+1707.165201190" Apr 23 18:21:59.639927 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.639908 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:21:59.765585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:21:59.765537 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f"] Apr 23 18:21:59.767364 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:21:59.767319 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf904c9ba_5277_42f1_8335_104599745997.slice/crio-d13d0d1d68dbd71ad94af43344a335523ac0f428f03ecbe844aa20360c6cae58 WatchSource:0}: Error finding container d13d0d1d68dbd71ad94af43344a335523ac0f428f03ecbe844aa20360c6cae58: Status 404 returned error can't find the container with id d13d0d1d68dbd71ad94af43344a335523ac0f428f03ecbe844aa20360c6cae58 Apr 23 18:22:00.604106 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:00.604075 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" event={"ID":"f904c9ba-5277-42f1-8335-104599745997","Type":"ContainerStarted","Data":"39a1f6932057d5a77315859bb720a742895085b282406928de5712ebd0ca97be"} Apr 23 18:22:00.604106 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:00.604112 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" event={"ID":"f904c9ba-5277-42f1-8335-104599745997","Type":"ContainerStarted","Data":"e74d7157d058820d17e77936ca7aa935fbc4136a9924f4affe71cf86f1789bb1"} Apr 23 18:22:00.604595 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:00.604124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" event={"ID":"f904c9ba-5277-42f1-8335-104599745997","Type":"ContainerStarted","Data":"d13d0d1d68dbd71ad94af43344a335523ac0f428f03ecbe844aa20360c6cae58"} Apr 23 18:22:00.604595 
ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:00.604335 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:22:00.604595 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:00.604425 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:22:00.621870 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:00.621823 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" podStartSLOduration=2.621809677 podStartE2EDuration="2.621809677s" podCreationTimestamp="2026-04-23 18:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:22:00.619705454 +0000 UTC m=+1708.165680820" watchObservedRunningTime="2026-04-23 18:22:00.621809677 +0000 UTC m=+1708.167785045" Apr 23 18:22:01.607636 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:01.607603 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:22:01.608869 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:01.608838 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" podUID="f904c9ba-5277-42f1-8335-104599745997" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 18:22:01.689355 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:01.689308 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused" Apr 23 18:22:01.694014 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:01.693990 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 18:22:02.035731 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.035435 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" Apr 23 18:22:02.090432 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.090397 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-03167-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cd83c524-2cee-4a78-b741-74d2bed83cce-error-404-isvc-03167-kube-rbac-proxy-sar-config\") pod \"cd83c524-2cee-4a78-b741-74d2bed83cce\" (UID: \"cd83c524-2cee-4a78-b741-74d2bed83cce\") " Apr 23 18:22:02.090576 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.090500 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd83c524-2cee-4a78-b741-74d2bed83cce-proxy-tls\") pod \"cd83c524-2cee-4a78-b741-74d2bed83cce\" (UID: \"cd83c524-2cee-4a78-b741-74d2bed83cce\") " Apr 23 18:22:02.090576 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.090556 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88c4x\" (UniqueName: \"kubernetes.io/projected/cd83c524-2cee-4a78-b741-74d2bed83cce-kube-api-access-88c4x\") pod 
\"cd83c524-2cee-4a78-b741-74d2bed83cce\" (UID: \"cd83c524-2cee-4a78-b741-74d2bed83cce\") " Apr 23 18:22:02.090843 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.090811 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd83c524-2cee-4a78-b741-74d2bed83cce-error-404-isvc-03167-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-03167-kube-rbac-proxy-sar-config") pod "cd83c524-2cee-4a78-b741-74d2bed83cce" (UID: "cd83c524-2cee-4a78-b741-74d2bed83cce"). InnerVolumeSpecName "error-404-isvc-03167-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:22:02.093129 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.093105 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd83c524-2cee-4a78-b741-74d2bed83cce-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cd83c524-2cee-4a78-b741-74d2bed83cce" (UID: "cd83c524-2cee-4a78-b741-74d2bed83cce"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:22:02.093243 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.093106 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd83c524-2cee-4a78-b741-74d2bed83cce-kube-api-access-88c4x" (OuterVolumeSpecName: "kube-api-access-88c4x") pod "cd83c524-2cee-4a78-b741-74d2bed83cce" (UID: "cd83c524-2cee-4a78-b741-74d2bed83cce"). InnerVolumeSpecName "kube-api-access-88c4x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:22:02.140104 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.140080 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" Apr 23 18:22:02.191862 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.191826 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmjr6\" (UniqueName: \"kubernetes.io/projected/459bf2e3-dad1-46f5-bc0c-8705379a964c-kube-api-access-dmjr6\") pod \"459bf2e3-dad1-46f5-bc0c-8705379a964c\" (UID: \"459bf2e3-dad1-46f5-bc0c-8705379a964c\") " Apr 23 18:22:02.192018 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.191906 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-03167-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/459bf2e3-dad1-46f5-bc0c-8705379a964c-success-200-isvc-03167-kube-rbac-proxy-sar-config\") pod \"459bf2e3-dad1-46f5-bc0c-8705379a964c\" (UID: \"459bf2e3-dad1-46f5-bc0c-8705379a964c\") " Apr 23 18:22:02.192018 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.191931 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/459bf2e3-dad1-46f5-bc0c-8705379a964c-proxy-tls\") pod \"459bf2e3-dad1-46f5-bc0c-8705379a964c\" (UID: \"459bf2e3-dad1-46f5-bc0c-8705379a964c\") " Apr 23 18:22:02.192102 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.192079 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-88c4x\" (UniqueName: \"kubernetes.io/projected/cd83c524-2cee-4a78-b741-74d2bed83cce-kube-api-access-88c4x\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:22:02.192102 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.192091 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-03167-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cd83c524-2cee-4a78-b741-74d2bed83cce-error-404-isvc-03167-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 
23 18:22:02.192102 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.192101 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd83c524-2cee-4a78-b741-74d2bed83cce-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\""
Apr 23 18:22:02.192232 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.192208 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/459bf2e3-dad1-46f5-bc0c-8705379a964c-success-200-isvc-03167-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-03167-kube-rbac-proxy-sar-config") pod "459bf2e3-dad1-46f5-bc0c-8705379a964c" (UID: "459bf2e3-dad1-46f5-bc0c-8705379a964c"). InnerVolumeSpecName "success-200-isvc-03167-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:22:02.194231 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.194196 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/459bf2e3-dad1-46f5-bc0c-8705379a964c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "459bf2e3-dad1-46f5-bc0c-8705379a964c" (UID: "459bf2e3-dad1-46f5-bc0c-8705379a964c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:22:02.194231 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.194199 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/459bf2e3-dad1-46f5-bc0c-8705379a964c-kube-api-access-dmjr6" (OuterVolumeSpecName: "kube-api-access-dmjr6") pod "459bf2e3-dad1-46f5-bc0c-8705379a964c" (UID: "459bf2e3-dad1-46f5-bc0c-8705379a964c"). InnerVolumeSpecName "kube-api-access-dmjr6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:22:02.292816 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.292778 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-03167-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/459bf2e3-dad1-46f5-bc0c-8705379a964c-success-200-isvc-03167-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\""
Apr 23 18:22:02.292816 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.292809 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/459bf2e3-dad1-46f5-bc0c-8705379a964c-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\""
Apr 23 18:22:02.292816 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.292824 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dmjr6\" (UniqueName: \"kubernetes.io/projected/459bf2e3-dad1-46f5-bc0c-8705379a964c-kube-api-access-dmjr6\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\""
Apr 23 18:22:02.612026 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.611991 2575 generic.go:358] "Generic (PLEG): container finished" podID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerID="7d4d8fa2be6f82456776d4d2e86ca70f2691b07e23c42289423a50345da94be9" exitCode=0
Apr 23 18:22:02.612465 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.612059 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"
Apr 23 18:22:02.612465 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.612061 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" event={"ID":"cd83c524-2cee-4a78-b741-74d2bed83cce","Type":"ContainerDied","Data":"7d4d8fa2be6f82456776d4d2e86ca70f2691b07e23c42289423a50345da94be9"}
Apr 23 18:22:02.612465 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.612105 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9" event={"ID":"cd83c524-2cee-4a78-b741-74d2bed83cce","Type":"ContainerDied","Data":"18bc44c8b6a359e41898ff130174e64f5f4e2a26bdd9a8d988ac114f63785ea3"}
Apr 23 18:22:02.612465 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.612125 2575 scope.go:117] "RemoveContainer" containerID="bd57a190e543382f913f73225a80187c5d8800ae2cef2cc5520a846a33222430"
Apr 23 18:22:02.613650 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.613622 2575 generic.go:358] "Generic (PLEG): container finished" podID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerID="4baab0c761975d02c458112dc05c69dd1db71b12ecad999fcd34d352f24529c5" exitCode=0
Apr 23 18:22:02.613780 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.613658 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" event={"ID":"459bf2e3-dad1-46f5-bc0c-8705379a964c","Type":"ContainerDied","Data":"4baab0c761975d02c458112dc05c69dd1db71b12ecad999fcd34d352f24529c5"}
Apr 23 18:22:02.613780 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.613693 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd" event={"ID":"459bf2e3-dad1-46f5-bc0c-8705379a964c","Type":"ContainerDied","Data":"705fdcfaed449de322878b92dff5e7a4b9847a7aad929041e93e85b782d31cbc"}
Apr 23 18:22:02.613780 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.613702 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"
Apr 23 18:22:02.614384 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.614359 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" podUID="f904c9ba-5277-42f1-8335-104599745997" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 23 18:22:02.621185 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.621055 2575 scope.go:117] "RemoveContainer" containerID="7d4d8fa2be6f82456776d4d2e86ca70f2691b07e23c42289423a50345da94be9"
Apr 23 18:22:02.628490 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.628475 2575 scope.go:117] "RemoveContainer" containerID="bd57a190e543382f913f73225a80187c5d8800ae2cef2cc5520a846a33222430"
Apr 23 18:22:02.628713 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:22:02.628696 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd57a190e543382f913f73225a80187c5d8800ae2cef2cc5520a846a33222430\": container with ID starting with bd57a190e543382f913f73225a80187c5d8800ae2cef2cc5520a846a33222430 not found: ID does not exist" containerID="bd57a190e543382f913f73225a80187c5d8800ae2cef2cc5520a846a33222430"
Apr 23 18:22:02.628764 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.628720 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd57a190e543382f913f73225a80187c5d8800ae2cef2cc5520a846a33222430"} err="failed to get container status \"bd57a190e543382f913f73225a80187c5d8800ae2cef2cc5520a846a33222430\": rpc error: code = NotFound desc = could not find container \"bd57a190e543382f913f73225a80187c5d8800ae2cef2cc5520a846a33222430\": container with ID starting with bd57a190e543382f913f73225a80187c5d8800ae2cef2cc5520a846a33222430 not found: ID does not exist"
Apr 23 18:22:02.628764 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.628738 2575 scope.go:117] "RemoveContainer" containerID="7d4d8fa2be6f82456776d4d2e86ca70f2691b07e23c42289423a50345da94be9"
Apr 23 18:22:02.629009 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:22:02.628994 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d4d8fa2be6f82456776d4d2e86ca70f2691b07e23c42289423a50345da94be9\": container with ID starting with 7d4d8fa2be6f82456776d4d2e86ca70f2691b07e23c42289423a50345da94be9 not found: ID does not exist" containerID="7d4d8fa2be6f82456776d4d2e86ca70f2691b07e23c42289423a50345da94be9"
Apr 23 18:22:02.629068 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.629013 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d4d8fa2be6f82456776d4d2e86ca70f2691b07e23c42289423a50345da94be9"} err="failed to get container status \"7d4d8fa2be6f82456776d4d2e86ca70f2691b07e23c42289423a50345da94be9\": rpc error: code = NotFound desc = could not find container \"7d4d8fa2be6f82456776d4d2e86ca70f2691b07e23c42289423a50345da94be9\": container with ID starting with 7d4d8fa2be6f82456776d4d2e86ca70f2691b07e23c42289423a50345da94be9 not found: ID does not exist"
Apr 23 18:22:02.629068 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.629026 2575 scope.go:117] "RemoveContainer" containerID="a40e2290424b48aa3c0ec135891475d0fec06379700e8526b0ca12eef014641b"
Apr 23 18:22:02.642208 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.640432 2575 scope.go:117] "RemoveContainer" containerID="4baab0c761975d02c458112dc05c69dd1db71b12ecad999fcd34d352f24529c5"
Apr 23 18:22:02.643525 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.643505 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"]
Apr 23 18:22:02.645184 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.645165 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-03167-predictor-79d9b874d8-7mgnd"]
Apr 23 18:22:02.649469 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.649448 2575 scope.go:117] "RemoveContainer" containerID="a40e2290424b48aa3c0ec135891475d0fec06379700e8526b0ca12eef014641b"
Apr 23 18:22:02.649801 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:22:02.649780 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a40e2290424b48aa3c0ec135891475d0fec06379700e8526b0ca12eef014641b\": container with ID starting with a40e2290424b48aa3c0ec135891475d0fec06379700e8526b0ca12eef014641b not found: ID does not exist" containerID="a40e2290424b48aa3c0ec135891475d0fec06379700e8526b0ca12eef014641b"
Apr 23 18:22:02.649858 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.649808 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40e2290424b48aa3c0ec135891475d0fec06379700e8526b0ca12eef014641b"} err="failed to get container status \"a40e2290424b48aa3c0ec135891475d0fec06379700e8526b0ca12eef014641b\": rpc error: code = NotFound desc = could not find container \"a40e2290424b48aa3c0ec135891475d0fec06379700e8526b0ca12eef014641b\": container with ID starting with a40e2290424b48aa3c0ec135891475d0fec06379700e8526b0ca12eef014641b not found: ID does not exist"
Apr 23 18:22:02.649858 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.649826 2575 scope.go:117] "RemoveContainer" containerID="4baab0c761975d02c458112dc05c69dd1db71b12ecad999fcd34d352f24529c5"
Apr 23 18:22:02.650087 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:22:02.650071 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4baab0c761975d02c458112dc05c69dd1db71b12ecad999fcd34d352f24529c5\": container with ID starting with 4baab0c761975d02c458112dc05c69dd1db71b12ecad999fcd34d352f24529c5 not found: ID does not exist" containerID="4baab0c761975d02c458112dc05c69dd1db71b12ecad999fcd34d352f24529c5"
Apr 23 18:22:02.650129 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.650092 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4baab0c761975d02c458112dc05c69dd1db71b12ecad999fcd34d352f24529c5"} err="failed to get container status \"4baab0c761975d02c458112dc05c69dd1db71b12ecad999fcd34d352f24529c5\": rpc error: code = NotFound desc = could not find container \"4baab0c761975d02c458112dc05c69dd1db71b12ecad999fcd34d352f24529c5\": container with ID starting with 4baab0c761975d02c458112dc05c69dd1db71b12ecad999fcd34d352f24529c5 not found: ID does not exist"
Apr 23 18:22:02.653307 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.653285 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"]
Apr 23 18:22:02.656585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:02.656565 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-03167-predictor-659dbf79ff-tgwf9"]
Apr 23 18:22:03.062370 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:03.062330 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" path="/var/lib/kubelet/pods/459bf2e3-dad1-46f5-bc0c-8705379a964c/volumes"
Apr 23 18:22:03.062745 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:03.062729 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" path="/var/lib/kubelet/pods/cd83c524-2cee-4a78-b741-74d2bed83cce/volumes"
Apr 23 18:22:05.609533 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:05.609503 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj"
Apr 23 18:22:05.610038 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:05.610009 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 23 18:22:07.618524 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:07.618498 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f"
Apr 23 18:22:07.619114 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:07.619089 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" podUID="f904c9ba-5277-42f1-8335-104599745997" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 23 18:22:15.610401 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:15.610364 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 23 18:22:17.619140 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:17.619101 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" podUID="f904c9ba-5277-42f1-8335-104599745997" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 23 18:22:25.610657 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:25.610620 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 23 18:22:27.619651 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:27.619611 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" podUID="f904c9ba-5277-42f1-8335-104599745997" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 23 18:22:35.610444 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:35.610400 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 23 18:22:37.619715 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:37.619675 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" podUID="f904c9ba-5277-42f1-8335-104599745997" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 23 18:22:44.481147 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.481108 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"]
Apr 23 18:22:44.481594 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.481417 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kserve-container" containerID="cri-o://b7b6aeffe4054d45b01cd7e7aeee66196121c9d253065c2ad7335a83075065be" gracePeriod=30
Apr 23 18:22:44.481594 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.481458 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kube-rbac-proxy" containerID="cri-o://894c6900ea49d343e5690bf87548adc41e6bead81132b536eb9d719a40c92f68" gracePeriod=30
Apr 23 18:22:44.532993 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.532963 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"]
Apr 23 18:22:44.533344 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.533331 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kserve-container"
Apr 23 18:22:44.533550 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.533346 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kserve-container"
Apr 23 18:22:44.533550 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.533362 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kserve-container"
Apr 23 18:22:44.533550 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.533368 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kserve-container"
Apr 23 18:22:44.533550 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.533374 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kube-rbac-proxy"
Apr 23 18:22:44.533550 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.533379 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kube-rbac-proxy"
Apr 23 18:22:44.533550 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.533391 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kube-rbac-proxy"
Apr 23 18:22:44.533550 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.533396 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kube-rbac-proxy"
Apr 23 18:22:44.533550 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.533448 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kube-rbac-proxy"
Apr 23 18:22:44.533550 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.533457 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kserve-container"
Apr 23 18:22:44.533550 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.533467 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="459bf2e3-dad1-46f5-bc0c-8705379a964c" containerName="kserve-container"
Apr 23 18:22:44.533550 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.533473 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd83c524-2cee-4a78-b741-74d2bed83cce" containerName="kube-rbac-proxy"
Apr 23 18:22:44.537655 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.537636 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"
Apr 23 18:22:44.539645 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.539615 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-89ff7-predictor-serving-cert\""
Apr 23 18:22:44.539645 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.539628 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-89ff7-kube-rbac-proxy-sar-config\""
Apr 23 18:22:44.544575 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.542637 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"]
Apr 23 18:22:44.544575 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.543091 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kserve-container" containerID="cri-o://560a87529d9d52048a02a37039400c027138ee891533170669b685a46f9ef00b" gracePeriod=30
Apr 23 18:22:44.544575 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.543288 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kube-rbac-proxy" containerID="cri-o://4d9d4054ac0ff3604aeeb5daeb00f36e1de9c7c188e07790030f6c628b11fd96" gracePeriod=30
Apr 23 18:22:44.548056 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.548009 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"]
Apr 23 18:22:44.567827 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.567801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjjw6\" (UniqueName: \"kubernetes.io/projected/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-kube-api-access-bjjw6\") pod \"success-200-isvc-89ff7-predictor-55645f8766-rbr8g\" (UID: \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\") " pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"
Apr 23 18:22:44.567952 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.567901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-89ff7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-success-200-isvc-89ff7-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-89ff7-predictor-55645f8766-rbr8g\" (UID: \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\") " pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"
Apr 23 18:22:44.568004 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.567987 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-proxy-tls\") pod \"success-200-isvc-89ff7-predictor-55645f8766-rbr8g\" (UID: \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\") " pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"
Apr 23 18:22:44.661993 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.661959 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"]
Apr 23 18:22:44.665618 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.665594 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"
Apr 23 18:22:44.667738 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.667716 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-89ff7-predictor-serving-cert\""
Apr 23 18:22:44.667821 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.667736 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-89ff7-kube-rbac-proxy-sar-config\""
Apr 23 18:22:44.668354 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.668328 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-89ff7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-success-200-isvc-89ff7-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-89ff7-predictor-55645f8766-rbr8g\" (UID: \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\") " pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"
Apr 23 18:22:44.668455 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.668373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-proxy-tls\") pod \"success-200-isvc-89ff7-predictor-55645f8766-rbr8g\" (UID: \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\") " pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"
Apr 23 18:22:44.668455 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.668408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4f7x\" (UniqueName: \"kubernetes.io/projected/5227b4d9-66cc-41bb-959c-f3591b46323d-kube-api-access-f4f7x\") pod \"error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb\" (UID: \"5227b4d9-66cc-41bb-959c-f3591b46323d\") " pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"
Apr 23 18:22:44.668455 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.668429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-89ff7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5227b4d9-66cc-41bb-959c-f3591b46323d-error-404-isvc-89ff7-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb\" (UID: \"5227b4d9-66cc-41bb-959c-f3591b46323d\") " pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"
Apr 23 18:22:44.668455 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.668451 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjjw6\" (UniqueName: \"kubernetes.io/projected/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-kube-api-access-bjjw6\") pod \"success-200-isvc-89ff7-predictor-55645f8766-rbr8g\" (UID: \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\") " pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"
Apr 23 18:22:44.668657 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.668498 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5227b4d9-66cc-41bb-959c-f3591b46323d-proxy-tls\") pod \"error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb\" (UID: \"5227b4d9-66cc-41bb-959c-f3591b46323d\") " pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"
Apr 23 18:22:44.668657 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:22:44.668556 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-serving-cert: secret "success-200-isvc-89ff7-predictor-serving-cert" not found
Apr 23 18:22:44.668657 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:22:44.668612 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-proxy-tls podName:30ec9c27-aca5-42c9-ac1d-c7c522cdf327 nodeName:}" failed. No retries permitted until 2026-04-23 18:22:45.168598413 +0000 UTC m=+1752.714573758 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-proxy-tls") pod "success-200-isvc-89ff7-predictor-55645f8766-rbr8g" (UID: "30ec9c27-aca5-42c9-ac1d-c7c522cdf327") : secret "success-200-isvc-89ff7-predictor-serving-cert" not found
Apr 23 18:22:44.669070 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.669046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-89ff7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-success-200-isvc-89ff7-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-89ff7-predictor-55645f8766-rbr8g\" (UID: \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\") " pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"
Apr 23 18:22:44.675513 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.675489 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"]
Apr 23 18:22:44.683251 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.683230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjjw6\" (UniqueName: \"kubernetes.io/projected/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-kube-api-access-bjjw6\") pod \"success-200-isvc-89ff7-predictor-55645f8766-rbr8g\" (UID: \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\") " pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"
Apr 23 18:22:44.760499 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.760399 2575 generic.go:358] "Generic (PLEG): container finished" podID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerID="4d9d4054ac0ff3604aeeb5daeb00f36e1de9c7c188e07790030f6c628b11fd96" exitCode=2
Apr 23 18:22:44.760499 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.760482 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" event={"ID":"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b","Type":"ContainerDied","Data":"4d9d4054ac0ff3604aeeb5daeb00f36e1de9c7c188e07790030f6c628b11fd96"}
Apr 23 18:22:44.762367 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.762342 2575 generic.go:358] "Generic (PLEG): container finished" podID="63aab03a-546c-4970-b61d-5e334d73e842" containerID="894c6900ea49d343e5690bf87548adc41e6bead81132b536eb9d719a40c92f68" exitCode=2
Apr 23 18:22:44.762511 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.762398 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" event={"ID":"63aab03a-546c-4970-b61d-5e334d73e842","Type":"ContainerDied","Data":"894c6900ea49d343e5690bf87548adc41e6bead81132b536eb9d719a40c92f68"}
Apr 23 18:22:44.769111 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.769089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4f7x\" (UniqueName: \"kubernetes.io/projected/5227b4d9-66cc-41bb-959c-f3591b46323d-kube-api-access-f4f7x\") pod \"error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb\" (UID: \"5227b4d9-66cc-41bb-959c-f3591b46323d\") " pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"
Apr 23 18:22:44.769207 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.769119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-89ff7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5227b4d9-66cc-41bb-959c-f3591b46323d-error-404-isvc-89ff7-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb\" (UID: \"5227b4d9-66cc-41bb-959c-f3591b46323d\") " pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"
Apr 23 18:22:44.769207 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.769153 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5227b4d9-66cc-41bb-959c-f3591b46323d-proxy-tls\") pod \"error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb\" (UID: \"5227b4d9-66cc-41bb-959c-f3591b46323d\") " pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"
Apr 23 18:22:44.769781 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.769762 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-89ff7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5227b4d9-66cc-41bb-959c-f3591b46323d-error-404-isvc-89ff7-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb\" (UID: \"5227b4d9-66cc-41bb-959c-f3591b46323d\") " pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"
Apr 23 18:22:44.771759 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.771743 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5227b4d9-66cc-41bb-959c-f3591b46323d-proxy-tls\") pod \"error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb\" (UID: \"5227b4d9-66cc-41bb-959c-f3591b46323d\") " pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"
Apr 23 18:22:44.776983 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.776965 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4f7x\" (UniqueName: \"kubernetes.io/projected/5227b4d9-66cc-41bb-959c-f3591b46323d-kube-api-access-f4f7x\") pod \"error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb\" (UID: \"5227b4d9-66cc-41bb-959c-f3591b46323d\") " pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"
Apr 23 18:22:44.976776 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:44.976735 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"
Apr 23 18:22:45.109458 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.109387 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"]
Apr 23 18:22:45.112309 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:22:45.112279 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5227b4d9_66cc_41bb_959c_f3591b46323d.slice/crio-4b7eaf2ae11789a999ac4ffc1c77a3bc76d27f84043e7ea4be6425006de115c1 WatchSource:0}: Error finding container 4b7eaf2ae11789a999ac4ffc1c77a3bc76d27f84043e7ea4be6425006de115c1: Status 404 returned error can't find the container with id 4b7eaf2ae11789a999ac4ffc1c77a3bc76d27f84043e7ea4be6425006de115c1
Apr 23 18:22:45.172761 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.172735 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-proxy-tls\") pod \"success-200-isvc-89ff7-predictor-55645f8766-rbr8g\" (UID: \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\") " pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"
Apr 23 18:22:45.175146 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.175124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-proxy-tls\") pod \"success-200-isvc-89ff7-predictor-55645f8766-rbr8g\" (UID: \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\") " pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"
Apr 23 18:22:45.456018 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.455987 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"
Apr 23 18:22:45.592211 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.592171 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"]
Apr 23 18:22:45.594538 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:22:45.594508 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30ec9c27_aca5_42c9_ac1d_c7c522cdf327.slice/crio-21e764a6a30d44f741909b7188a955a78224f631cd7eb16b74a1dae768c08422 WatchSource:0}: Error finding container 21e764a6a30d44f741909b7188a955a78224f631cd7eb16b74a1dae768c08422: Status 404 returned error can't find the container with id 21e764a6a30d44f741909b7188a955a78224f631cd7eb16b74a1dae768c08422
Apr 23 18:22:45.611607 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.611587 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj"
Apr 23 18:22:45.768969 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.768932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" event={"ID":"5227b4d9-66cc-41bb-959c-f3591b46323d","Type":"ContainerStarted","Data":"3ca14810ec4ee79d66dc34cfd76b87e3e0408fd904ce449b66eef33be194e713"}
Apr 23 18:22:45.768969 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.768972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" event={"ID":"5227b4d9-66cc-41bb-959c-f3591b46323d","Type":"ContainerStarted","Data":"8a2727c541532cd880be8748634aeda3995fbf7b047624eaa551cf15481e0588"}
Apr 23 18:22:45.769209 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.768989 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" event={"ID":"5227b4d9-66cc-41bb-959c-f3591b46323d","Type":"ContainerStarted","Data":"4b7eaf2ae11789a999ac4ffc1c77a3bc76d27f84043e7ea4be6425006de115c1"}
Apr 23 18:22:45.769209 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.769051 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"
Apr 23 18:22:45.769209 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.769075 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"
Apr 23 18:22:45.770759 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.770726 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 18:22:45.770937 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.770804 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" event={"ID":"30ec9c27-aca5-42c9-ac1d-c7c522cdf327","Type":"ContainerStarted","Data":"06ea40db4db363c16ba85bf65641b224faafeb7460dd7167114d4a50fa9cddc3"}
Apr 23 18:22:45.770937 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.770835 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" event={"ID":"30ec9c27-aca5-42c9-ac1d-c7c522cdf327","Type":"ContainerStarted","Data":"dc1037c04e08496540e5d5470d48f027dbb5ec5f525582d4a4dd7b1a3146e8bd"}
Apr 23 18:22:45.770937 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.770844 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" event={"ID":"30ec9c27-aca5-42c9-ac1d-c7c522cdf327","Type":"ContainerStarted","Data":"21e764a6a30d44f741909b7188a955a78224f631cd7eb16b74a1dae768c08422"} Apr 23 18:22:45.770937 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.770910 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" Apr 23 18:22:45.791752 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.791708 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" podStartSLOduration=1.791696116 podStartE2EDuration="1.791696116s" podCreationTimestamp="2026-04-23 18:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:22:45.790136827 +0000 UTC m=+1753.336112195" watchObservedRunningTime="2026-04-23 18:22:45.791696116 +0000 UTC m=+1753.337671483" Apr 23 18:22:45.811622 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:45.811576 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" podStartSLOduration=1.811562699 podStartE2EDuration="1.811562699s" podCreationTimestamp="2026-04-23 18:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:22:45.810086789 +0000 UTC m=+1753.356062157" watchObservedRunningTime="2026-04-23 18:22:45.811562699 +0000 UTC m=+1753.357538065" Apr 23 18:22:46.774763 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:46.774718 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" Apr 23 18:22:46.775160 ip-10-0-143-131 kubenswrapper[2575]: I0423 
18:22:46.774906 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 18:22:46.776139 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:46.776113 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:22:47.620033 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:47.620006 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" Apr 23 18:22:47.778430 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:47.778391 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:22:47.863942 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:47.863876 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.33:8643/healthz\": dial tcp 10.133.0.33:8643: connect: connection refused" Apr 23 18:22:47.864139 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:47.863896 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" podUID="63aab03a-546c-4970-b61d-5e334d73e842" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 23 18:22:47.868194 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:47.868157 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:22:47.868354 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:47.868329 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 18:22:48.228992 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.228971 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" Apr 23 18:22:48.232019 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.232002 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" Apr 23 18:22:48.298766 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.298731 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-89c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-error-404-isvc-89c6b-kube-rbac-proxy-sar-config\") pod \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\" (UID: \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\") " Apr 23 18:22:48.298766 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.298767 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcrgq\" (UniqueName: \"kubernetes.io/projected/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-kube-api-access-gcrgq\") pod \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\" (UID: \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\") " Apr 23 18:22:48.299032 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.298790 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-89c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/63aab03a-546c-4970-b61d-5e334d73e842-success-200-isvc-89c6b-kube-rbac-proxy-sar-config\") pod \"63aab03a-546c-4970-b61d-5e334d73e842\" (UID: \"63aab03a-546c-4970-b61d-5e334d73e842\") " Apr 23 18:22:48.299032 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.298811 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-proxy-tls\") pod \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\" (UID: \"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b\") " Apr 23 18:22:48.299032 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.298831 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5w2j\" (UniqueName: 
\"kubernetes.io/projected/63aab03a-546c-4970-b61d-5e334d73e842-kube-api-access-q5w2j\") pod \"63aab03a-546c-4970-b61d-5e334d73e842\" (UID: \"63aab03a-546c-4970-b61d-5e334d73e842\") " Apr 23 18:22:48.299032 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.298901 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63aab03a-546c-4970-b61d-5e334d73e842-proxy-tls\") pod \"63aab03a-546c-4970-b61d-5e334d73e842\" (UID: \"63aab03a-546c-4970-b61d-5e334d73e842\") " Apr 23 18:22:48.299238 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.299173 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63aab03a-546c-4970-b61d-5e334d73e842-success-200-isvc-89c6b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-89c6b-kube-rbac-proxy-sar-config") pod "63aab03a-546c-4970-b61d-5e334d73e842" (UID: "63aab03a-546c-4970-b61d-5e334d73e842"). InnerVolumeSpecName "success-200-isvc-89c6b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:22:48.299238 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.299173 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-error-404-isvc-89c6b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-89c6b-kube-rbac-proxy-sar-config") pod "749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" (UID: "749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b"). InnerVolumeSpecName "error-404-isvc-89c6b-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:22:48.301237 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.301208 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63aab03a-546c-4970-b61d-5e334d73e842-kube-api-access-q5w2j" (OuterVolumeSpecName: "kube-api-access-q5w2j") pod "63aab03a-546c-4970-b61d-5e334d73e842" (UID: "63aab03a-546c-4970-b61d-5e334d73e842"). InnerVolumeSpecName "kube-api-access-q5w2j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:22:48.301237 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.301219 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" (UID: "749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:22:48.301430 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.301307 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-kube-api-access-gcrgq" (OuterVolumeSpecName: "kube-api-access-gcrgq") pod "749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" (UID: "749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b"). InnerVolumeSpecName "kube-api-access-gcrgq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:22:48.301697 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.301675 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63aab03a-546c-4970-b61d-5e334d73e842-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "63aab03a-546c-4970-b61d-5e334d73e842" (UID: "63aab03a-546c-4970-b61d-5e334d73e842"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:22:48.400101 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.400046 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-89c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-error-404-isvc-89c6b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:22:48.400101 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.400094 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gcrgq\" (UniqueName: \"kubernetes.io/projected/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-kube-api-access-gcrgq\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:22:48.400101 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.400106 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-89c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/63aab03a-546c-4970-b61d-5e334d73e842-success-200-isvc-89c6b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:22:48.400101 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.400116 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:22:48.400380 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.400126 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q5w2j\" (UniqueName: \"kubernetes.io/projected/63aab03a-546c-4970-b61d-5e334d73e842-kube-api-access-q5w2j\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:22:48.400380 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.400136 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/63aab03a-546c-4970-b61d-5e334d73e842-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:22:48.782801 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.782766 2575 generic.go:358] "Generic (PLEG): container finished" podID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerID="560a87529d9d52048a02a37039400c027138ee891533170669b685a46f9ef00b" exitCode=0 Apr 23 18:22:48.783265 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.782852 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" Apr 23 18:22:48.783265 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.782862 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" event={"ID":"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b","Type":"ContainerDied","Data":"560a87529d9d52048a02a37039400c027138ee891533170669b685a46f9ef00b"} Apr 23 18:22:48.783265 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.782933 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8" event={"ID":"749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b","Type":"ContainerDied","Data":"b292545803b860f6e0de3043cfdbca8bdbe543544602576b167f7a0484b5572d"} Apr 23 18:22:48.783265 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.782958 2575 scope.go:117] "RemoveContainer" containerID="4d9d4054ac0ff3604aeeb5daeb00f36e1de9c7c188e07790030f6c628b11fd96" Apr 23 18:22:48.784462 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.784437 2575 generic.go:358] "Generic (PLEG): container finished" podID="63aab03a-546c-4970-b61d-5e334d73e842" containerID="b7b6aeffe4054d45b01cd7e7aeee66196121c9d253065c2ad7335a83075065be" exitCode=0 Apr 23 18:22:48.784571 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.784534 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" event={"ID":"63aab03a-546c-4970-b61d-5e334d73e842","Type":"ContainerDied","Data":"b7b6aeffe4054d45b01cd7e7aeee66196121c9d253065c2ad7335a83075065be"} Apr 23 18:22:48.784571 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.784568 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" event={"ID":"63aab03a-546c-4970-b61d-5e334d73e842","Type":"ContainerDied","Data":"a4ee6d406bee5c44ad3217c19ed3fc23f8edc67b6a0edb51a2d53e602dfc30cc"} Apr 23 18:22:48.784656 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.784546 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4" Apr 23 18:22:48.798954 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.798927 2575 scope.go:117] "RemoveContainer" containerID="560a87529d9d52048a02a37039400c027138ee891533170669b685a46f9ef00b" Apr 23 18:22:48.806942 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.806919 2575 scope.go:117] "RemoveContainer" containerID="4d9d4054ac0ff3604aeeb5daeb00f36e1de9c7c188e07790030f6c628b11fd96" Apr 23 18:22:48.807199 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:22:48.807182 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d9d4054ac0ff3604aeeb5daeb00f36e1de9c7c188e07790030f6c628b11fd96\": container with ID starting with 4d9d4054ac0ff3604aeeb5daeb00f36e1de9c7c188e07790030f6c628b11fd96 not found: ID does not exist" containerID="4d9d4054ac0ff3604aeeb5daeb00f36e1de9c7c188e07790030f6c628b11fd96" Apr 23 18:22:48.807250 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.807208 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d9d4054ac0ff3604aeeb5daeb00f36e1de9c7c188e07790030f6c628b11fd96"} err="failed to get container status 
\"4d9d4054ac0ff3604aeeb5daeb00f36e1de9c7c188e07790030f6c628b11fd96\": rpc error: code = NotFound desc = could not find container \"4d9d4054ac0ff3604aeeb5daeb00f36e1de9c7c188e07790030f6c628b11fd96\": container with ID starting with 4d9d4054ac0ff3604aeeb5daeb00f36e1de9c7c188e07790030f6c628b11fd96 not found: ID does not exist" Apr 23 18:22:48.807250 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.807225 2575 scope.go:117] "RemoveContainer" containerID="560a87529d9d52048a02a37039400c027138ee891533170669b685a46f9ef00b" Apr 23 18:22:48.807438 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:22:48.807420 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560a87529d9d52048a02a37039400c027138ee891533170669b685a46f9ef00b\": container with ID starting with 560a87529d9d52048a02a37039400c027138ee891533170669b685a46f9ef00b not found: ID does not exist" containerID="560a87529d9d52048a02a37039400c027138ee891533170669b685a46f9ef00b" Apr 23 18:22:48.807476 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.807442 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560a87529d9d52048a02a37039400c027138ee891533170669b685a46f9ef00b"} err="failed to get container status \"560a87529d9d52048a02a37039400c027138ee891533170669b685a46f9ef00b\": rpc error: code = NotFound desc = could not find container \"560a87529d9d52048a02a37039400c027138ee891533170669b685a46f9ef00b\": container with ID starting with 560a87529d9d52048a02a37039400c027138ee891533170669b685a46f9ef00b not found: ID does not exist" Apr 23 18:22:48.807476 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.807455 2575 scope.go:117] "RemoveContainer" containerID="894c6900ea49d343e5690bf87548adc41e6bead81132b536eb9d719a40c92f68" Apr 23 18:22:48.817084 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.817052 2575 scope.go:117] "RemoveContainer" 
containerID="b7b6aeffe4054d45b01cd7e7aeee66196121c9d253065c2ad7335a83075065be" Apr 23 18:22:48.819431 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.819410 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"] Apr 23 18:22:48.825595 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.825571 2575 scope.go:117] "RemoveContainer" containerID="894c6900ea49d343e5690bf87548adc41e6bead81132b536eb9d719a40c92f68" Apr 23 18:22:48.825861 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:22:48.825839 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"894c6900ea49d343e5690bf87548adc41e6bead81132b536eb9d719a40c92f68\": container with ID starting with 894c6900ea49d343e5690bf87548adc41e6bead81132b536eb9d719a40c92f68 not found: ID does not exist" containerID="894c6900ea49d343e5690bf87548adc41e6bead81132b536eb9d719a40c92f68" Apr 23 18:22:48.825941 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.825872 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"894c6900ea49d343e5690bf87548adc41e6bead81132b536eb9d719a40c92f68"} err="failed to get container status \"894c6900ea49d343e5690bf87548adc41e6bead81132b536eb9d719a40c92f68\": rpc error: code = NotFound desc = could not find container \"894c6900ea49d343e5690bf87548adc41e6bead81132b536eb9d719a40c92f68\": container with ID starting with 894c6900ea49d343e5690bf87548adc41e6bead81132b536eb9d719a40c92f68 not found: ID does not exist" Apr 23 18:22:48.825941 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.825920 2575 scope.go:117] "RemoveContainer" containerID="b7b6aeffe4054d45b01cd7e7aeee66196121c9d253065c2ad7335a83075065be" Apr 23 18:22:48.826186 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:22:48.826169 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b7b6aeffe4054d45b01cd7e7aeee66196121c9d253065c2ad7335a83075065be\": container with ID starting with b7b6aeffe4054d45b01cd7e7aeee66196121c9d253065c2ad7335a83075065be not found: ID does not exist" containerID="b7b6aeffe4054d45b01cd7e7aeee66196121c9d253065c2ad7335a83075065be" Apr 23 18:22:48.826230 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.826192 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b6aeffe4054d45b01cd7e7aeee66196121c9d253065c2ad7335a83075065be"} err="failed to get container status \"b7b6aeffe4054d45b01cd7e7aeee66196121c9d253065c2ad7335a83075065be\": rpc error: code = NotFound desc = could not find container \"b7b6aeffe4054d45b01cd7e7aeee66196121c9d253065c2ad7335a83075065be\": container with ID starting with b7b6aeffe4054d45b01cd7e7aeee66196121c9d253065c2ad7335a83075065be not found: ID does not exist" Apr 23 18:22:48.831197 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.831166 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89c6b-predictor-dc9b564c7-qwvv8"] Apr 23 18:22:48.844580 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.844556 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"] Apr 23 18:22:48.854661 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:48.854638 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-89c6b-predictor-9f5f766-8s6w4"] Apr 23 18:22:49.061947 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:49.061856 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63aab03a-546c-4970-b61d-5e334d73e842" path="/var/lib/kubelet/pods/63aab03a-546c-4970-b61d-5e334d73e842/volumes" Apr 23 18:22:49.062546 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:49.062523 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" 
path="/var/lib/kubelet/pods/749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b/volumes" Apr 23 18:22:51.778679 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:51.778651 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" Apr 23 18:22:51.779265 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:51.779236 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 18:22:52.783211 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:52.783185 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" Apr 23 18:22:52.783639 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:22:52.783609 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:23:01.780046 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:01.780003 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 18:23:02.783873 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:02.783831 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.36:8080: connect: connection refused" Apr 23 18:23:08.863367 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.863240 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj"] Apr 23 18:23:08.863787 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.863635 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kserve-container" containerID="cri-o://01a33f8ff16e50e646b66acd790c63b3d75b0797636d50fcbc4b18048c56df34" gracePeriod=30 Apr 23 18:23:08.863865 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.863781 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kube-rbac-proxy" containerID="cri-o://ae356d408f369ab47a4bf11ad103f6d59665b22b8b79e429d5cb329157e5b2a1" gracePeriod=30 Apr 23 18:23:08.918354 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.918318 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f"] Apr 23 18:23:08.918751 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.918695 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" podUID="f904c9ba-5277-42f1-8335-104599745997" containerName="kserve-container" containerID="cri-o://e74d7157d058820d17e77936ca7aa935fbc4136a9924f4affe71cf86f1789bb1" gracePeriod=30 Apr 23 18:23:08.918751 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.918712 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" podUID="f904c9ba-5277-42f1-8335-104599745997" 
containerName="kube-rbac-proxy" containerID="cri-o://39a1f6932057d5a77315859bb720a742895085b282406928de5712ebd0ca97be" gracePeriod=30 Apr 23 18:23:08.944484 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.944459 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9"] Apr 23 18:23:08.944819 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.944807 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kube-rbac-proxy" Apr 23 18:23:08.944857 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.944821 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kube-rbac-proxy" Apr 23 18:23:08.944857 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.944834 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kserve-container" Apr 23 18:23:08.944857 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.944840 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kserve-container" Apr 23 18:23:08.944857 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.944847 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kube-rbac-proxy" Apr 23 18:23:08.944857 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.944853 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kube-rbac-proxy" Apr 23 18:23:08.945047 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.944875 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kserve-container" Apr 23 18:23:08.945047 ip-10-0-143-131 kubenswrapper[2575]: I0423 
18:23:08.944912 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kserve-container" Apr 23 18:23:08.945047 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.944966 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kube-rbac-proxy" Apr 23 18:23:08.945047 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.944975 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kserve-container" Apr 23 18:23:08.945047 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.944982 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="749e09bb-9d8f-4c6f-bf1c-12b7d3f4515b" containerName="kserve-container" Apr 23 18:23:08.945047 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.944988 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="63aab03a-546c-4970-b61d-5e334d73e842" containerName="kube-rbac-proxy" Apr 23 18:23:08.949512 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.949495 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:23:08.951556 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.951521 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-83978-predictor-serving-cert\"" Apr 23 18:23:08.951638 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.951539 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-83978-kube-rbac-proxy-sar-config\"" Apr 23 18:23:08.957217 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:08.957192 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9"] Apr 23 18:23:09.045087 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.045052 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj"] Apr 23 18:23:09.048526 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.048509 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:23:09.050522 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.050505 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-83978-predictor-serving-cert\"" Apr 23 18:23:09.050602 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.050520 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-83978-kube-rbac-proxy-sar-config\"" Apr 23 18:23:09.062902 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.062861 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj"] Apr 23 18:23:09.089426 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.089397 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80140f60-c462-4ad3-9313-f9bee99889bf-proxy-tls\") pod \"success-200-isvc-83978-predictor-6788576868-g8bz9\" (UID: \"80140f60-c462-4ad3-9313-f9bee99889bf\") " pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:23:09.089426 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.089434 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-83978-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/80140f60-c462-4ad3-9313-f9bee99889bf-success-200-isvc-83978-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-83978-predictor-6788576868-g8bz9\" (UID: \"80140f60-c462-4ad3-9313-f9bee99889bf\") " pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:23:09.089638 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.089462 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lz6pg\" (UniqueName: \"kubernetes.io/projected/80140f60-c462-4ad3-9313-f9bee99889bf-kube-api-access-lz6pg\") pod \"success-200-isvc-83978-predictor-6788576868-g8bz9\" (UID: \"80140f60-c462-4ad3-9313-f9bee99889bf\") " pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:23:09.190927 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.190866 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80140f60-c462-4ad3-9313-f9bee99889bf-proxy-tls\") pod \"success-200-isvc-83978-predictor-6788576868-g8bz9\" (UID: \"80140f60-c462-4ad3-9313-f9bee99889bf\") " pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:23:09.191113 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.190950 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzpzv\" (UniqueName: \"kubernetes.io/projected/f49c56ec-2bbb-4583-ae1f-9c83bd399228-kube-api-access-kzpzv\") pod \"error-404-isvc-83978-predictor-775f4c4947-v4grj\" (UID: \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\") " pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:23:09.191113 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.190990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-83978-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/80140f60-c462-4ad3-9313-f9bee99889bf-success-200-isvc-83978-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-83978-predictor-6788576868-g8bz9\" (UID: \"80140f60-c462-4ad3-9313-f9bee99889bf\") " pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:23:09.191113 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.191047 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lz6pg\" (UniqueName: 
\"kubernetes.io/projected/80140f60-c462-4ad3-9313-f9bee99889bf-kube-api-access-lz6pg\") pod \"success-200-isvc-83978-predictor-6788576868-g8bz9\" (UID: \"80140f60-c462-4ad3-9313-f9bee99889bf\") " pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:23:09.191113 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.191075 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f49c56ec-2bbb-4583-ae1f-9c83bd399228-proxy-tls\") pod \"error-404-isvc-83978-predictor-775f4c4947-v4grj\" (UID: \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\") " pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:23:09.191345 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.191153 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-83978-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f49c56ec-2bbb-4583-ae1f-9c83bd399228-error-404-isvc-83978-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-83978-predictor-775f4c4947-v4grj\" (UID: \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\") " pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:23:09.191915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.191868 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-83978-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/80140f60-c462-4ad3-9313-f9bee99889bf-success-200-isvc-83978-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-83978-predictor-6788576868-g8bz9\" (UID: \"80140f60-c462-4ad3-9313-f9bee99889bf\") " pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:23:09.193458 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.193435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/80140f60-c462-4ad3-9313-f9bee99889bf-proxy-tls\") pod \"success-200-isvc-83978-predictor-6788576868-g8bz9\" (UID: \"80140f60-c462-4ad3-9313-f9bee99889bf\") " pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:23:09.200221 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.200006 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz6pg\" (UniqueName: \"kubernetes.io/projected/80140f60-c462-4ad3-9313-f9bee99889bf-kube-api-access-lz6pg\") pod \"success-200-isvc-83978-predictor-6788576868-g8bz9\" (UID: \"80140f60-c462-4ad3-9313-f9bee99889bf\") " pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:23:09.261020 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.260973 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:23:09.292183 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.292154 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-83978-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f49c56ec-2bbb-4583-ae1f-9c83bd399228-error-404-isvc-83978-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-83978-predictor-775f4c4947-v4grj\" (UID: \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\") " pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:23:09.292348 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.292249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzpzv\" (UniqueName: \"kubernetes.io/projected/f49c56ec-2bbb-4583-ae1f-9c83bd399228-kube-api-access-kzpzv\") pod \"error-404-isvc-83978-predictor-775f4c4947-v4grj\" (UID: \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\") " pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:23:09.292348 ip-10-0-143-131 
kubenswrapper[2575]: I0423 18:23:09.292291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f49c56ec-2bbb-4583-ae1f-9c83bd399228-proxy-tls\") pod \"error-404-isvc-83978-predictor-775f4c4947-v4grj\" (UID: \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\") " pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:23:09.292826 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.292805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-83978-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f49c56ec-2bbb-4583-ae1f-9c83bd399228-error-404-isvc-83978-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-83978-predictor-775f4c4947-v4grj\" (UID: \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\") " pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:23:09.294970 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.294949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f49c56ec-2bbb-4583-ae1f-9c83bd399228-proxy-tls\") pod \"error-404-isvc-83978-predictor-775f4c4947-v4grj\" (UID: \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\") " pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:23:09.301066 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.301044 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzpzv\" (UniqueName: \"kubernetes.io/projected/f49c56ec-2bbb-4583-ae1f-9c83bd399228-kube-api-access-kzpzv\") pod \"error-404-isvc-83978-predictor-775f4c4947-v4grj\" (UID: \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\") " pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:23:09.360448 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.360413 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:23:09.391460 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.391434 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9"] Apr 23 18:23:09.394715 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:23:09.394687 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80140f60_c462_4ad3_9313_f9bee99889bf.slice/crio-3eb339035125544d13c6f17e2eaa3619cb176d724a23660e943ba261f18737d1 WatchSource:0}: Error finding container 3eb339035125544d13c6f17e2eaa3619cb176d724a23660e943ba261f18737d1: Status 404 returned error can't find the container with id 3eb339035125544d13c6f17e2eaa3619cb176d724a23660e943ba261f18737d1 Apr 23 18:23:09.500596 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.500571 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj"] Apr 23 18:23:09.517261 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:23:09.517231 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49c56ec_2bbb_4583_ae1f_9c83bd399228.slice/crio-e45527932ab761a722f5b0eb941fcfa1d86b406de3394a8671e49ba4f4ba6bd5 WatchSource:0}: Error finding container e45527932ab761a722f5b0eb941fcfa1d86b406de3394a8671e49ba4f4ba6bd5: Status 404 returned error can't find the container with id e45527932ab761a722f5b0eb941fcfa1d86b406de3394a8671e49ba4f4ba6bd5 Apr 23 18:23:09.860218 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.860104 2575 generic.go:358] "Generic (PLEG): container finished" podID="93f60042-dc14-4c57-a360-01d9ac01b871" containerID="ae356d408f369ab47a4bf11ad103f6d59665b22b8b79e429d5cb329157e5b2a1" exitCode=2 Apr 23 18:23:09.860218 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.860194 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" event={"ID":"93f60042-dc14-4c57-a360-01d9ac01b871","Type":"ContainerDied","Data":"ae356d408f369ab47a4bf11ad103f6d59665b22b8b79e429d5cb329157e5b2a1"} Apr 23 18:23:09.862707 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.862673 2575 generic.go:358] "Generic (PLEG): container finished" podID="f904c9ba-5277-42f1-8335-104599745997" containerID="39a1f6932057d5a77315859bb720a742895085b282406928de5712ebd0ca97be" exitCode=2 Apr 23 18:23:09.863178 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.862903 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" event={"ID":"f904c9ba-5277-42f1-8335-104599745997","Type":"ContainerDied","Data":"39a1f6932057d5a77315859bb720a742895085b282406928de5712ebd0ca97be"} Apr 23 18:23:09.867012 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.866963 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" event={"ID":"f49c56ec-2bbb-4583-ae1f-9c83bd399228","Type":"ContainerStarted","Data":"3c45b0fb8ed84344c0de97097d93605afdfcfa346d02e037acfc632e97567ee4"} Apr 23 18:23:09.867394 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.867027 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:23:09.867394 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.867046 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:23:09.867394 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.867057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" 
event={"ID":"f49c56ec-2bbb-4583-ae1f-9c83bd399228","Type":"ContainerStarted","Data":"dc2e0823da203a7dc520e959949af619611aa7ffd3d69774115d93e5a374a96f"} Apr 23 18:23:09.867394 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.867071 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" event={"ID":"f49c56ec-2bbb-4583-ae1f-9c83bd399228","Type":"ContainerStarted","Data":"e45527932ab761a722f5b0eb941fcfa1d86b406de3394a8671e49ba4f4ba6bd5"} Apr 23 18:23:09.867992 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.867961 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 18:23:09.869029 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.869004 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" event={"ID":"80140f60-c462-4ad3-9313-f9bee99889bf","Type":"ContainerStarted","Data":"3983b329912c85215a5430de04304b10f1add9b79648dba9ff7997eb57de6d03"} Apr 23 18:23:09.869103 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.869034 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" event={"ID":"80140f60-c462-4ad3-9313-f9bee99889bf","Type":"ContainerStarted","Data":"38b3646e1c5ca81304c6e8aaf6ddbbd779cd7f4331cfa1e6b93173304aefb42e"} Apr 23 18:23:09.869103 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.869044 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" event={"ID":"80140f60-c462-4ad3-9313-f9bee99889bf","Type":"ContainerStarted","Data":"3eb339035125544d13c6f17e2eaa3619cb176d724a23660e943ba261f18737d1"} Apr 23 
18:23:09.869250 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.869231 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:23:09.882621 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.882563 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" podStartSLOduration=0.882545562 podStartE2EDuration="882.545562ms" podCreationTimestamp="2026-04-23 18:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:23:09.882066487 +0000 UTC m=+1777.428041867" watchObservedRunningTime="2026-04-23 18:23:09.882545562 +0000 UTC m=+1777.428520931" Apr 23 18:23:09.898789 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:09.898737 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" podStartSLOduration=1.8987215210000001 podStartE2EDuration="1.898721521s" podCreationTimestamp="2026-04-23 18:23:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:23:09.897363299 +0000 UTC m=+1777.443338701" watchObservedRunningTime="2026-04-23 18:23:09.898721521 +0000 UTC m=+1777.444696887" Apr 23 18:23:10.604656 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:10.604612 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.35:8643/healthz\": dial tcp 10.133.0.35:8643: connect: connection refused" Apr 23 18:23:10.873291 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:10.873196 2575 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 18:23:10.873707 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:10.873378 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:23:10.874760 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:10.874730 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:23:11.779299 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:11.779260 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 18:23:11.876376 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:11.876332 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:23:12.316497 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.316472 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:23:12.420214 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.420123 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgxfv\" (UniqueName: \"kubernetes.io/projected/93f60042-dc14-4c57-a360-01d9ac01b871-kube-api-access-zgxfv\") pod \"93f60042-dc14-4c57-a360-01d9ac01b871\" (UID: \"93f60042-dc14-4c57-a360-01d9ac01b871\") " Apr 23 18:23:12.420214 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.420185 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-5d865-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/93f60042-dc14-4c57-a360-01d9ac01b871-error-404-isvc-5d865-kube-rbac-proxy-sar-config\") pod \"93f60042-dc14-4c57-a360-01d9ac01b871\" (UID: \"93f60042-dc14-4c57-a360-01d9ac01b871\") " Apr 23 18:23:12.420522 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.420227 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93f60042-dc14-4c57-a360-01d9ac01b871-proxy-tls\") pod \"93f60042-dc14-4c57-a360-01d9ac01b871\" (UID: \"93f60042-dc14-4c57-a360-01d9ac01b871\") " Apr 23 18:23:12.420577 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.420552 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f60042-dc14-4c57-a360-01d9ac01b871-error-404-isvc-5d865-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-5d865-kube-rbac-proxy-sar-config") pod "93f60042-dc14-4c57-a360-01d9ac01b871" (UID: "93f60042-dc14-4c57-a360-01d9ac01b871"). InnerVolumeSpecName "error-404-isvc-5d865-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:23:12.422426 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.422394 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f60042-dc14-4c57-a360-01d9ac01b871-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "93f60042-dc14-4c57-a360-01d9ac01b871" (UID: "93f60042-dc14-4c57-a360-01d9ac01b871"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:23:12.422538 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.422485 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f60042-dc14-4c57-a360-01d9ac01b871-kube-api-access-zgxfv" (OuterVolumeSpecName: "kube-api-access-zgxfv") pod "93f60042-dc14-4c57-a360-01d9ac01b871" (UID: "93f60042-dc14-4c57-a360-01d9ac01b871"). InnerVolumeSpecName "kube-api-access-zgxfv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:23:12.521439 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.521395 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zgxfv\" (UniqueName: \"kubernetes.io/projected/93f60042-dc14-4c57-a360-01d9ac01b871-kube-api-access-zgxfv\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:23:12.521439 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.521434 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-5d865-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/93f60042-dc14-4c57-a360-01d9ac01b871-error-404-isvc-5d865-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:23:12.521633 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.521450 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93f60042-dc14-4c57-a360-01d9ac01b871-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 
18:23:12.615099 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.615046 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" podUID="f904c9ba-5277-42f1-8335-104599745997" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.34:8643/healthz\": dial tcp 10.133.0.34:8643: connect: connection refused" Apr 23 18:23:12.784250 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.784155 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:23:12.883900 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.883855 2575 generic.go:358] "Generic (PLEG): container finished" podID="93f60042-dc14-4c57-a360-01d9ac01b871" containerID="01a33f8ff16e50e646b66acd790c63b3d75b0797636d50fcbc4b18048c56df34" exitCode=0 Apr 23 18:23:12.884206 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.883936 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" Apr 23 18:23:12.884206 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.883928 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" event={"ID":"93f60042-dc14-4c57-a360-01d9ac01b871","Type":"ContainerDied","Data":"01a33f8ff16e50e646b66acd790c63b3d75b0797636d50fcbc4b18048c56df34"} Apr 23 18:23:12.884206 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.884052 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj" event={"ID":"93f60042-dc14-4c57-a360-01d9ac01b871","Type":"ContainerDied","Data":"dae771d89753707d38f1384a56d08a9803d677747008166ad3fc8c049dd145fb"} Apr 23 18:23:12.884206 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.884072 2575 scope.go:117] "RemoveContainer" containerID="ae356d408f369ab47a4bf11ad103f6d59665b22b8b79e429d5cb329157e5b2a1" Apr 23 18:23:12.929704 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.929678 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj"] Apr 23 18:23:12.931310 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.931296 2575 scope.go:117] "RemoveContainer" containerID="01a33f8ff16e50e646b66acd790c63b3d75b0797636d50fcbc4b18048c56df34" Apr 23 18:23:12.934541 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.934512 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5d865-predictor-5bf4578b77-phhwj"] Apr 23 18:23:12.940673 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.940653 2575 scope.go:117] "RemoveContainer" containerID="ae356d408f369ab47a4bf11ad103f6d59665b22b8b79e429d5cb329157e5b2a1" Apr 23 18:23:12.941010 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:23:12.940985 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"ae356d408f369ab47a4bf11ad103f6d59665b22b8b79e429d5cb329157e5b2a1\": container with ID starting with ae356d408f369ab47a4bf11ad103f6d59665b22b8b79e429d5cb329157e5b2a1 not found: ID does not exist" containerID="ae356d408f369ab47a4bf11ad103f6d59665b22b8b79e429d5cb329157e5b2a1"
Apr 23 18:23:12.941082 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.941020 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae356d408f369ab47a4bf11ad103f6d59665b22b8b79e429d5cb329157e5b2a1"} err="failed to get container status \"ae356d408f369ab47a4bf11ad103f6d59665b22b8b79e429d5cb329157e5b2a1\": rpc error: code = NotFound desc = could not find container \"ae356d408f369ab47a4bf11ad103f6d59665b22b8b79e429d5cb329157e5b2a1\": container with ID starting with ae356d408f369ab47a4bf11ad103f6d59665b22b8b79e429d5cb329157e5b2a1 not found: ID does not exist"
Apr 23 18:23:12.941082 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.941040 2575 scope.go:117] "RemoveContainer" containerID="01a33f8ff16e50e646b66acd790c63b3d75b0797636d50fcbc4b18048c56df34"
Apr 23 18:23:12.941331 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:23:12.941311 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01a33f8ff16e50e646b66acd790c63b3d75b0797636d50fcbc4b18048c56df34\": container with ID starting with 01a33f8ff16e50e646b66acd790c63b3d75b0797636d50fcbc4b18048c56df34 not found: ID does not exist" containerID="01a33f8ff16e50e646b66acd790c63b3d75b0797636d50fcbc4b18048c56df34"
Apr 23 18:23:12.941373 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:12.941346 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01a33f8ff16e50e646b66acd790c63b3d75b0797636d50fcbc4b18048c56df34"} err="failed to get container status \"01a33f8ff16e50e646b66acd790c63b3d75b0797636d50fcbc4b18048c56df34\": rpc error: code = NotFound desc = could not find container \"01a33f8ff16e50e646b66acd790c63b3d75b0797636d50fcbc4b18048c56df34\": container with ID starting with 01a33f8ff16e50e646b66acd790c63b3d75b0797636d50fcbc4b18048c56df34 not found: ID does not exist"
Apr 23 18:23:13.060217 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.059852 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f"
Apr 23 18:23:13.063051 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.063020 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" path="/var/lib/kubelet/pods/93f60042-dc14-4c57-a360-01d9ac01b871/volumes"
Apr 23 18:23:13.126208 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.126172 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-5d865-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f904c9ba-5277-42f1-8335-104599745997-success-200-isvc-5d865-kube-rbac-proxy-sar-config\") pod \"f904c9ba-5277-42f1-8335-104599745997\" (UID: \"f904c9ba-5277-42f1-8335-104599745997\") "
Apr 23 18:23:13.126344 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.126237 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql5xr\" (UniqueName: \"kubernetes.io/projected/f904c9ba-5277-42f1-8335-104599745997-kube-api-access-ql5xr\") pod \"f904c9ba-5277-42f1-8335-104599745997\" (UID: \"f904c9ba-5277-42f1-8335-104599745997\") "
Apr 23 18:23:13.126459 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.126438 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f904c9ba-5277-42f1-8335-104599745997-proxy-tls\") pod \"f904c9ba-5277-42f1-8335-104599745997\" (UID: \"f904c9ba-5277-42f1-8335-104599745997\") "
Apr 23 18:23:13.126546 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.126523 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f904c9ba-5277-42f1-8335-104599745997-success-200-isvc-5d865-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-5d865-kube-rbac-proxy-sar-config") pod "f904c9ba-5277-42f1-8335-104599745997" (UID: "f904c9ba-5277-42f1-8335-104599745997"). InnerVolumeSpecName "success-200-isvc-5d865-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:23:13.126719 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.126693 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-5d865-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f904c9ba-5277-42f1-8335-104599745997-success-200-isvc-5d865-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\""
Apr 23 18:23:13.128386 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.128364 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f904c9ba-5277-42f1-8335-104599745997-kube-api-access-ql5xr" (OuterVolumeSpecName: "kube-api-access-ql5xr") pod "f904c9ba-5277-42f1-8335-104599745997" (UID: "f904c9ba-5277-42f1-8335-104599745997"). InnerVolumeSpecName "kube-api-access-ql5xr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:23:13.128478 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.128461 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f904c9ba-5277-42f1-8335-104599745997-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f904c9ba-5277-42f1-8335-104599745997" (UID: "f904c9ba-5277-42f1-8335-104599745997"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:23:13.227764 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.227731 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ql5xr\" (UniqueName: \"kubernetes.io/projected/f904c9ba-5277-42f1-8335-104599745997-kube-api-access-ql5xr\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\""
Apr 23 18:23:13.227764 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.227762 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f904c9ba-5277-42f1-8335-104599745997-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\""
Apr 23 18:23:13.889393 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.889358 2575 generic.go:358] "Generic (PLEG): container finished" podID="f904c9ba-5277-42f1-8335-104599745997" containerID="e74d7157d058820d17e77936ca7aa935fbc4136a9924f4affe71cf86f1789bb1" exitCode=0
Apr 23 18:23:13.889801 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.889408 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" event={"ID":"f904c9ba-5277-42f1-8335-104599745997","Type":"ContainerDied","Data":"e74d7157d058820d17e77936ca7aa935fbc4136a9924f4affe71cf86f1789bb1"}
Apr 23 18:23:13.889801 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.889436 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f"
Apr 23 18:23:13.889801 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.889453 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f" event={"ID":"f904c9ba-5277-42f1-8335-104599745997","Type":"ContainerDied","Data":"d13d0d1d68dbd71ad94af43344a335523ac0f428f03ecbe844aa20360c6cae58"}
Apr 23 18:23:13.889801 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.889474 2575 scope.go:117] "RemoveContainer" containerID="39a1f6932057d5a77315859bb720a742895085b282406928de5712ebd0ca97be"
Apr 23 18:23:13.903418 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.903298 2575 scope.go:117] "RemoveContainer" containerID="e74d7157d058820d17e77936ca7aa935fbc4136a9924f4affe71cf86f1789bb1"
Apr 23 18:23:13.910539 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.910517 2575 scope.go:117] "RemoveContainer" containerID="39a1f6932057d5a77315859bb720a742895085b282406928de5712ebd0ca97be"
Apr 23 18:23:13.910791 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:23:13.910775 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39a1f6932057d5a77315859bb720a742895085b282406928de5712ebd0ca97be\": container with ID starting with 39a1f6932057d5a77315859bb720a742895085b282406928de5712ebd0ca97be not found: ID does not exist" containerID="39a1f6932057d5a77315859bb720a742895085b282406928de5712ebd0ca97be"
Apr 23 18:23:13.910842 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.910798 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a1f6932057d5a77315859bb720a742895085b282406928de5712ebd0ca97be"} err="failed to get container status \"39a1f6932057d5a77315859bb720a742895085b282406928de5712ebd0ca97be\": rpc error: code = NotFound desc = could not find container \"39a1f6932057d5a77315859bb720a742895085b282406928de5712ebd0ca97be\": container with ID starting with 39a1f6932057d5a77315859bb720a742895085b282406928de5712ebd0ca97be not found: ID does not exist"
Apr 23 18:23:13.910842 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.910813 2575 scope.go:117] "RemoveContainer" containerID="e74d7157d058820d17e77936ca7aa935fbc4136a9924f4affe71cf86f1789bb1"
Apr 23 18:23:13.911049 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:23:13.911028 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e74d7157d058820d17e77936ca7aa935fbc4136a9924f4affe71cf86f1789bb1\": container with ID starting with e74d7157d058820d17e77936ca7aa935fbc4136a9924f4affe71cf86f1789bb1 not found: ID does not exist" containerID="e74d7157d058820d17e77936ca7aa935fbc4136a9924f4affe71cf86f1789bb1"
Apr 23 18:23:13.911112 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.911058 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e74d7157d058820d17e77936ca7aa935fbc4136a9924f4affe71cf86f1789bb1"} err="failed to get container status \"e74d7157d058820d17e77936ca7aa935fbc4136a9924f4affe71cf86f1789bb1\": rpc error: code = NotFound desc = could not find container \"e74d7157d058820d17e77936ca7aa935fbc4136a9924f4affe71cf86f1789bb1\": container with ID starting with e74d7157d058820d17e77936ca7aa935fbc4136a9924f4affe71cf86f1789bb1 not found: ID does not exist"
Apr 23 18:23:13.914272 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.914250 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f"]
Apr 23 18:23:13.917977 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:13.917956 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5d865-predictor-6cddf5c7f8-6dm2f"]
Apr 23 18:23:15.063746 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:15.063706 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f904c9ba-5277-42f1-8335-104599745997" path="/var/lib/kubelet/pods/f904c9ba-5277-42f1-8335-104599745997/volumes"
Apr 23 18:23:15.878081 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:15.878057 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj"
Apr 23 18:23:15.878645 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:15.878620 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 18:23:16.880575 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:16.880538 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9"
Apr 23 18:23:16.881115 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:16.881090 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 23 18:23:21.779171 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:21.779128 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 18:23:22.784385 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:22.784349 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 23 18:23:25.879193 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:25.879150 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 18:23:26.881676 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:26.881631 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 23 18:23:31.779870 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:31.779823 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 18:23:32.783968 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:32.783928 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 23 18:23:33.081789 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:33.081697 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log"
Apr 23 18:23:33.087236 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:33.087210 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log"
Apr 23 18:23:35.879469 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:35.879424 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 18:23:36.881784 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:36.881739 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 23 18:23:41.779707 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:41.779679 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"
Apr 23 18:23:42.784858 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:42.784827 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"
Apr 23 18:23:45.878570 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:45.878527 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 18:23:46.881397 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:46.881361 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 23 18:23:55.879454 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:55.879416 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 18:23:56.882033 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:23:56.882002 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9"
Apr 23 18:24:04.740329 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.740288 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"]
Apr 23 18:24:04.740710 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.740590 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kserve-container" containerID="cri-o://dc1037c04e08496540e5d5470d48f027dbb5ec5f525582d4a4dd7b1a3146e8bd" gracePeriod=30
Apr 23 18:24:04.740710 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.740664 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kube-rbac-proxy" containerID="cri-o://06ea40db4db363c16ba85bf65641b224faafeb7460dd7167114d4a50fa9cddc3" gracePeriod=30
Apr 23 18:24:04.779159 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.779124 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"]
Apr 23 18:24:04.779528 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.779516 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kube-rbac-proxy"
Apr 23 18:24:04.779571 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.779530 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kube-rbac-proxy"
Apr 23 18:24:04.779571 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.779539 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f904c9ba-5277-42f1-8335-104599745997" containerName="kserve-container"
Apr 23 18:24:04.779571 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.779544 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f904c9ba-5277-42f1-8335-104599745997" containerName="kserve-container"
Apr 23 18:24:04.779571 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.779559 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kserve-container"
Apr 23 18:24:04.779571 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.779564 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kserve-container"
Apr 23 18:24:04.779733 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.779576 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f904c9ba-5277-42f1-8335-104599745997" containerName="kube-rbac-proxy"
Apr 23 18:24:04.779733 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.779581 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f904c9ba-5277-42f1-8335-104599745997" containerName="kube-rbac-proxy"
Apr 23 18:24:04.779733 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.779637 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kserve-container"
Apr 23 18:24:04.779733 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.779644 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="93f60042-dc14-4c57-a360-01d9ac01b871" containerName="kube-rbac-proxy"
Apr 23 18:24:04.779733 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.779651 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f904c9ba-5277-42f1-8335-104599745997" containerName="kserve-container"
Apr 23 18:24:04.779733 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.779662 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f904c9ba-5277-42f1-8335-104599745997" containerName="kube-rbac-proxy"
Apr 23 18:24:04.783044 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.783019 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"
Apr 23 18:24:04.787482 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.787275 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-02052-predictor-serving-cert\""
Apr 23 18:24:04.787614 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.787521 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-02052-kube-rbac-proxy-sar-config\""
Apr 23 18:24:04.793402 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.792600 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"]
Apr 23 18:24:04.815854 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.815818 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"]
Apr 23 18:24:04.816273 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.816218 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kserve-container" containerID="cri-o://8a2727c541532cd880be8748634aeda3995fbf7b047624eaa551cf15481e0588" gracePeriod=30
Apr 23 18:24:04.816402 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.816273 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kube-rbac-proxy" containerID="cri-o://3ca14810ec4ee79d66dc34cfd76b87e3e0408fd904ce449b66eef33be194e713" gracePeriod=30
Apr 23 18:24:04.871312 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.871280 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"]
Apr 23 18:24:04.874649 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.874632 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"
Apr 23 18:24:04.876506 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.876489 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-02052-predictor-serving-cert\""
Apr 23 18:24:04.876589 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.876513 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-02052-kube-rbac-proxy-sar-config\""
Apr 23 18:24:04.883505 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.883480 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"]
Apr 23 18:24:04.884724 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.884702 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-02052-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c39b169b-ef78-41d1-a7ac-de0e3599683c-success-200-isvc-02052-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-02052-predictor-7f966b6f9b-7bm79\" (UID: \"c39b169b-ef78-41d1-a7ac-de0e3599683c\") " pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"
Apr 23 18:24:04.884837 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.884779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c39b169b-ef78-41d1-a7ac-de0e3599683c-proxy-tls\") pod \"success-200-isvc-02052-predictor-7f966b6f9b-7bm79\" (UID: \"c39b169b-ef78-41d1-a7ac-de0e3599683c\") " pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"
Apr 23 18:24:04.884928 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.884899 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sls78\" (UniqueName: \"kubernetes.io/projected/c39b169b-ef78-41d1-a7ac-de0e3599683c-kube-api-access-sls78\") pod \"success-200-isvc-02052-predictor-7f966b6f9b-7bm79\" (UID: \"c39b169b-ef78-41d1-a7ac-de0e3599683c\") " pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"
Apr 23 18:24:04.986143 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.986104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sls78\" (UniqueName: \"kubernetes.io/projected/c39b169b-ef78-41d1-a7ac-de0e3599683c-kube-api-access-sls78\") pod \"success-200-isvc-02052-predictor-7f966b6f9b-7bm79\" (UID: \"c39b169b-ef78-41d1-a7ac-de0e3599683c\") " pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"
Apr 23 18:24:04.986332 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.986158 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-02052-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c39b169b-ef78-41d1-a7ac-de0e3599683c-success-200-isvc-02052-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-02052-predictor-7f966b6f9b-7bm79\" (UID: \"c39b169b-ef78-41d1-a7ac-de0e3599683c\") " pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"
Apr 23 18:24:04.986332 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.986201 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a04b6319-9573-4718-96c5-c45ad07a9a8f-proxy-tls\") pod \"error-404-isvc-02052-predictor-7ff4bf9679-cdttd\" (UID: \"a04b6319-9573-4718-96c5-c45ad07a9a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"
Apr 23 18:24:04.986332 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.986218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp5x2\" (UniqueName: \"kubernetes.io/projected/a04b6319-9573-4718-96c5-c45ad07a9a8f-kube-api-access-qp5x2\") pod \"error-404-isvc-02052-predictor-7ff4bf9679-cdttd\" (UID: \"a04b6319-9573-4718-96c5-c45ad07a9a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"
Apr 23 18:24:04.986332 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.986268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c39b169b-ef78-41d1-a7ac-de0e3599683c-proxy-tls\") pod \"success-200-isvc-02052-predictor-7f966b6f9b-7bm79\" (UID: \"c39b169b-ef78-41d1-a7ac-de0e3599683c\") " pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"
Apr 23 18:24:04.986530 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.986338 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-02052-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a04b6319-9573-4718-96c5-c45ad07a9a8f-error-404-isvc-02052-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-02052-predictor-7ff4bf9679-cdttd\" (UID: \"a04b6319-9573-4718-96c5-c45ad07a9a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"
Apr 23 18:24:04.986931 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.986901 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-02052-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c39b169b-ef78-41d1-a7ac-de0e3599683c-success-200-isvc-02052-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-02052-predictor-7f966b6f9b-7bm79\" (UID: \"c39b169b-ef78-41d1-a7ac-de0e3599683c\") " pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"
Apr 23 18:24:04.988744 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.988718 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c39b169b-ef78-41d1-a7ac-de0e3599683c-proxy-tls\") pod \"success-200-isvc-02052-predictor-7f966b6f9b-7bm79\" (UID: \"c39b169b-ef78-41d1-a7ac-de0e3599683c\") " pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"
Apr 23 18:24:04.994556 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:04.994507 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sls78\" (UniqueName: \"kubernetes.io/projected/c39b169b-ef78-41d1-a7ac-de0e3599683c-kube-api-access-sls78\") pod \"success-200-isvc-02052-predictor-7f966b6f9b-7bm79\" (UID: \"c39b169b-ef78-41d1-a7ac-de0e3599683c\") " pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"
Apr 23 18:24:05.069480 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.069441 2575 generic.go:358] "Generic (PLEG): container finished" podID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerID="3ca14810ec4ee79d66dc34cfd76b87e3e0408fd904ce449b66eef33be194e713" exitCode=2
Apr 23 18:24:05.069670 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.069505 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" event={"ID":"5227b4d9-66cc-41bb-959c-f3591b46323d","Type":"ContainerDied","Data":"3ca14810ec4ee79d66dc34cfd76b87e3e0408fd904ce449b66eef33be194e713"}
Apr 23 18:24:05.071077 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.071056 2575 generic.go:358] "Generic (PLEG): container finished" podID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerID="06ea40db4db363c16ba85bf65641b224faafeb7460dd7167114d4a50fa9cddc3" exitCode=2
Apr 23 18:24:05.071186 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.071107 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" event={"ID":"30ec9c27-aca5-42c9-ac1d-c7c522cdf327","Type":"ContainerDied","Data":"06ea40db4db363c16ba85bf65641b224faafeb7460dd7167114d4a50fa9cddc3"}
Apr 23 18:24:05.087583 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.087562 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-02052-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a04b6319-9573-4718-96c5-c45ad07a9a8f-error-404-isvc-02052-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-02052-predictor-7ff4bf9679-cdttd\" (UID: \"a04b6319-9573-4718-96c5-c45ad07a9a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"
Apr 23 18:24:05.087682 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.087619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a04b6319-9573-4718-96c5-c45ad07a9a8f-proxy-tls\") pod \"error-404-isvc-02052-predictor-7ff4bf9679-cdttd\" (UID: \"a04b6319-9573-4718-96c5-c45ad07a9a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"
Apr 23 18:24:05.087751 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.087734 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp5x2\" (UniqueName: \"kubernetes.io/projected/a04b6319-9573-4718-96c5-c45ad07a9a8f-kube-api-access-qp5x2\") pod \"error-404-isvc-02052-predictor-7ff4bf9679-cdttd\" (UID: \"a04b6319-9573-4718-96c5-c45ad07a9a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"
Apr 23 18:24:05.088222 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.088200 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-02052-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a04b6319-9573-4718-96c5-c45ad07a9a8f-error-404-isvc-02052-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-02052-predictor-7ff4bf9679-cdttd\" (UID: \"a04b6319-9573-4718-96c5-c45ad07a9a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"
Apr 23 18:24:05.090222 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.090203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a04b6319-9573-4718-96c5-c45ad07a9a8f-proxy-tls\") pod \"error-404-isvc-02052-predictor-7ff4bf9679-cdttd\" (UID: \"a04b6319-9573-4718-96c5-c45ad07a9a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"
Apr 23 18:24:05.096039 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.096015 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp5x2\" (UniqueName: \"kubernetes.io/projected/a04b6319-9573-4718-96c5-c45ad07a9a8f-kube-api-access-qp5x2\") pod \"error-404-isvc-02052-predictor-7ff4bf9679-cdttd\" (UID: \"a04b6319-9573-4718-96c5-c45ad07a9a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"
Apr 23 18:24:05.096716 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.096692 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"
Apr 23 18:24:05.186164 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.186120 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"
Apr 23 18:24:05.239918 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.239064 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"]
Apr 23 18:24:05.242641 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:24:05.242608 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc39b169b_ef78_41d1_a7ac_de0e3599683c.slice/crio-4e41d36f6ef42a21b90533b2f932550a3735093707038662d24fc7bff207f639 WatchSource:0}: Error finding container 4e41d36f6ef42a21b90533b2f932550a3735093707038662d24fc7bff207f639: Status 404 returned error can't find the container with id 4e41d36f6ef42a21b90533b2f932550a3735093707038662d24fc7bff207f639
Apr 23 18:24:05.371251 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.371146 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"]
Apr 23 18:24:05.390085 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:24:05.390044 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda04b6319_9573_4718_96c5_c45ad07a9a8f.slice/crio-3277b63d3d61bcef1a11d9d14c39daeecce987b61458bdde045e27b7bd2a73a5 WatchSource:0}: Error finding container 3277b63d3d61bcef1a11d9d14c39daeecce987b61458bdde045e27b7bd2a73a5: Status 404 returned error can't find the container with id 3277b63d3d61bcef1a11d9d14c39daeecce987b61458bdde045e27b7bd2a73a5
Apr 23 18:24:05.879116 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:05.879034 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj"
Apr 23 18:24:06.076364 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:06.076322 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" event={"ID":"c39b169b-ef78-41d1-a7ac-de0e3599683c","Type":"ContainerStarted","Data":"bafddf275f156ee8b5096b21ec69d2d05cfb14a194301e9a732fc71ed578b1fc"}
Apr 23 18:24:06.076532 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:06.076371 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" event={"ID":"c39b169b-ef78-41d1-a7ac-de0e3599683c","Type":"ContainerStarted","Data":"0ac627c61741a6b281f85397fbc2656ea25d6282c559f058258e8fb66fc8dc27"}
Apr 23 18:24:06.076532 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:06.076390 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" event={"ID":"c39b169b-ef78-41d1-a7ac-de0e3599683c","Type":"ContainerStarted","Data":"4e41d36f6ef42a21b90533b2f932550a3735093707038662d24fc7bff207f639"}
Apr 23 18:24:06.076688 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:06.076573 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"
Apr 23 18:24:06.076746 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:06.076701 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"
Apr 23 18:24:06.077987 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:06.077959 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" event={"ID":"a04b6319-9573-4718-96c5-c45ad07a9a8f","Type":"ContainerStarted","Data":"a4e8ad19a03d98f67a33d441d665df11283b294c125d85320094bf9e5cdcb349"}
Apr 23 18:24:06.078114 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:06.077991 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" event={"ID":"a04b6319-9573-4718-96c5-c45ad07a9a8f","Type":"ContainerStarted","Data":"d776827b2121e7d90bee610ac8f4c2ce66a598d4b650365e6313b046cdfd6b2a"}
Apr 23 18:24:06.078114 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:06.078014 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" event={"ID":"a04b6319-9573-4718-96c5-c45ad07a9a8f","Type":"ContainerStarted","Data":"3277b63d3d61bcef1a11d9d14c39daeecce987b61458bdde045e27b7bd2a73a5"}
Apr 23 18:24:06.078201 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:06.078119 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"
Apr 23 18:24:06.078399 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:06.078378 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 23 18:24:06.092999 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:06.092951 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" podStartSLOduration=2.092936693 podStartE2EDuration="2.092936693s" podCreationTimestamp="2026-04-23 18:24:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:24:06.091934837 +0000 UTC m=+1833.637910203" watchObservedRunningTime="2026-04-23 18:24:06.092936693 +0000 UTC m=+1833.638912061"
Apr 23 18:24:06.108199 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:06.108158 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" podStartSLOduration=2.108145551 podStartE2EDuration="2.108145551s" podCreationTimestamp="2026-04-23 18:24:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:24:06.106210434 +0000 UTC m=+1833.652185801" watchObservedRunningTime="2026-04-23 18:24:06.108145551 +0000 UTC m=+1833.654120918" Apr 23 18:24:06.775374 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:06.775335 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused" Apr 23 18:24:07.081579 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:07.081478 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 18:24:07.082010 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:07.081660 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" Apr 23 18:24:07.082854 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:07.082827 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:24:07.779045 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:07.778988 2575 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.36:8643/healthz\": dial tcp 10.133.0.36:8643: connect: connection refused" Apr 23 18:24:08.087757 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:08.087727 2575 generic.go:358] "Generic (PLEG): container finished" podID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerID="8a2727c541532cd880be8748634aeda3995fbf7b047624eaa551cf15481e0588" exitCode=0 Apr 23 18:24:08.088178 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:08.087758 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" event={"ID":"5227b4d9-66cc-41bb-959c-f3591b46323d","Type":"ContainerDied","Data":"8a2727c541532cd880be8748634aeda3995fbf7b047624eaa551cf15481e0588"} Apr 23 18:24:08.088254 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:08.088166 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:24:08.167564 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:08.167538 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" Apr 23 18:24:08.320134 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:08.320092 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4f7x\" (UniqueName: \"kubernetes.io/projected/5227b4d9-66cc-41bb-959c-f3591b46323d-kube-api-access-f4f7x\") pod \"5227b4d9-66cc-41bb-959c-f3591b46323d\" (UID: \"5227b4d9-66cc-41bb-959c-f3591b46323d\") " Apr 23 18:24:08.320307 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:08.320145 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-89ff7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5227b4d9-66cc-41bb-959c-f3591b46323d-error-404-isvc-89ff7-kube-rbac-proxy-sar-config\") pod \"5227b4d9-66cc-41bb-959c-f3591b46323d\" (UID: \"5227b4d9-66cc-41bb-959c-f3591b46323d\") " Apr 23 18:24:08.320307 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:08.320291 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5227b4d9-66cc-41bb-959c-f3591b46323d-proxy-tls\") pod \"5227b4d9-66cc-41bb-959c-f3591b46323d\" (UID: \"5227b4d9-66cc-41bb-959c-f3591b46323d\") " Apr 23 18:24:08.320578 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:08.320546 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5227b4d9-66cc-41bb-959c-f3591b46323d-error-404-isvc-89ff7-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-89ff7-kube-rbac-proxy-sar-config") pod "5227b4d9-66cc-41bb-959c-f3591b46323d" (UID: "5227b4d9-66cc-41bb-959c-f3591b46323d"). InnerVolumeSpecName "error-404-isvc-89ff7-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:24:08.322501 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:08.322477 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5227b4d9-66cc-41bb-959c-f3591b46323d-kube-api-access-f4f7x" (OuterVolumeSpecName: "kube-api-access-f4f7x") pod "5227b4d9-66cc-41bb-959c-f3591b46323d" (UID: "5227b4d9-66cc-41bb-959c-f3591b46323d"). InnerVolumeSpecName "kube-api-access-f4f7x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:24:08.322570 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:08.322538 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5227b4d9-66cc-41bb-959c-f3591b46323d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5227b4d9-66cc-41bb-959c-f3591b46323d" (UID: "5227b4d9-66cc-41bb-959c-f3591b46323d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:24:08.421042 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:08.421011 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5227b4d9-66cc-41bb-959c-f3591b46323d-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:24:08.421042 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:08.421041 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f4f7x\" (UniqueName: \"kubernetes.io/projected/5227b4d9-66cc-41bb-959c-f3591b46323d-kube-api-access-f4f7x\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:24:08.421231 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:08.421053 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-89ff7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5227b4d9-66cc-41bb-959c-f3591b46323d-error-404-isvc-89ff7-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 
18:24:08.887920 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:08.886425 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" Apr 23 18:24:09.027516 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.027416 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-89ff7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-success-200-isvc-89ff7-kube-rbac-proxy-sar-config\") pod \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\" (UID: \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\") " Apr 23 18:24:09.027516 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.027473 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjjw6\" (UniqueName: \"kubernetes.io/projected/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-kube-api-access-bjjw6\") pod \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\" (UID: \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\") " Apr 23 18:24:09.027516 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.027511 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-proxy-tls\") pod \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\" (UID: \"30ec9c27-aca5-42c9-ac1d-c7c522cdf327\") " Apr 23 18:24:09.027905 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.027850 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-success-200-isvc-89ff7-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-89ff7-kube-rbac-proxy-sar-config") pod "30ec9c27-aca5-42c9-ac1d-c7c522cdf327" (UID: "30ec9c27-aca5-42c9-ac1d-c7c522cdf327"). InnerVolumeSpecName "success-200-isvc-89ff7-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:24:09.029770 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.029738 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "30ec9c27-aca5-42c9-ac1d-c7c522cdf327" (UID: "30ec9c27-aca5-42c9-ac1d-c7c522cdf327"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:24:09.029770 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.029752 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-kube-api-access-bjjw6" (OuterVolumeSpecName: "kube-api-access-bjjw6") pod "30ec9c27-aca5-42c9-ac1d-c7c522cdf327" (UID: "30ec9c27-aca5-42c9-ac1d-c7c522cdf327"). InnerVolumeSpecName "kube-api-access-bjjw6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:24:09.092991 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.092939 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" event={"ID":"5227b4d9-66cc-41bb-959c-f3591b46323d","Type":"ContainerDied","Data":"4b7eaf2ae11789a999ac4ffc1c77a3bc76d27f84043e7ea4be6425006de115c1"} Apr 23 18:24:09.092991 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.092972 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb" Apr 23 18:24:09.093445 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.093000 2575 scope.go:117] "RemoveContainer" containerID="3ca14810ec4ee79d66dc34cfd76b87e3e0408fd904ce449b66eef33be194e713" Apr 23 18:24:09.095060 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.095030 2575 generic.go:358] "Generic (PLEG): container finished" podID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerID="dc1037c04e08496540e5d5470d48f027dbb5ec5f525582d4a4dd7b1a3146e8bd" exitCode=0 Apr 23 18:24:09.095182 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.095082 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" event={"ID":"30ec9c27-aca5-42c9-ac1d-c7c522cdf327","Type":"ContainerDied","Data":"dc1037c04e08496540e5d5470d48f027dbb5ec5f525582d4a4dd7b1a3146e8bd"} Apr 23 18:24:09.095182 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.095124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" event={"ID":"30ec9c27-aca5-42c9-ac1d-c7c522cdf327","Type":"ContainerDied","Data":"21e764a6a30d44f741909b7188a955a78224f631cd7eb16b74a1dae768c08422"} Apr 23 18:24:09.095304 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.095244 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g" Apr 23 18:24:09.108102 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.108069 2575 scope.go:117] "RemoveContainer" containerID="8a2727c541532cd880be8748634aeda3995fbf7b047624eaa551cf15481e0588" Apr 23 18:24:09.112318 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.112291 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"] Apr 23 18:24:09.117062 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.117034 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89ff7-predictor-56ff5477b5-w7ffb"] Apr 23 18:24:09.117862 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.117843 2575 scope.go:117] "RemoveContainer" containerID="06ea40db4db363c16ba85bf65641b224faafeb7460dd7167114d4a50fa9cddc3" Apr 23 18:24:09.125964 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.125943 2575 scope.go:117] "RemoveContainer" containerID="dc1037c04e08496540e5d5470d48f027dbb5ec5f525582d4a4dd7b1a3146e8bd" Apr 23 18:24:09.128564 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.128435 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-89ff7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-success-200-isvc-89ff7-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:24:09.128564 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.128466 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bjjw6\" (UniqueName: \"kubernetes.io/projected/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-kube-api-access-bjjw6\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:24:09.128564 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.128482 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/30ec9c27-aca5-42c9-ac1d-c7c522cdf327-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:24:09.130162 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.130140 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"] Apr 23 18:24:09.134214 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.134190 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-89ff7-predictor-55645f8766-rbr8g"] Apr 23 18:24:09.134701 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.134690 2575 scope.go:117] "RemoveContainer" containerID="06ea40db4db363c16ba85bf65641b224faafeb7460dd7167114d4a50fa9cddc3" Apr 23 18:24:09.135062 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:24:09.135038 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ea40db4db363c16ba85bf65641b224faafeb7460dd7167114d4a50fa9cddc3\": container with ID starting with 06ea40db4db363c16ba85bf65641b224faafeb7460dd7167114d4a50fa9cddc3 not found: ID does not exist" containerID="06ea40db4db363c16ba85bf65641b224faafeb7460dd7167114d4a50fa9cddc3" Apr 23 18:24:09.135151 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.135070 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ea40db4db363c16ba85bf65641b224faafeb7460dd7167114d4a50fa9cddc3"} err="failed to get container status \"06ea40db4db363c16ba85bf65641b224faafeb7460dd7167114d4a50fa9cddc3\": rpc error: code = NotFound desc = could not find container \"06ea40db4db363c16ba85bf65641b224faafeb7460dd7167114d4a50fa9cddc3\": container with ID starting with 06ea40db4db363c16ba85bf65641b224faafeb7460dd7167114d4a50fa9cddc3 not found: ID does not exist" Apr 23 18:24:09.135151 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.135090 2575 scope.go:117] "RemoveContainer" 
containerID="dc1037c04e08496540e5d5470d48f027dbb5ec5f525582d4a4dd7b1a3146e8bd" Apr 23 18:24:09.135327 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:24:09.135312 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc1037c04e08496540e5d5470d48f027dbb5ec5f525582d4a4dd7b1a3146e8bd\": container with ID starting with dc1037c04e08496540e5d5470d48f027dbb5ec5f525582d4a4dd7b1a3146e8bd not found: ID does not exist" containerID="dc1037c04e08496540e5d5470d48f027dbb5ec5f525582d4a4dd7b1a3146e8bd" Apr 23 18:24:09.135367 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:09.135330 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc1037c04e08496540e5d5470d48f027dbb5ec5f525582d4a4dd7b1a3146e8bd"} err="failed to get container status \"dc1037c04e08496540e5d5470d48f027dbb5ec5f525582d4a4dd7b1a3146e8bd\": rpc error: code = NotFound desc = could not find container \"dc1037c04e08496540e5d5470d48f027dbb5ec5f525582d4a4dd7b1a3146e8bd\": container with ID starting with dc1037c04e08496540e5d5470d48f027dbb5ec5f525582d4a4dd7b1a3146e8bd not found: ID does not exist" Apr 23 18:24:11.062164 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:11.062122 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" path="/var/lib/kubelet/pods/30ec9c27-aca5-42c9-ac1d-c7c522cdf327/volumes" Apr 23 18:24:11.062652 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:11.062633 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" path="/var/lib/kubelet/pods/5227b4d9-66cc-41bb-959c-f3591b46323d/volumes" Apr 23 18:24:12.086165 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:12.086138 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" Apr 23 18:24:12.086728 ip-10-0-143-131 
kubenswrapper[2575]: I0423 18:24:12.086700 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 18:24:13.092957 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:13.092926 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" Apr 23 18:24:13.093564 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:13.093534 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:24:22.087556 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:22.087507 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 18:24:23.094000 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:23.093959 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:24:32.086655 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:32.086607 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 18:24:33.093458 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:33.093416 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:24:42.087407 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:42.087318 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 18:24:43.093975 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:43.093936 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:24:52.086762 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:52.086710 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 18:24:53.094955 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:24:53.094922 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" Apr 23 18:25:02.088011 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:25:02.087978 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" Apr 23 18:28:33.105992 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:28:33.105960 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 18:28:33.112062 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:28:33.112026 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 18:32:23.907294 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:23.907216 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9"] Apr 23 18:32:23.907821 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:23.907482 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kserve-container" containerID="cri-o://38b3646e1c5ca81304c6e8aaf6ddbbd779cd7f4331cfa1e6b93173304aefb42e" gracePeriod=30 Apr 23 18:32:23.907821 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:23.907526 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kube-rbac-proxy" containerID="cri-o://3983b329912c85215a5430de04304b10f1add9b79648dba9ff7997eb57de6d03" gracePeriod=30 Apr 23 18:32:23.990928 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:23.990892 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj"] Apr 23 18:32:23.991198 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:23.991173 2575 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kserve-container" containerID="cri-o://dc2e0823da203a7dc520e959949af619611aa7ffd3d69774115d93e5a374a96f" gracePeriod=30 Apr 23 18:32:23.991279 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:23.991227 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kube-rbac-proxy" containerID="cri-o://3c45b0fb8ed84344c0de97097d93605afdfcfa346d02e037acfc632e97567ee4" gracePeriod=30 Apr 23 18:32:24.031164 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.031131 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr"] Apr 23 18:32:24.031504 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.031492 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kserve-container" Apr 23 18:32:24.031558 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.031505 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kserve-container" Apr 23 18:32:24.031558 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.031514 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kserve-container" Apr 23 18:32:24.031558 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.031520 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kserve-container" Apr 23 18:32:24.031558 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.031528 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" 
containerName="kube-rbac-proxy" Apr 23 18:32:24.031558 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.031533 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kube-rbac-proxy" Apr 23 18:32:24.031558 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.031540 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kube-rbac-proxy" Apr 23 18:32:24.031558 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.031545 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kube-rbac-proxy" Apr 23 18:32:24.031811 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.031603 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kserve-container" Apr 23 18:32:24.031811 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.031611 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="30ec9c27-aca5-42c9-ac1d-c7c522cdf327" containerName="kube-rbac-proxy" Apr 23 18:32:24.031811 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.031619 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kserve-container" Apr 23 18:32:24.031811 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.031626 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5227b4d9-66cc-41bb-959c-f3591b46323d" containerName="kube-rbac-proxy" Apr 23 18:32:24.034506 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.034489 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:24.036854 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.036832 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-60f5a-kube-rbac-proxy-sar-config\"" Apr 23 18:32:24.037005 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.036835 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-60f5a-predictor-serving-cert\"" Apr 23 18:32:24.062985 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.062944 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr"] Apr 23 18:32:24.121004 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.120974 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7"] Apr 23 18:32:24.124427 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.124405 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:32:24.125104 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.124684 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhqs4\" (UniqueName: \"kubernetes.io/projected/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-kube-api-access-jhqs4\") pod \"success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr\" (UID: \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\") " pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:24.125104 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.124860 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-60f5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-success-200-isvc-60f5a-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr\" (UID: \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\") " pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:24.125104 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.124932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-proxy-tls\") pod \"success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr\" (UID: \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\") " pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:24.126958 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.126936 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-60f5a-predictor-serving-cert\"" Apr 23 18:32:24.127072 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.126998 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-60f5a-kube-rbac-proxy-sar-config\"" Apr 23 18:32:24.136416 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.136395 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7"] Apr 23 18:32:24.225673 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.225644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhqs4\" (UniqueName: \"kubernetes.io/projected/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-kube-api-access-jhqs4\") pod \"success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr\" (UID: \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\") " pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:24.225911 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.225689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bff60ad1-536a-4cf2-832c-a469d7aba58b-proxy-tls\") pod \"error-404-isvc-60f5a-predictor-7646d945df-9jfr7\" (UID: \"bff60ad1-536a-4cf2-832c-a469d7aba58b\") " pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:32:24.225911 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.225730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-60f5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-success-200-isvc-60f5a-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr\" (UID: \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\") " pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:24.225911 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.225749 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"error-404-isvc-60f5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bff60ad1-536a-4cf2-832c-a469d7aba58b-error-404-isvc-60f5a-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-60f5a-predictor-7646d945df-9jfr7\" (UID: \"bff60ad1-536a-4cf2-832c-a469d7aba58b\") " pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:32:24.225911 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.225776 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-proxy-tls\") pod \"success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr\" (UID: \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\") " pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:24.225911 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.225801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwhm2\" (UniqueName: \"kubernetes.io/projected/bff60ad1-536a-4cf2-832c-a469d7aba58b-kube-api-access-fwhm2\") pod \"error-404-isvc-60f5a-predictor-7646d945df-9jfr7\" (UID: \"bff60ad1-536a-4cf2-832c-a469d7aba58b\") " pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:32:24.226207 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:32:24.225925 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-serving-cert: secret "success-200-isvc-60f5a-predictor-serving-cert" not found Apr 23 18:32:24.226207 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:32:24.226026 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-proxy-tls podName:a4ae74b8-bf02-4534-b6aa-1a00bcd795a4 nodeName:}" failed. No retries permitted until 2026-04-23 18:32:24.725995997 +0000 UTC m=+2332.271971341 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-proxy-tls") pod "success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" (UID: "a4ae74b8-bf02-4534-b6aa-1a00bcd795a4") : secret "success-200-isvc-60f5a-predictor-serving-cert" not found Apr 23 18:32:24.226489 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.226466 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-60f5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-success-200-isvc-60f5a-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr\" (UID: \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\") " pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:24.235619 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.235596 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhqs4\" (UniqueName: \"kubernetes.io/projected/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-kube-api-access-jhqs4\") pod \"success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr\" (UID: \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\") " pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:24.326996 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.326948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-60f5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bff60ad1-536a-4cf2-832c-a469d7aba58b-error-404-isvc-60f5a-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-60f5a-predictor-7646d945df-9jfr7\" (UID: \"bff60ad1-536a-4cf2-832c-a469d7aba58b\") " pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:32:24.327156 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.327037 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fwhm2\" (UniqueName: \"kubernetes.io/projected/bff60ad1-536a-4cf2-832c-a469d7aba58b-kube-api-access-fwhm2\") pod \"error-404-isvc-60f5a-predictor-7646d945df-9jfr7\" (UID: \"bff60ad1-536a-4cf2-832c-a469d7aba58b\") " pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:32:24.327156 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.327121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bff60ad1-536a-4cf2-832c-a469d7aba58b-proxy-tls\") pod \"error-404-isvc-60f5a-predictor-7646d945df-9jfr7\" (UID: \"bff60ad1-536a-4cf2-832c-a469d7aba58b\") " pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:32:24.327700 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.327673 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-60f5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bff60ad1-536a-4cf2-832c-a469d7aba58b-error-404-isvc-60f5a-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-60f5a-predictor-7646d945df-9jfr7\" (UID: \"bff60ad1-536a-4cf2-832c-a469d7aba58b\") " pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:32:24.329651 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.329632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bff60ad1-536a-4cf2-832c-a469d7aba58b-proxy-tls\") pod \"error-404-isvc-60f5a-predictor-7646d945df-9jfr7\" (UID: \"bff60ad1-536a-4cf2-832c-a469d7aba58b\") " pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:32:24.336657 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.336631 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwhm2\" (UniqueName: \"kubernetes.io/projected/bff60ad1-536a-4cf2-832c-a469d7aba58b-kube-api-access-fwhm2\") pod 
\"error-404-isvc-60f5a-predictor-7646d945df-9jfr7\" (UID: \"bff60ad1-536a-4cf2-832c-a469d7aba58b\") " pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:32:24.436029 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.435991 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:32:24.562061 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.562039 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7"] Apr 23 18:32:24.565274 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:32:24.565244 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbff60ad1_536a_4cf2_832c_a469d7aba58b.slice/crio-d013013d3fb5c01eeee263d4a6e64394377b81a13970920a44713850ad6f2105 WatchSource:0}: Error finding container d013013d3fb5c01eeee263d4a6e64394377b81a13970920a44713850ad6f2105: Status 404 returned error can't find the container with id d013013d3fb5c01eeee263d4a6e64394377b81a13970920a44713850ad6f2105 Apr 23 18:32:24.567341 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.567321 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:32:24.730852 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.730818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-proxy-tls\") pod \"success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr\" (UID: \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\") " pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:24.733318 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.733300 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-proxy-tls\") pod \"success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr\" (UID: \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\") " pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:24.800544 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.800450 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" event={"ID":"bff60ad1-536a-4cf2-832c-a469d7aba58b","Type":"ContainerStarted","Data":"d19b4d90eaeaa5139b4f70c124afa006d47443f44238e4f12817d984f8f6dbc4"} Apr 23 18:32:24.800544 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.800490 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" event={"ID":"bff60ad1-536a-4cf2-832c-a469d7aba58b","Type":"ContainerStarted","Data":"6a41c3aadd3327d033c68e9fb8162172d2b82ac451a86e0a1fb408d4714f1de7"} Apr 23 18:32:24.800544 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.800503 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" event={"ID":"bff60ad1-536a-4cf2-832c-a469d7aba58b","Type":"ContainerStarted","Data":"d013013d3fb5c01eeee263d4a6e64394377b81a13970920a44713850ad6f2105"} Apr 23 18:32:24.800800 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.800585 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:32:24.802003 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.801982 2575 generic.go:358] "Generic (PLEG): container finished" podID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerID="3c45b0fb8ed84344c0de97097d93605afdfcfa346d02e037acfc632e97567ee4" exitCode=2 Apr 23 18:32:24.802115 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.802044 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" event={"ID":"f49c56ec-2bbb-4583-ae1f-9c83bd399228","Type":"ContainerDied","Data":"3c45b0fb8ed84344c0de97097d93605afdfcfa346d02e037acfc632e97567ee4"} Apr 23 18:32:24.803393 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.803373 2575 generic.go:358] "Generic (PLEG): container finished" podID="80140f60-c462-4ad3-9313-f9bee99889bf" containerID="3983b329912c85215a5430de04304b10f1add9b79648dba9ff7997eb57de6d03" exitCode=2 Apr 23 18:32:24.803474 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.803439 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" event={"ID":"80140f60-c462-4ad3-9313-f9bee99889bf","Type":"ContainerDied","Data":"3983b329912c85215a5430de04304b10f1add9b79648dba9ff7997eb57de6d03"} Apr 23 18:32:24.820597 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.820555 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" podStartSLOduration=0.820543938 podStartE2EDuration="820.543938ms" podCreationTimestamp="2026-04-23 18:32:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:32:24.819343477 +0000 UTC m=+2332.365318844" watchObservedRunningTime="2026-04-23 18:32:24.820543938 +0000 UTC m=+2332.366519298" Apr 23 18:32:24.944924 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:24.944867 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:25.078250 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:25.078227 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr"] Apr 23 18:32:25.080220 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:32:25.080191 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4ae74b8_bf02_4534_b6aa_1a00bcd795a4.slice/crio-70d7a47a96efed957856ac80ef1977c0ec2792f3e1140286c9e6856ae5f082bc WatchSource:0}: Error finding container 70d7a47a96efed957856ac80ef1977c0ec2792f3e1140286c9e6856ae5f082bc: Status 404 returned error can't find the container with id 70d7a47a96efed957856ac80ef1977c0ec2792f3e1140286c9e6856ae5f082bc Apr 23 18:32:25.812194 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:25.812158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" event={"ID":"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4","Type":"ContainerStarted","Data":"348ee08ed3121fe3b14ca9528886b0ca65aae544fc226ed243209858b30813ba"} Apr 23 18:32:25.812374 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:25.812202 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" event={"ID":"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4","Type":"ContainerStarted","Data":"ef7cb57e23b191af27462124b4830f73a93b7de23138953113eb8dc2cb3e324a"} Apr 23 18:32:25.812374 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:25.812213 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" event={"ID":"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4","Type":"ContainerStarted","Data":"70d7a47a96efed957856ac80ef1977c0ec2792f3e1140286c9e6856ae5f082bc"} Apr 23 18:32:25.812557 
ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:25.812534 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:25.812654 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:25.812566 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:25.812654 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:25.812579 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:32:25.813703 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:25.813669 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 18:32:25.813813 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:25.813685 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 18:32:25.831217 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:25.830915 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" podStartSLOduration=1.8308721019999998 podStartE2EDuration="1.830872102s" podCreationTimestamp="2026-04-23 18:32:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:32:25.830388932 +0000 UTC m=+2333.376364299" 
watchObservedRunningTime="2026-04-23 18:32:25.830872102 +0000 UTC m=+2333.376847469" Apr 23 18:32:25.874009 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:25.873965 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.39:8643/healthz\": dial tcp 10.133.0.39:8643: connect: connection refused" Apr 23 18:32:25.879240 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:25.879213 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 18:32:26.816253 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:26.816216 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 18:32:26.816759 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:26.816292 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 18:32:26.876802 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:26.876764 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.38:8643/healthz\": 
dial tcp 10.133.0.38:8643: connect: connection refused" Apr 23 18:32:26.881176 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:26.881149 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:32:27.821564 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:27.821528 2575 generic.go:358] "Generic (PLEG): container finished" podID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerID="dc2e0823da203a7dc520e959949af619611aa7ffd3d69774115d93e5a374a96f" exitCode=0 Apr 23 18:32:27.821999 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:27.821623 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" event={"ID":"f49c56ec-2bbb-4583-ae1f-9c83bd399228","Type":"ContainerDied","Data":"dc2e0823da203a7dc520e959949af619611aa7ffd3d69774115d93e5a374a96f"} Apr 23 18:32:27.823508 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:27.823485 2575 generic.go:358] "Generic (PLEG): container finished" podID="80140f60-c462-4ad3-9313-f9bee99889bf" containerID="38b3646e1c5ca81304c6e8aaf6ddbbd779cd7f4331cfa1e6b93173304aefb42e" exitCode=0 Apr 23 18:32:27.823624 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:27.823559 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" event={"ID":"80140f60-c462-4ad3-9313-f9bee99889bf","Type":"ContainerDied","Data":"38b3646e1c5ca81304c6e8aaf6ddbbd779cd7f4331cfa1e6b93173304aefb42e"} Apr 23 18:32:27.854816 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:27.854795 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:32:27.949031 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:27.948999 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:32:27.959429 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:27.959399 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-83978-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/80140f60-c462-4ad3-9313-f9bee99889bf-success-200-isvc-83978-kube-rbac-proxy-sar-config\") pod \"80140f60-c462-4ad3-9313-f9bee99889bf\" (UID: \"80140f60-c462-4ad3-9313-f9bee99889bf\") " Apr 23 18:32:27.959593 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:27.959478 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80140f60-c462-4ad3-9313-f9bee99889bf-proxy-tls\") pod \"80140f60-c462-4ad3-9313-f9bee99889bf\" (UID: \"80140f60-c462-4ad3-9313-f9bee99889bf\") " Apr 23 18:32:27.960181 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:27.959872 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80140f60-c462-4ad3-9313-f9bee99889bf-success-200-isvc-83978-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-83978-kube-rbac-proxy-sar-config") pod "80140f60-c462-4ad3-9313-f9bee99889bf" (UID: "80140f60-c462-4ad3-9313-f9bee99889bf"). InnerVolumeSpecName "success-200-isvc-83978-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:32:27.960493 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:27.960314 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz6pg\" (UniqueName: \"kubernetes.io/projected/80140f60-c462-4ad3-9313-f9bee99889bf-kube-api-access-lz6pg\") pod \"80140f60-c462-4ad3-9313-f9bee99889bf\" (UID: \"80140f60-c462-4ad3-9313-f9bee99889bf\") " Apr 23 18:32:27.960813 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:27.960785 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-83978-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/80140f60-c462-4ad3-9313-f9bee99889bf-success-200-isvc-83978-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:32:27.962463 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:27.962436 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80140f60-c462-4ad3-9313-f9bee99889bf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "80140f60-c462-4ad3-9313-f9bee99889bf" (UID: "80140f60-c462-4ad3-9313-f9bee99889bf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:32:27.962635 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:27.962620 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80140f60-c462-4ad3-9313-f9bee99889bf-kube-api-access-lz6pg" (OuterVolumeSpecName: "kube-api-access-lz6pg") pod "80140f60-c462-4ad3-9313-f9bee99889bf" (UID: "80140f60-c462-4ad3-9313-f9bee99889bf"). InnerVolumeSpecName "kube-api-access-lz6pg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:32:28.061262 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.061215 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f49c56ec-2bbb-4583-ae1f-9c83bd399228-proxy-tls\") pod \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\" (UID: \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\") " Apr 23 18:32:28.061262 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.061269 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-83978-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f49c56ec-2bbb-4583-ae1f-9c83bd399228-error-404-isvc-83978-kube-rbac-proxy-sar-config\") pod \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\" (UID: \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\") " Apr 23 18:32:28.061512 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.061299 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzpzv\" (UniqueName: \"kubernetes.io/projected/f49c56ec-2bbb-4583-ae1f-9c83bd399228-kube-api-access-kzpzv\") pod \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\" (UID: \"f49c56ec-2bbb-4583-ae1f-9c83bd399228\") " Apr 23 18:32:28.061512 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.061487 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80140f60-c462-4ad3-9313-f9bee99889bf-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:32:28.061512 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.061507 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lz6pg\" (UniqueName: \"kubernetes.io/projected/80140f60-c462-4ad3-9313-f9bee99889bf-kube-api-access-lz6pg\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:32:28.061694 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.061670 2575 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49c56ec-2bbb-4583-ae1f-9c83bd399228-error-404-isvc-83978-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-83978-kube-rbac-proxy-sar-config") pod "f49c56ec-2bbb-4583-ae1f-9c83bd399228" (UID: "f49c56ec-2bbb-4583-ae1f-9c83bd399228"). InnerVolumeSpecName "error-404-isvc-83978-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:32:28.063635 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.063604 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f49c56ec-2bbb-4583-ae1f-9c83bd399228-kube-api-access-kzpzv" (OuterVolumeSpecName: "kube-api-access-kzpzv") pod "f49c56ec-2bbb-4583-ae1f-9c83bd399228" (UID: "f49c56ec-2bbb-4583-ae1f-9c83bd399228"). InnerVolumeSpecName "kube-api-access-kzpzv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:32:28.063635 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.063621 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f49c56ec-2bbb-4583-ae1f-9c83bd399228-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f49c56ec-2bbb-4583-ae1f-9c83bd399228" (UID: "f49c56ec-2bbb-4583-ae1f-9c83bd399228"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:32:28.161958 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.161919 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f49c56ec-2bbb-4583-ae1f-9c83bd399228-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:32:28.161958 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.161947 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-83978-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f49c56ec-2bbb-4583-ae1f-9c83bd399228-error-404-isvc-83978-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:32:28.161958 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.161960 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kzpzv\" (UniqueName: \"kubernetes.io/projected/f49c56ec-2bbb-4583-ae1f-9c83bd399228-kube-api-access-kzpzv\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:32:28.829138 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.829099 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" event={"ID":"f49c56ec-2bbb-4583-ae1f-9c83bd399228","Type":"ContainerDied","Data":"e45527932ab761a722f5b0eb941fcfa1d86b406de3394a8671e49ba4f4ba6bd5"} Apr 23 18:32:28.829138 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.829122 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj" Apr 23 18:32:28.829647 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.829155 2575 scope.go:117] "RemoveContainer" containerID="3c45b0fb8ed84344c0de97097d93605afdfcfa346d02e037acfc632e97567ee4" Apr 23 18:32:28.830821 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.830806 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" Apr 23 18:32:28.830943 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.830816 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9" event={"ID":"80140f60-c462-4ad3-9313-f9bee99889bf","Type":"ContainerDied","Data":"3eb339035125544d13c6f17e2eaa3619cb176d724a23660e943ba261f18737d1"} Apr 23 18:32:28.838463 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.838272 2575 scope.go:117] "RemoveContainer" containerID="dc2e0823da203a7dc520e959949af619611aa7ffd3d69774115d93e5a374a96f" Apr 23 18:32:28.845916 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.845893 2575 scope.go:117] "RemoveContainer" containerID="3983b329912c85215a5430de04304b10f1add9b79648dba9ff7997eb57de6d03" Apr 23 18:32:28.851991 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.851971 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj"] Apr 23 18:32:28.853586 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.853569 2575 scope.go:117] "RemoveContainer" containerID="38b3646e1c5ca81304c6e8aaf6ddbbd779cd7f4331cfa1e6b93173304aefb42e" Apr 23 18:32:28.855206 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.855185 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-83978-predictor-775f4c4947-v4grj"] Apr 23 18:32:28.866724 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.866699 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9"] Apr 23 18:32:28.869641 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:28.869619 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-83978-predictor-6788576868-g8bz9"] Apr 23 18:32:29.063557 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:29.063457 2575 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" path="/var/lib/kubelet/pods/80140f60-c462-4ad3-9313-f9bee99889bf/volumes" Apr 23 18:32:29.064018 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:29.063998 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" path="/var/lib/kubelet/pods/f49c56ec-2bbb-4583-ae1f-9c83bd399228/volumes" Apr 23 18:32:31.820994 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:31.820963 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:32:31.821499 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:31.821408 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:32:31.821499 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:31.821483 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 18:32:31.821817 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:31.821795 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 18:32:41.822302 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:41.822261 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.42:8080: connect: connection refused" Apr 23 18:32:41.822711 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:41.822259 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 18:32:51.822106 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:51.822068 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 18:32:51.822482 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:32:51.822070 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 18:33:01.821545 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:01.821504 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 18:33:01.822056 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:01.822001 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 18:33:11.822031 ip-10-0-143-131 kubenswrapper[2575]: 
I0423 18:33:11.821999 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:33:11.822735 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:11.822713 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:33:29.611205 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.611172 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"] Apr 23 18:33:29.611603 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.611555 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kserve-container" containerID="cri-o://d776827b2121e7d90bee610ac8f4c2ce66a598d4b650365e6313b046cdfd6b2a" gracePeriod=30 Apr 23 18:33:29.611677 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.611625 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kube-rbac-proxy" containerID="cri-o://a4e8ad19a03d98f67a33d441d665df11283b294c125d85320094bf9e5cdcb349" gracePeriod=30 Apr 23 18:33:29.665043 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.665015 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz"] Apr 23 18:33:29.665390 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.665378 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kserve-container" Apr 23 18:33:29.665449 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.665392 2575 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kserve-container" Apr 23 18:33:29.665449 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.665402 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kserve-container" Apr 23 18:33:29.665449 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.665408 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kserve-container" Apr 23 18:33:29.665449 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.665414 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kube-rbac-proxy" Apr 23 18:33:29.665449 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.665419 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kube-rbac-proxy" Apr 23 18:33:29.665449 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.665426 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kube-rbac-proxy" Apr 23 18:33:29.665449 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.665430 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kube-rbac-proxy" Apr 23 18:33:29.665658 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.665507 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kserve-container" Apr 23 18:33:29.665658 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.665518 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kserve-container" Apr 23 18:33:29.665658 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.665527 2575 
memory_manager.go:356] "RemoveStaleState removing state" podUID="f49c56ec-2bbb-4583-ae1f-9c83bd399228" containerName="kube-rbac-proxy" Apr 23 18:33:29.665658 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.665534 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="80140f60-c462-4ad3-9313-f9bee99889bf" containerName="kube-rbac-proxy" Apr 23 18:33:29.668551 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.668534 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:33:29.670632 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.670614 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-febb8-predictor-serving-cert\"" Apr 23 18:33:29.670932 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.670914 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-febb8-kube-rbac-proxy-sar-config\"" Apr 23 18:33:29.682306 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.682278 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz"] Apr 23 18:33:29.697427 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.697398 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"] Apr 23 18:33:29.697669 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.697649 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kserve-container" containerID="cri-o://0ac627c61741a6b281f85397fbc2656ea25d6282c559f058258e8fb66fc8dc27" gracePeriod=30 Apr 23 18:33:29.697763 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.697735 2575 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kube-rbac-proxy" containerID="cri-o://bafddf275f156ee8b5096b21ec69d2d05cfb14a194301e9a732fc71ed578b1fc" gracePeriod=30 Apr 23 18:33:29.792876 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.792845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5mgf\" (UniqueName: \"kubernetes.io/projected/181be6bb-ce95-4aa3-b958-c28a19b93cee-kube-api-access-t5mgf\") pod \"success-200-isvc-febb8-predictor-7b967b5c88-c75hz\" (UID: \"181be6bb-ce95-4aa3-b958-c28a19b93cee\") " pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:33:29.793051 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.792902 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-febb8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/181be6bb-ce95-4aa3-b958-c28a19b93cee-success-200-isvc-febb8-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-febb8-predictor-7b967b5c88-c75hz\" (UID: \"181be6bb-ce95-4aa3-b958-c28a19b93cee\") " pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:33:29.793051 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.792930 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/181be6bb-ce95-4aa3-b958-c28a19b93cee-proxy-tls\") pod \"success-200-isvc-febb8-predictor-7b967b5c88-c75hz\" (UID: \"181be6bb-ce95-4aa3-b958-c28a19b93cee\") " pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:33:29.806426 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.806397 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh"] Apr 23 18:33:29.814159 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.814138 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:29.816393 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.816372 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-febb8-predictor-serving-cert\"" Apr 23 18:33:29.816524 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.816430 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-febb8-kube-rbac-proxy-sar-config\"" Apr 23 18:33:29.821305 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.821280 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh"] Apr 23 18:33:29.894213 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.894120 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5mgf\" (UniqueName: \"kubernetes.io/projected/181be6bb-ce95-4aa3-b958-c28a19b93cee-kube-api-access-t5mgf\") pod \"success-200-isvc-febb8-predictor-7b967b5c88-c75hz\" (UID: \"181be6bb-ce95-4aa3-b958-c28a19b93cee\") " pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:33:29.894213 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.894185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-febb8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/181be6bb-ce95-4aa3-b958-c28a19b93cee-success-200-isvc-febb8-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-febb8-predictor-7b967b5c88-c75hz\" (UID: \"181be6bb-ce95-4aa3-b958-c28a19b93cee\") " pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 
18:33:29.894413 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.894220 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/181be6bb-ce95-4aa3-b958-c28a19b93cee-proxy-tls\") pod \"success-200-isvc-febb8-predictor-7b967b5c88-c75hz\" (UID: \"181be6bb-ce95-4aa3-b958-c28a19b93cee\") " pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:33:29.894413 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:33:29.894334 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-febb8-predictor-serving-cert: secret "success-200-isvc-febb8-predictor-serving-cert" not found Apr 23 18:33:29.894413 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.894388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr55h\" (UniqueName: \"kubernetes.io/projected/2e88b719-da61-49fb-b16f-981e4d856938-kube-api-access-xr55h\") pod \"error-404-isvc-febb8-predictor-545595fd9f-5vcsh\" (UID: \"2e88b719-da61-49fb-b16f-981e4d856938\") " pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:29.894413 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:33:29.894403 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/181be6bb-ce95-4aa3-b958-c28a19b93cee-proxy-tls podName:181be6bb-ce95-4aa3-b958-c28a19b93cee nodeName:}" failed. No retries permitted until 2026-04-23 18:33:30.39438317 +0000 UTC m=+2397.940358524 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/181be6bb-ce95-4aa3-b958-c28a19b93cee-proxy-tls") pod "success-200-isvc-febb8-predictor-7b967b5c88-c75hz" (UID: "181be6bb-ce95-4aa3-b958-c28a19b93cee") : secret "success-200-isvc-febb8-predictor-serving-cert" not found Apr 23 18:33:29.894585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.894466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e88b719-da61-49fb-b16f-981e4d856938-proxy-tls\") pod \"error-404-isvc-febb8-predictor-545595fd9f-5vcsh\" (UID: \"2e88b719-da61-49fb-b16f-981e4d856938\") " pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:29.894585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.894496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-febb8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e88b719-da61-49fb-b16f-981e4d856938-error-404-isvc-febb8-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-febb8-predictor-545595fd9f-5vcsh\" (UID: \"2e88b719-da61-49fb-b16f-981e4d856938\") " pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:29.894836 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.894817 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-febb8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/181be6bb-ce95-4aa3-b958-c28a19b93cee-success-200-isvc-febb8-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-febb8-predictor-7b967b5c88-c75hz\" (UID: \"181be6bb-ce95-4aa3-b958-c28a19b93cee\") " pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:33:29.908340 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.908312 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t5mgf\" (UniqueName: \"kubernetes.io/projected/181be6bb-ce95-4aa3-b958-c28a19b93cee-kube-api-access-t5mgf\") pod \"success-200-isvc-febb8-predictor-7b967b5c88-c75hz\" (UID: \"181be6bb-ce95-4aa3-b958-c28a19b93cee\") " pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:33:29.995398 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.995351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xr55h\" (UniqueName: \"kubernetes.io/projected/2e88b719-da61-49fb-b16f-981e4d856938-kube-api-access-xr55h\") pod \"error-404-isvc-febb8-predictor-545595fd9f-5vcsh\" (UID: \"2e88b719-da61-49fb-b16f-981e4d856938\") " pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:29.995398 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.995406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e88b719-da61-49fb-b16f-981e4d856938-proxy-tls\") pod \"error-404-isvc-febb8-predictor-545595fd9f-5vcsh\" (UID: \"2e88b719-da61-49fb-b16f-981e4d856938\") " pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:29.995639 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.995426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-febb8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e88b719-da61-49fb-b16f-981e4d856938-error-404-isvc-febb8-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-febb8-predictor-545595fd9f-5vcsh\" (UID: \"2e88b719-da61-49fb-b16f-981e4d856938\") " pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:29.995639 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:33:29.995553 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-febb8-predictor-serving-cert: secret "error-404-isvc-febb8-predictor-serving-cert" not found Apr 23 
18:33:29.995639 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:33:29.995622 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e88b719-da61-49fb-b16f-981e4d856938-proxy-tls podName:2e88b719-da61-49fb-b16f-981e4d856938 nodeName:}" failed. No retries permitted until 2026-04-23 18:33:30.49560253 +0000 UTC m=+2398.041577875 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2e88b719-da61-49fb-b16f-981e4d856938-proxy-tls") pod "error-404-isvc-febb8-predictor-545595fd9f-5vcsh" (UID: "2e88b719-da61-49fb-b16f-981e4d856938") : secret "error-404-isvc-febb8-predictor-serving-cert" not found Apr 23 18:33:29.996048 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:29.996031 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-febb8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e88b719-da61-49fb-b16f-981e4d856938-error-404-isvc-febb8-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-febb8-predictor-545595fd9f-5vcsh\" (UID: \"2e88b719-da61-49fb-b16f-981e4d856938\") " pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:30.004010 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:30.003983 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr55h\" (UniqueName: \"kubernetes.io/projected/2e88b719-da61-49fb-b16f-981e4d856938-kube-api-access-xr55h\") pod \"error-404-isvc-febb8-predictor-545595fd9f-5vcsh\" (UID: \"2e88b719-da61-49fb-b16f-981e4d856938\") " pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:30.038909 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:30.038848 2575 generic.go:358] "Generic (PLEG): container finished" podID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerID="bafddf275f156ee8b5096b21ec69d2d05cfb14a194301e9a732fc71ed578b1fc" exitCode=2 Apr 23 18:33:30.039083 ip-10-0-143-131 kubenswrapper[2575]: 
I0423 18:33:30.038920 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" event={"ID":"c39b169b-ef78-41d1-a7ac-de0e3599683c","Type":"ContainerDied","Data":"bafddf275f156ee8b5096b21ec69d2d05cfb14a194301e9a732fc71ed578b1fc"} Apr 23 18:33:30.040966 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:30.040935 2575 generic.go:358] "Generic (PLEG): container finished" podID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerID="a4e8ad19a03d98f67a33d441d665df11283b294c125d85320094bf9e5cdcb349" exitCode=2 Apr 23 18:33:30.041077 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:30.040982 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" event={"ID":"a04b6319-9573-4718-96c5-c45ad07a9a8f","Type":"ContainerDied","Data":"a4e8ad19a03d98f67a33d441d665df11283b294c125d85320094bf9e5cdcb349"} Apr 23 18:33:30.399203 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:30.399159 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/181be6bb-ce95-4aa3-b958-c28a19b93cee-proxy-tls\") pod \"success-200-isvc-febb8-predictor-7b967b5c88-c75hz\" (UID: \"181be6bb-ce95-4aa3-b958-c28a19b93cee\") " pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:33:30.401851 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:30.401827 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/181be6bb-ce95-4aa3-b958-c28a19b93cee-proxy-tls\") pod \"success-200-isvc-febb8-predictor-7b967b5c88-c75hz\" (UID: \"181be6bb-ce95-4aa3-b958-c28a19b93cee\") " pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:33:30.500483 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:30.500424 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/2e88b719-da61-49fb-b16f-981e4d856938-proxy-tls\") pod \"error-404-isvc-febb8-predictor-545595fd9f-5vcsh\" (UID: \"2e88b719-da61-49fb-b16f-981e4d856938\") " pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:30.503007 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:30.502985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e88b719-da61-49fb-b16f-981e4d856938-proxy-tls\") pod \"error-404-isvc-febb8-predictor-545595fd9f-5vcsh\" (UID: \"2e88b719-da61-49fb-b16f-981e4d856938\") " pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:30.578876 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:30.578839 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:33:30.713019 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:30.712693 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz"] Apr 23 18:33:30.715678 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:33:30.715650 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod181be6bb_ce95_4aa3_b958_c28a19b93cee.slice/crio-fcaf78ee9189e4747a92a3ee3695ad3febff11bcc71e2752896e181097e705f5 WatchSource:0}: Error finding container fcaf78ee9189e4747a92a3ee3695ad3febff11bcc71e2752896e181097e705f5: Status 404 returned error can't find the container with id fcaf78ee9189e4747a92a3ee3695ad3febff11bcc71e2752896e181097e705f5 Apr 23 18:33:30.725518 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:30.725499 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:30.856433 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:30.856407 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh"] Apr 23 18:33:30.869039 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:33:30.869009 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e88b719_da61_49fb_b16f_981e4d856938.slice/crio-216eaf70d92d57538fba0d4d28fbc20130b9242784b37fd14bef809def737d19 WatchSource:0}: Error finding container 216eaf70d92d57538fba0d4d28fbc20130b9242784b37fd14bef809def737d19: Status 404 returned error can't find the container with id 216eaf70d92d57538fba0d4d28fbc20130b9242784b37fd14bef809def737d19 Apr 23 18:33:31.046125 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:31.046077 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" event={"ID":"2e88b719-da61-49fb-b16f-981e4d856938","Type":"ContainerStarted","Data":"d86d81573592d783a551927b0851f07c2fb787d355e65bf9306dc5225f9f55e0"} Apr 23 18:33:31.046125 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:31.046121 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" event={"ID":"2e88b719-da61-49fb-b16f-981e4d856938","Type":"ContainerStarted","Data":"875c1c45a68801a593a2d04e1802dab27ad7353c6a88b73ea7a61fad1f40c48e"} Apr 23 18:33:31.046362 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:31.046136 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" event={"ID":"2e88b719-da61-49fb-b16f-981e4d856938","Type":"ContainerStarted","Data":"216eaf70d92d57538fba0d4d28fbc20130b9242784b37fd14bef809def737d19"} Apr 23 18:33:31.046362 ip-10-0-143-131 
kubenswrapper[2575]: I0423 18:33:31.046249 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:31.050847 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:31.050820 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" event={"ID":"181be6bb-ce95-4aa3-b958-c28a19b93cee","Type":"ContainerStarted","Data":"bcffde200b93ba86129d45e5777eb6ffd7130b9c29cac0ff3a61b04df6fdfe1b"} Apr 23 18:33:31.051023 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:31.050850 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" event={"ID":"181be6bb-ce95-4aa3-b958-c28a19b93cee","Type":"ContainerStarted","Data":"374542e9218e81603b5ccd91e3bf61bebbdf0f4e05a4cc24e5c25469658ec280"} Apr 23 18:33:31.051023 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:31.050861 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" event={"ID":"181be6bb-ce95-4aa3-b958-c28a19b93cee","Type":"ContainerStarted","Data":"fcaf78ee9189e4747a92a3ee3695ad3febff11bcc71e2752896e181097e705f5"} Apr 23 18:33:31.051023 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:31.050941 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:33:31.084342 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:31.084293 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" podStartSLOduration=2.084276111 podStartE2EDuration="2.084276111s" podCreationTimestamp="2026-04-23 18:33:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-23 18:33:31.082940608 +0000 UTC m=+2398.628915976" watchObservedRunningTime="2026-04-23 18:33:31.084276111 +0000 UTC m=+2398.630251478" Apr 23 18:33:31.085092 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:31.085062 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" podStartSLOduration=2.085053738 podStartE2EDuration="2.085053738s" podCreationTimestamp="2026-04-23 18:33:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:33:31.065517331 +0000 UTC m=+2398.611492706" watchObservedRunningTime="2026-04-23 18:33:31.085053738 +0000 UTC m=+2398.631029105" Apr 23 18:33:32.056355 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:32.056322 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:32.056799 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:32.056365 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:33:32.057542 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:32.057511 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" podUID="2e88b719-da61-49fb-b16f-981e4d856938" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 18:33:32.057679 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:32.057515 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection 
refused" Apr 23 18:33:32.082695 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:32.082650 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.40:8643/healthz\": dial tcp 10.133.0.40:8643: connect: connection refused" Apr 23 18:33:32.087544 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:32.087519 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 18:33:33.075338 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.075303 2575 generic.go:358] "Generic (PLEG): container finished" podID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerID="d776827b2121e7d90bee610ac8f4c2ce66a598d4b650365e6313b046cdfd6b2a" exitCode=0 Apr 23 18:33:33.075652 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.075370 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" event={"ID":"a04b6319-9573-4718-96c5-c45ad07a9a8f","Type":"ContainerDied","Data":"d776827b2121e7d90bee610ac8f4c2ce66a598d4b650365e6313b046cdfd6b2a"} Apr 23 18:33:33.077540 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.077509 2575 generic.go:358] "Generic (PLEG): container finished" podID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerID="0ac627c61741a6b281f85397fbc2656ea25d6282c559f058258e8fb66fc8dc27" exitCode=0 Apr 23 18:33:33.077683 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.077666 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" 
event={"ID":"c39b169b-ef78-41d1-a7ac-de0e3599683c","Type":"ContainerDied","Data":"0ac627c61741a6b281f85397fbc2656ea25d6282c559f058258e8fb66fc8dc27"} Apr 23 18:33:33.078376 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.078350 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 18:33:33.078599 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.078483 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" podUID="2e88b719-da61-49fb-b16f-981e4d856938" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 18:33:33.089152 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.089123 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.41:8643/healthz\": dial tcp 10.133.0.41:8643: connect: connection refused" Apr 23 18:33:33.094315 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.094273 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:33:33.135222 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.135193 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 18:33:33.143555 
ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.143531 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 18:33:33.161943 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.161921 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" Apr 23 18:33:33.200791 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.200766 2575 scope.go:117] "RemoveContainer" containerID="0ac627c61741a6b281f85397fbc2656ea25d6282c559f058258e8fb66fc8dc27" Apr 23 18:33:33.209347 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.209330 2575 scope.go:117] "RemoveContainer" containerID="bafddf275f156ee8b5096b21ec69d2d05cfb14a194301e9a732fc71ed578b1fc" Apr 23 18:33:33.210234 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.210132 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" Apr 23 18:33:33.225458 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.225439 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c39b169b-ef78-41d1-a7ac-de0e3599683c-proxy-tls\") pod \"c39b169b-ef78-41d1-a7ac-de0e3599683c\" (UID: \"c39b169b-ef78-41d1-a7ac-de0e3599683c\") " Apr 23 18:33:33.225578 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.225496 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sls78\" (UniqueName: \"kubernetes.io/projected/c39b169b-ef78-41d1-a7ac-de0e3599683c-kube-api-access-sls78\") pod \"c39b169b-ef78-41d1-a7ac-de0e3599683c\" (UID: \"c39b169b-ef78-41d1-a7ac-de0e3599683c\") " Apr 23 18:33:33.225578 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.225547 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-02052-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c39b169b-ef78-41d1-a7ac-de0e3599683c-success-200-isvc-02052-kube-rbac-proxy-sar-config\") pod \"c39b169b-ef78-41d1-a7ac-de0e3599683c\" (UID: \"c39b169b-ef78-41d1-a7ac-de0e3599683c\") " Apr 23 18:33:33.226020 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.225936 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c39b169b-ef78-41d1-a7ac-de0e3599683c-success-200-isvc-02052-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-02052-kube-rbac-proxy-sar-config") pod "c39b169b-ef78-41d1-a7ac-de0e3599683c" (UID: "c39b169b-ef78-41d1-a7ac-de0e3599683c"). InnerVolumeSpecName "success-200-isvc-02052-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:33:33.228252 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.228214 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c39b169b-ef78-41d1-a7ac-de0e3599683c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c39b169b-ef78-41d1-a7ac-de0e3599683c" (UID: "c39b169b-ef78-41d1-a7ac-de0e3599683c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:33:33.228365 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.228274 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c39b169b-ef78-41d1-a7ac-de0e3599683c-kube-api-access-sls78" (OuterVolumeSpecName: "kube-api-access-sls78") pod "c39b169b-ef78-41d1-a7ac-de0e3599683c" (UID: "c39b169b-ef78-41d1-a7ac-de0e3599683c"). InnerVolumeSpecName "kube-api-access-sls78". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:33:33.326467 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.326429 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-02052-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a04b6319-9573-4718-96c5-c45ad07a9a8f-error-404-isvc-02052-kube-rbac-proxy-sar-config\") pod \"a04b6319-9573-4718-96c5-c45ad07a9a8f\" (UID: \"a04b6319-9573-4718-96c5-c45ad07a9a8f\") " Apr 23 18:33:33.326640 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.326513 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp5x2\" (UniqueName: \"kubernetes.io/projected/a04b6319-9573-4718-96c5-c45ad07a9a8f-kube-api-access-qp5x2\") pod \"a04b6319-9573-4718-96c5-c45ad07a9a8f\" (UID: \"a04b6319-9573-4718-96c5-c45ad07a9a8f\") " Apr 23 18:33:33.326640 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.326549 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a04b6319-9573-4718-96c5-c45ad07a9a8f-proxy-tls\") pod \"a04b6319-9573-4718-96c5-c45ad07a9a8f\" (UID: \"a04b6319-9573-4718-96c5-c45ad07a9a8f\") " Apr 23 18:33:33.326758 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.326702 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-02052-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c39b169b-ef78-41d1-a7ac-de0e3599683c-success-200-isvc-02052-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:33:33.326758 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.326713 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c39b169b-ef78-41d1-a7ac-de0e3599683c-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:33:33.326758 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.326723 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sls78\" (UniqueName: \"kubernetes.io/projected/c39b169b-ef78-41d1-a7ac-de0e3599683c-kube-api-access-sls78\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:33:33.326971 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.326751 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a04b6319-9573-4718-96c5-c45ad07a9a8f-error-404-isvc-02052-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-02052-kube-rbac-proxy-sar-config") pod "a04b6319-9573-4718-96c5-c45ad07a9a8f" (UID: "a04b6319-9573-4718-96c5-c45ad07a9a8f"). InnerVolumeSpecName "error-404-isvc-02052-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:33:33.328661 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.328641 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a04b6319-9573-4718-96c5-c45ad07a9a8f-kube-api-access-qp5x2" (OuterVolumeSpecName: "kube-api-access-qp5x2") pod "a04b6319-9573-4718-96c5-c45ad07a9a8f" (UID: "a04b6319-9573-4718-96c5-c45ad07a9a8f"). InnerVolumeSpecName "kube-api-access-qp5x2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:33:33.328755 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.328711 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a04b6319-9573-4718-96c5-c45ad07a9a8f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a04b6319-9573-4718-96c5-c45ad07a9a8f" (UID: "a04b6319-9573-4718-96c5-c45ad07a9a8f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:33:33.427626 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.427590 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qp5x2\" (UniqueName: \"kubernetes.io/projected/a04b6319-9573-4718-96c5-c45ad07a9a8f-kube-api-access-qp5x2\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:33:33.427626 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.427618 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a04b6319-9573-4718-96c5-c45ad07a9a8f-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:33:33.427626 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:33.427629 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-02052-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a04b6319-9573-4718-96c5-c45ad07a9a8f-error-404-isvc-02052-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 
18:33:34.080571 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:34.080484 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" event={"ID":"c39b169b-ef78-41d1-a7ac-de0e3599683c","Type":"ContainerDied","Data":"4e41d36f6ef42a21b90533b2f932550a3735093707038662d24fc7bff207f639"} Apr 23 18:33:34.082032 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:34.082005 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79" Apr 23 18:33:34.082032 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:34.082018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" event={"ID":"a04b6319-9573-4718-96c5-c45ad07a9a8f","Type":"ContainerDied","Data":"3277b63d3d61bcef1a11d9d14c39daeecce987b61458bdde045e27b7bd2a73a5"} Apr 23 18:33:34.082217 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:34.082050 2575 scope.go:117] "RemoveContainer" containerID="a4e8ad19a03d98f67a33d441d665df11283b294c125d85320094bf9e5cdcb349" Apr 23 18:33:34.082217 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:34.082006 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd" Apr 23 18:33:34.090849 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:34.090742 2575 scope.go:117] "RemoveContainer" containerID="d776827b2121e7d90bee610ac8f4c2ce66a598d4b650365e6313b046cdfd6b2a" Apr 23 18:33:34.124547 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:34.124521 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"] Apr 23 18:33:34.130224 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:34.130198 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-02052-predictor-7f966b6f9b-7bm79"] Apr 23 18:33:34.141037 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:34.141013 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"] Apr 23 18:33:34.146031 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:34.146010 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-02052-predictor-7ff4bf9679-cdttd"] Apr 23 18:33:35.061555 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:35.061512 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" path="/var/lib/kubelet/pods/a04b6319-9573-4718-96c5-c45ad07a9a8f/volumes" Apr 23 18:33:35.062006 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:35.061981 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" path="/var/lib/kubelet/pods/c39b169b-ef78-41d1-a7ac-de0e3599683c/volumes" Apr 23 18:33:38.082610 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.082580 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:33:38.083081 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.082967 2575 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:33:38.083081 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.082977 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 18:33:38.083468 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.083414 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" podUID="2e88b719-da61-49fb-b16f-981e4d856938" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 18:33:38.255101 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.255069 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr"] Apr 23 18:33:38.255377 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.255336 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kserve-container" containerID="cri-o://ef7cb57e23b191af27462124b4830f73a93b7de23138953113eb8dc2cb3e324a" gracePeriod=30 Apr 23 18:33:38.255478 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.255406 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kube-rbac-proxy" containerID="cri-o://348ee08ed3121fe3b14ca9528886b0ca65aae544fc226ed243209858b30813ba" gracePeriod=30 Apr 23 18:33:38.277260 ip-10-0-143-131 kubenswrapper[2575]: 
I0423 18:33:38.277226 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx"] Apr 23 18:33:38.277627 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.277614 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kserve-container" Apr 23 18:33:38.277671 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.277630 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kserve-container" Apr 23 18:33:38.277671 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.277640 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kube-rbac-proxy" Apr 23 18:33:38.277671 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.277645 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kube-rbac-proxy" Apr 23 18:33:38.277671 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.277658 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kserve-container" Apr 23 18:33:38.277671 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.277663 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kserve-container" Apr 23 18:33:38.277671 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.277670 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kube-rbac-proxy" Apr 23 18:33:38.277849 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.277675 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kube-rbac-proxy" Apr 23 18:33:38.277849 ip-10-0-143-131 
kubenswrapper[2575]: I0423 18:33:38.277740 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kserve-container" Apr 23 18:33:38.277849 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.277748 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c39b169b-ef78-41d1-a7ac-de0e3599683c" containerName="kube-rbac-proxy" Apr 23 18:33:38.277849 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.277754 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kserve-container" Apr 23 18:33:38.277849 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.277760 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a04b6319-9573-4718-96c5-c45ad07a9a8f" containerName="kube-rbac-proxy" Apr 23 18:33:38.283102 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.283083 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:33:38.284990 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.284962 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-760ed-kube-rbac-proxy-sar-config\"" Apr 23 18:33:38.285085 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.285004 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-760ed-predictor-serving-cert\"" Apr 23 18:33:38.290467 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.290445 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx"] Apr 23 18:33:38.356235 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.356161 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7"] Apr 23 
18:33:38.356469 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.356440 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kserve-container" containerID="cri-o://6a41c3aadd3327d033c68e9fb8162172d2b82ac451a86e0a1fb408d4714f1de7" gracePeriod=30 Apr 23 18:33:38.356606 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.356486 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kube-rbac-proxy" containerID="cri-o://d19b4d90eaeaa5139b4f70c124afa006d47443f44238e4f12817d984f8f6dbc4" gracePeriod=30 Apr 23 18:33:38.374098 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.374063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97047322-b35c-453c-858d-7f2bbb7c8806-proxy-tls\") pod \"success-200-isvc-760ed-predictor-5856fb9f74-2rclx\" (UID: \"97047322-b35c-453c-858d-7f2bbb7c8806\") " pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:33:38.374208 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.374160 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5jcz\" (UniqueName: \"kubernetes.io/projected/97047322-b35c-453c-858d-7f2bbb7c8806-kube-api-access-l5jcz\") pod \"success-200-isvc-760ed-predictor-5856fb9f74-2rclx\" (UID: \"97047322-b35c-453c-858d-7f2bbb7c8806\") " pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:33:38.374257 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.374232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"success-200-isvc-760ed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97047322-b35c-453c-858d-7f2bbb7c8806-success-200-isvc-760ed-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-760ed-predictor-5856fb9f74-2rclx\" (UID: \"97047322-b35c-453c-858d-7f2bbb7c8806\") " pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:33:38.406832 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.406807 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh"] Apr 23 18:33:38.410282 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.410268 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:33:38.412054 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.412035 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-760ed-predictor-serving-cert\"" Apr 23 18:33:38.412156 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.412044 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-760ed-kube-rbac-proxy-sar-config\"" Apr 23 18:33:38.419603 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.419580 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh"] Apr 23 18:33:38.474782 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.474757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97047322-b35c-453c-858d-7f2bbb7c8806-proxy-tls\") pod \"success-200-isvc-760ed-predictor-5856fb9f74-2rclx\" (UID: \"97047322-b35c-453c-858d-7f2bbb7c8806\") " pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:33:38.474907 ip-10-0-143-131 kubenswrapper[2575]: I0423 
18:33:38.474812 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5jcz\" (UniqueName: \"kubernetes.io/projected/97047322-b35c-453c-858d-7f2bbb7c8806-kube-api-access-l5jcz\") pod \"success-200-isvc-760ed-predictor-5856fb9f74-2rclx\" (UID: \"97047322-b35c-453c-858d-7f2bbb7c8806\") " pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:33:38.474907 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.474854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-760ed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97047322-b35c-453c-858d-7f2bbb7c8806-success-200-isvc-760ed-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-760ed-predictor-5856fb9f74-2rclx\" (UID: \"97047322-b35c-453c-858d-7f2bbb7c8806\") " pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:33:38.475493 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.475424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-760ed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97047322-b35c-453c-858d-7f2bbb7c8806-success-200-isvc-760ed-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-760ed-predictor-5856fb9f74-2rclx\" (UID: \"97047322-b35c-453c-858d-7f2bbb7c8806\") " pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:33:38.477211 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.477181 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97047322-b35c-453c-858d-7f2bbb7c8806-proxy-tls\") pod \"success-200-isvc-760ed-predictor-5856fb9f74-2rclx\" (UID: \"97047322-b35c-453c-858d-7f2bbb7c8806\") " pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:33:38.482429 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.482406 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5jcz\" (UniqueName: \"kubernetes.io/projected/97047322-b35c-453c-858d-7f2bbb7c8806-kube-api-access-l5jcz\") pod \"success-200-isvc-760ed-predictor-5856fb9f74-2rclx\" (UID: \"97047322-b35c-453c-858d-7f2bbb7c8806\") " pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:33:38.576070 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.576032 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-760ed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8dc829a-d241-4993-b315-4bb04d2cff26-error-404-isvc-760ed-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh\" (UID: \"c8dc829a-d241-4993-b315-4bb04d2cff26\") " pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:33:38.576070 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.576079 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8dc829a-d241-4993-b315-4bb04d2cff26-proxy-tls\") pod \"error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh\" (UID: \"c8dc829a-d241-4993-b315-4bb04d2cff26\") " pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:33:38.576275 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.576168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpnjg\" (UniqueName: \"kubernetes.io/projected/c8dc829a-d241-4993-b315-4bb04d2cff26-kube-api-access-tpnjg\") pod \"error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh\" (UID: \"c8dc829a-d241-4993-b315-4bb04d2cff26\") " pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:33:38.595978 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.595936 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:33:38.677520 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.677487 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpnjg\" (UniqueName: \"kubernetes.io/projected/c8dc829a-d241-4993-b315-4bb04d2cff26-kube-api-access-tpnjg\") pod \"error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh\" (UID: \"c8dc829a-d241-4993-b315-4bb04d2cff26\") " pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:33:38.677694 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.677573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-760ed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8dc829a-d241-4993-b315-4bb04d2cff26-error-404-isvc-760ed-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh\" (UID: \"c8dc829a-d241-4993-b315-4bb04d2cff26\") " pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:33:38.677694 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.677606 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8dc829a-d241-4993-b315-4bb04d2cff26-proxy-tls\") pod \"error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh\" (UID: \"c8dc829a-d241-4993-b315-4bb04d2cff26\") " pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:33:38.678385 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.678286 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-760ed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8dc829a-d241-4993-b315-4bb04d2cff26-error-404-isvc-760ed-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh\" (UID: \"c8dc829a-d241-4993-b315-4bb04d2cff26\") " 
pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:33:38.680490 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.680465 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8dc829a-d241-4993-b315-4bb04d2cff26-proxy-tls\") pod \"error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh\" (UID: \"c8dc829a-d241-4993-b315-4bb04d2cff26\") " pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:33:38.686729 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.686706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpnjg\" (UniqueName: \"kubernetes.io/projected/c8dc829a-d241-4993-b315-4bb04d2cff26-kube-api-access-tpnjg\") pod \"error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh\" (UID: \"c8dc829a-d241-4993-b315-4bb04d2cff26\") " pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:33:38.721564 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.721535 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:33:38.727166 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.727142 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx"] Apr 23 18:33:38.730610 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:33:38.730586 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97047322_b35c_453c_858d_7f2bbb7c8806.slice/crio-d72fe512150515bf4f0ef30394537f0ed1f5996e6847c346b542e015fc393eba WatchSource:0}: Error finding container d72fe512150515bf4f0ef30394537f0ed1f5996e6847c346b542e015fc393eba: Status 404 returned error can't find the container with id d72fe512150515bf4f0ef30394537f0ed1f5996e6847c346b542e015fc393eba Apr 23 18:33:38.856284 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:38.856094 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh"] Apr 23 18:33:38.860287 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:33:38.860261 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8dc829a_d241_4993_b315_4bb04d2cff26.slice/crio-c5629ef9d4544c07d4316bd640b4b4432c85c4bac0f5eeba9e2dfe3e65cc03ea WatchSource:0}: Error finding container c5629ef9d4544c07d4316bd640b4b4432c85c4bac0f5eeba9e2dfe3e65cc03ea: Status 404 returned error can't find the container with id c5629ef9d4544c07d4316bd640b4b4432c85c4bac0f5eeba9e2dfe3e65cc03ea Apr 23 18:33:39.100372 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:39.100340 2575 generic.go:358] "Generic (PLEG): container finished" podID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerID="d19b4d90eaeaa5139b4f70c124afa006d47443f44238e4f12817d984f8f6dbc4" exitCode=2 Apr 23 18:33:39.100773 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:39.100420 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" event={"ID":"bff60ad1-536a-4cf2-832c-a469d7aba58b","Type":"ContainerDied","Data":"d19b4d90eaeaa5139b4f70c124afa006d47443f44238e4f12817d984f8f6dbc4"} Apr 23 18:33:39.102023 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:39.102001 2575 generic.go:358] "Generic (PLEG): container finished" podID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerID="348ee08ed3121fe3b14ca9528886b0ca65aae544fc226ed243209858b30813ba" exitCode=2 Apr 23 18:33:39.102132 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:39.102059 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" event={"ID":"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4","Type":"ContainerDied","Data":"348ee08ed3121fe3b14ca9528886b0ca65aae544fc226ed243209858b30813ba"} Apr 23 18:33:39.103383 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:39.103364 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" event={"ID":"c8dc829a-d241-4993-b315-4bb04d2cff26","Type":"ContainerStarted","Data":"be56f6fba477cbddc942cad1cf38abc7ed62cb8725b67f6abc669be570f09053"} Apr 23 18:33:39.103498 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:39.103391 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" event={"ID":"c8dc829a-d241-4993-b315-4bb04d2cff26","Type":"ContainerStarted","Data":"402685e1c7c3e4b58da7a2931fe2121765c9cbcdc4e15c01cdbf58074ae86115"} Apr 23 18:33:39.103498 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:39.103406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" event={"ID":"c8dc829a-d241-4993-b315-4bb04d2cff26","Type":"ContainerStarted","Data":"c5629ef9d4544c07d4316bd640b4b4432c85c4bac0f5eeba9e2dfe3e65cc03ea"} Apr 23 
18:33:39.103608 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:39.103496 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:33:39.105129 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:39.105106 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" event={"ID":"97047322-b35c-453c-858d-7f2bbb7c8806","Type":"ContainerStarted","Data":"d7d960941f5b298aca18d8be2823a946ab2233b91f62c97a11f951dd6a356fee"} Apr 23 18:33:39.105206 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:39.105136 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" event={"ID":"97047322-b35c-453c-858d-7f2bbb7c8806","Type":"ContainerStarted","Data":"4715c9d744301b341a49191c058966dc2d4b7c3f3783b16337f4b4f7cbaa3d4e"} Apr 23 18:33:39.105206 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:39.105151 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" event={"ID":"97047322-b35c-453c-858d-7f2bbb7c8806","Type":"ContainerStarted","Data":"d72fe512150515bf4f0ef30394537f0ed1f5996e6847c346b542e015fc393eba"} Apr 23 18:33:39.105287 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:39.105276 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:33:39.121625 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:39.121587 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" podStartSLOduration=1.121576823 podStartE2EDuration="1.121576823s" podCreationTimestamp="2026-04-23 18:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-04-23 18:33:39.119639935 +0000 UTC m=+2406.665615302" watchObservedRunningTime="2026-04-23 18:33:39.121576823 +0000 UTC m=+2406.667552190" Apr 23 18:33:39.136604 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:39.136568 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" podStartSLOduration=1.136556893 podStartE2EDuration="1.136556893s" podCreationTimestamp="2026-04-23 18:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:33:39.1355939 +0000 UTC m=+2406.681569267" watchObservedRunningTime="2026-04-23 18:33:39.136556893 +0000 UTC m=+2406.682532312" Apr 23 18:33:40.108796 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:40.108757 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:33:40.108796 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:40.108797 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:33:40.109859 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:40.109830 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:33:40.110042 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:40.110018 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection 
refused" Apr 23 18:33:41.111736 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:41.111693 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:33:41.112135 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:41.111776 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:33:41.817192 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:41.817151 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 23 18:33:41.817192 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:41.817166 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.42:8643/healthz\": dial tcp 10.133.0.42:8643: connect: connection refused" Apr 23 18:33:41.821740 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:41.821688 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 
18:33:41.821924 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:41.821902 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 18:33:41.900899 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:41.900856 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:33:42.010518 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.010481 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bff60ad1-536a-4cf2-832c-a469d7aba58b-proxy-tls\") pod \"bff60ad1-536a-4cf2-832c-a469d7aba58b\" (UID: \"bff60ad1-536a-4cf2-832c-a469d7aba58b\") " Apr 23 18:33:42.010673 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.010550 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-60f5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bff60ad1-536a-4cf2-832c-a469d7aba58b-error-404-isvc-60f5a-kube-rbac-proxy-sar-config\") pod \"bff60ad1-536a-4cf2-832c-a469d7aba58b\" (UID: \"bff60ad1-536a-4cf2-832c-a469d7aba58b\") " Apr 23 18:33:42.010673 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.010665 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwhm2\" (UniqueName: \"kubernetes.io/projected/bff60ad1-536a-4cf2-832c-a469d7aba58b-kube-api-access-fwhm2\") pod \"bff60ad1-536a-4cf2-832c-a469d7aba58b\" (UID: \"bff60ad1-536a-4cf2-832c-a469d7aba58b\") " Apr 23 18:33:42.011046 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.010989 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/bff60ad1-536a-4cf2-832c-a469d7aba58b-error-404-isvc-60f5a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-60f5a-kube-rbac-proxy-sar-config") pod "bff60ad1-536a-4cf2-832c-a469d7aba58b" (UID: "bff60ad1-536a-4cf2-832c-a469d7aba58b"). InnerVolumeSpecName "error-404-isvc-60f5a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:33:42.012913 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.012867 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bff60ad1-536a-4cf2-832c-a469d7aba58b-kube-api-access-fwhm2" (OuterVolumeSpecName: "kube-api-access-fwhm2") pod "bff60ad1-536a-4cf2-832c-a469d7aba58b" (UID: "bff60ad1-536a-4cf2-832c-a469d7aba58b"). InnerVolumeSpecName "kube-api-access-fwhm2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:33:42.012971 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.012875 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff60ad1-536a-4cf2-832c-a469d7aba58b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bff60ad1-536a-4cf2-832c-a469d7aba58b" (UID: "bff60ad1-536a-4cf2-832c-a469d7aba58b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:33:42.111782 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.111750 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fwhm2\" (UniqueName: \"kubernetes.io/projected/bff60ad1-536a-4cf2-832c-a469d7aba58b-kube-api-access-fwhm2\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:33:42.111782 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.111780 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bff60ad1-536a-4cf2-832c-a469d7aba58b-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:33:42.112200 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.111791 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-60f5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bff60ad1-536a-4cf2-832c-a469d7aba58b-error-404-isvc-60f5a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:33:42.115381 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.115350 2575 generic.go:358] "Generic (PLEG): container finished" podID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerID="6a41c3aadd3327d033c68e9fb8162172d2b82ac451a86e0a1fb408d4714f1de7" exitCode=0 Apr 23 18:33:42.115507 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.115420 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" Apr 23 18:33:42.115507 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.115432 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" event={"ID":"bff60ad1-536a-4cf2-832c-a469d7aba58b","Type":"ContainerDied","Data":"6a41c3aadd3327d033c68e9fb8162172d2b82ac451a86e0a1fb408d4714f1de7"} Apr 23 18:33:42.115507 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.115472 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7" event={"ID":"bff60ad1-536a-4cf2-832c-a469d7aba58b","Type":"ContainerDied","Data":"d013013d3fb5c01eeee263d4a6e64394377b81a13970920a44713850ad6f2105"} Apr 23 18:33:42.115507 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.115492 2575 scope.go:117] "RemoveContainer" containerID="d19b4d90eaeaa5139b4f70c124afa006d47443f44238e4f12817d984f8f6dbc4" Apr 23 18:33:42.135715 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.135694 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7"] Apr 23 18:33:42.140262 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.140226 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60f5a-predictor-7646d945df-9jfr7"] Apr 23 18:33:42.168401 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.168365 2575 scope.go:117] "RemoveContainer" containerID="6a41c3aadd3327d033c68e9fb8162172d2b82ac451a86e0a1fb408d4714f1de7" Apr 23 18:33:42.176097 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.176074 2575 scope.go:117] "RemoveContainer" containerID="d19b4d90eaeaa5139b4f70c124afa006d47443f44238e4f12817d984f8f6dbc4" Apr 23 18:33:42.176346 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:33:42.176318 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"d19b4d90eaeaa5139b4f70c124afa006d47443f44238e4f12817d984f8f6dbc4\": container with ID starting with d19b4d90eaeaa5139b4f70c124afa006d47443f44238e4f12817d984f8f6dbc4 not found: ID does not exist" containerID="d19b4d90eaeaa5139b4f70c124afa006d47443f44238e4f12817d984f8f6dbc4" Apr 23 18:33:42.176401 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.176359 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d19b4d90eaeaa5139b4f70c124afa006d47443f44238e4f12817d984f8f6dbc4"} err="failed to get container status \"d19b4d90eaeaa5139b4f70c124afa006d47443f44238e4f12817d984f8f6dbc4\": rpc error: code = NotFound desc = could not find container \"d19b4d90eaeaa5139b4f70c124afa006d47443f44238e4f12817d984f8f6dbc4\": container with ID starting with d19b4d90eaeaa5139b4f70c124afa006d47443f44238e4f12817d984f8f6dbc4 not found: ID does not exist" Apr 23 18:33:42.176401 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.176385 2575 scope.go:117] "RemoveContainer" containerID="6a41c3aadd3327d033c68e9fb8162172d2b82ac451a86e0a1fb408d4714f1de7" Apr 23 18:33:42.176641 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:33:42.176622 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a41c3aadd3327d033c68e9fb8162172d2b82ac451a86e0a1fb408d4714f1de7\": container with ID starting with 6a41c3aadd3327d033c68e9fb8162172d2b82ac451a86e0a1fb408d4714f1de7 not found: ID does not exist" containerID="6a41c3aadd3327d033c68e9fb8162172d2b82ac451a86e0a1fb408d4714f1de7" Apr 23 18:33:42.176694 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.176650 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a41c3aadd3327d033c68e9fb8162172d2b82ac451a86e0a1fb408d4714f1de7"} err="failed to get container status \"6a41c3aadd3327d033c68e9fb8162172d2b82ac451a86e0a1fb408d4714f1de7\": rpc error: code = NotFound desc 
= could not find container \"6a41c3aadd3327d033c68e9fb8162172d2b82ac451a86e0a1fb408d4714f1de7\": container with ID starting with 6a41c3aadd3327d033c68e9fb8162172d2b82ac451a86e0a1fb408d4714f1de7 not found: ID does not exist" Apr 23 18:33:42.295663 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.295640 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:33:42.413984 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.413952 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhqs4\" (UniqueName: \"kubernetes.io/projected/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-kube-api-access-jhqs4\") pod \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\" (UID: \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\") " Apr 23 18:33:42.414151 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.414006 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-proxy-tls\") pod \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\" (UID: \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\") " Apr 23 18:33:42.414151 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.414042 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-60f5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-success-200-isvc-60f5a-kube-rbac-proxy-sar-config\") pod \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\" (UID: \"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4\") " Apr 23 18:33:42.414443 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.414415 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-success-200-isvc-60f5a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-60f5a-kube-rbac-proxy-sar-config") 
pod "a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" (UID: "a4ae74b8-bf02-4534-b6aa-1a00bcd795a4"). InnerVolumeSpecName "success-200-isvc-60f5a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:33:42.416245 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.416224 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" (UID: "a4ae74b8-bf02-4534-b6aa-1a00bcd795a4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:33:42.416303 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.416283 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-kube-api-access-jhqs4" (OuterVolumeSpecName: "kube-api-access-jhqs4") pod "a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" (UID: "a4ae74b8-bf02-4534-b6aa-1a00bcd795a4"). InnerVolumeSpecName "kube-api-access-jhqs4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:33:42.515530 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.515489 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-60f5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-success-200-isvc-60f5a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:33:42.515530 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.515522 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jhqs4\" (UniqueName: \"kubernetes.io/projected/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-kube-api-access-jhqs4\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:33:42.515530 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:42.515533 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:33:43.063188 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:43.063107 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" path="/var/lib/kubelet/pods/bff60ad1-536a-4cf2-832c-a469d7aba58b/volumes" Apr 23 18:33:43.120737 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:43.120707 2575 generic.go:358] "Generic (PLEG): container finished" podID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerID="ef7cb57e23b191af27462124b4830f73a93b7de23138953113eb8dc2cb3e324a" exitCode=0 Apr 23 18:33:43.121150 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:43.120770 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" event={"ID":"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4","Type":"ContainerDied","Data":"ef7cb57e23b191af27462124b4830f73a93b7de23138953113eb8dc2cb3e324a"} Apr 23 18:33:43.121150 
ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:43.120787 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" Apr 23 18:33:43.121150 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:43.120803 2575 scope.go:117] "RemoveContainer" containerID="348ee08ed3121fe3b14ca9528886b0ca65aae544fc226ed243209858b30813ba" Apr 23 18:33:43.121150 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:43.120792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr" event={"ID":"a4ae74b8-bf02-4534-b6aa-1a00bcd795a4","Type":"ContainerDied","Data":"70d7a47a96efed957856ac80ef1977c0ec2792f3e1140286c9e6856ae5f082bc"} Apr 23 18:33:43.128984 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:43.128965 2575 scope.go:117] "RemoveContainer" containerID="ef7cb57e23b191af27462124b4830f73a93b7de23138953113eb8dc2cb3e324a" Apr 23 18:33:43.136706 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:43.136685 2575 scope.go:117] "RemoveContainer" containerID="348ee08ed3121fe3b14ca9528886b0ca65aae544fc226ed243209858b30813ba" Apr 23 18:33:43.137002 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:33:43.136980 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"348ee08ed3121fe3b14ca9528886b0ca65aae544fc226ed243209858b30813ba\": container with ID starting with 348ee08ed3121fe3b14ca9528886b0ca65aae544fc226ed243209858b30813ba not found: ID does not exist" containerID="348ee08ed3121fe3b14ca9528886b0ca65aae544fc226ed243209858b30813ba" Apr 23 18:33:43.137113 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:43.137008 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"348ee08ed3121fe3b14ca9528886b0ca65aae544fc226ed243209858b30813ba"} err="failed to get container status 
\"348ee08ed3121fe3b14ca9528886b0ca65aae544fc226ed243209858b30813ba\": rpc error: code = NotFound desc = could not find container \"348ee08ed3121fe3b14ca9528886b0ca65aae544fc226ed243209858b30813ba\": container with ID starting with 348ee08ed3121fe3b14ca9528886b0ca65aae544fc226ed243209858b30813ba not found: ID does not exist" Apr 23 18:33:43.137113 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:43.137026 2575 scope.go:117] "RemoveContainer" containerID="ef7cb57e23b191af27462124b4830f73a93b7de23138953113eb8dc2cb3e324a" Apr 23 18:33:43.137322 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:33:43.137297 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef7cb57e23b191af27462124b4830f73a93b7de23138953113eb8dc2cb3e324a\": container with ID starting with ef7cb57e23b191af27462124b4830f73a93b7de23138953113eb8dc2cb3e324a not found: ID does not exist" containerID="ef7cb57e23b191af27462124b4830f73a93b7de23138953113eb8dc2cb3e324a" Apr 23 18:33:43.137430 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:43.137329 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef7cb57e23b191af27462124b4830f73a93b7de23138953113eb8dc2cb3e324a"} err="failed to get container status \"ef7cb57e23b191af27462124b4830f73a93b7de23138953113eb8dc2cb3e324a\": rpc error: code = NotFound desc = could not find container \"ef7cb57e23b191af27462124b4830f73a93b7de23138953113eb8dc2cb3e324a\": container with ID starting with ef7cb57e23b191af27462124b4830f73a93b7de23138953113eb8dc2cb3e324a not found: ID does not exist" Apr 23 18:33:43.138282 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:43.138259 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr"] Apr 23 18:33:43.141478 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:43.141458 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-60f5a-predictor-6cbdd87d54-zcvbr"] Apr 23 18:33:45.064040 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:45.063994 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" path="/var/lib/kubelet/pods/a4ae74b8-bf02-4534-b6aa-1a00bcd795a4/volumes" Apr 23 18:33:46.115804 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:46.115776 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:33:46.116215 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:46.116018 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:33:46.116456 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:46.116427 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:33:46.116524 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:46.116428 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:33:48.083106 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:48.083059 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 18:33:48.083854 ip-10-0-143-131 kubenswrapper[2575]: 
I0423 18:33:48.083495 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" podUID="2e88b719-da61-49fb-b16f-981e4d856938" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 18:33:56.117057 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:56.117017 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:33:56.117431 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:56.117017 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:33:58.083142 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:58.083079 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 18:33:58.083580 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:33:58.083446 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" podUID="2e88b719-da61-49fb-b16f-981e4d856938" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 18:34:06.116523 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:34:06.116476 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:34:06.117033 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:34:06.116477 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:34:08.083617 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:34:08.083579 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" podUID="2e88b719-da61-49fb-b16f-981e4d856938" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 18:34:08.084030 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:34:08.083577 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 18:34:16.116537 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:34:16.116486 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:34:16.116537 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:34:16.116515 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" 
podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:34:18.083611 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:34:18.083571 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 18:34:18.084069 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:34:18.084015 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:34:26.117179 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:34:26.117138 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:34:26.117847 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:34:26.117826 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:34:28.083751 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:34:28.083722 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:34:36.117128 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:34:36.117098 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:38:33.168796 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:38:33.168762 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 18:38:33.177274 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:38:33.177249 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 18:42:53.193739 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:53.193704 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx"] Apr 23 18:42:53.194269 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:53.193996 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kserve-container" containerID="cri-o://4715c9d744301b341a49191c058966dc2d4b7c3f3783b16337f4b4f7cbaa3d4e" gracePeriod=30 Apr 23 18:42:53.194269 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:53.194039 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kube-rbac-proxy" containerID="cri-o://d7d960941f5b298aca18d8be2823a946ab2233b91f62c97a11f951dd6a356fee" gracePeriod=30 Apr 23 18:42:53.223003 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:53.222978 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh"] Apr 23 18:42:53.223286 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:53.223261 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kserve-container" 
containerID="cri-o://402685e1c7c3e4b58da7a2931fe2121765c9cbcdc4e15c01cdbf58074ae86115" gracePeriod=30 Apr 23 18:42:53.223366 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:53.223272 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kube-rbac-proxy" containerID="cri-o://be56f6fba477cbddc942cad1cf38abc7ed62cb8725b67f6abc669be570f09053" gracePeriod=30 Apr 23 18:42:53.981532 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:53.981499 2575 generic.go:358] "Generic (PLEG): container finished" podID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerID="be56f6fba477cbddc942cad1cf38abc7ed62cb8725b67f6abc669be570f09053" exitCode=2 Apr 23 18:42:53.981744 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:53.981570 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" event={"ID":"c8dc829a-d241-4993-b315-4bb04d2cff26","Type":"ContainerDied","Data":"be56f6fba477cbddc942cad1cf38abc7ed62cb8725b67f6abc669be570f09053"} Apr 23 18:42:53.983096 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:53.983070 2575 generic.go:358] "Generic (PLEG): container finished" podID="97047322-b35c-453c-858d-7f2bbb7c8806" containerID="d7d960941f5b298aca18d8be2823a946ab2233b91f62c97a11f951dd6a356fee" exitCode=2 Apr 23 18:42:53.983192 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:53.983141 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" event={"ID":"97047322-b35c-453c-858d-7f2bbb7c8806","Type":"ContainerDied","Data":"d7d960941f5b298aca18d8be2823a946ab2233b91f62c97a11f951dd6a356fee"} Apr 23 18:42:56.112423 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.112384 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" 
podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.47:8643/healthz\": dial tcp 10.133.0.47:8643: connect: connection refused" Apr 23 18:42:56.112724 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.112384 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.46:8643/healthz\": dial tcp 10.133.0.46:8643: connect: connection refused" Apr 23 18:42:56.116658 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.116627 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:42:56.116772 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.116663 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:42:56.233805 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.233782 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:42:56.283894 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.283853 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97047322-b35c-453c-858d-7f2bbb7c8806-proxy-tls\") pod \"97047322-b35c-453c-858d-7f2bbb7c8806\" (UID: \"97047322-b35c-453c-858d-7f2bbb7c8806\") " Apr 23 18:42:56.284066 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.283939 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-760ed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97047322-b35c-453c-858d-7f2bbb7c8806-success-200-isvc-760ed-kube-rbac-proxy-sar-config\") pod \"97047322-b35c-453c-858d-7f2bbb7c8806\" (UID: \"97047322-b35c-453c-858d-7f2bbb7c8806\") " Apr 23 18:42:56.284066 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.283972 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5jcz\" (UniqueName: \"kubernetes.io/projected/97047322-b35c-453c-858d-7f2bbb7c8806-kube-api-access-l5jcz\") pod \"97047322-b35c-453c-858d-7f2bbb7c8806\" (UID: \"97047322-b35c-453c-858d-7f2bbb7c8806\") " Apr 23 18:42:56.284337 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.284309 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97047322-b35c-453c-858d-7f2bbb7c8806-success-200-isvc-760ed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-760ed-kube-rbac-proxy-sar-config") pod "97047322-b35c-453c-858d-7f2bbb7c8806" (UID: "97047322-b35c-453c-858d-7f2bbb7c8806"). InnerVolumeSpecName "success-200-isvc-760ed-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:42:56.286216 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.286191 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97047322-b35c-453c-858d-7f2bbb7c8806-kube-api-access-l5jcz" (OuterVolumeSpecName: "kube-api-access-l5jcz") pod "97047322-b35c-453c-858d-7f2bbb7c8806" (UID: "97047322-b35c-453c-858d-7f2bbb7c8806"). InnerVolumeSpecName "kube-api-access-l5jcz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:42:56.286319 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.286271 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97047322-b35c-453c-858d-7f2bbb7c8806-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "97047322-b35c-453c-858d-7f2bbb7c8806" (UID: "97047322-b35c-453c-858d-7f2bbb7c8806"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:42:56.384952 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.384852 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l5jcz\" (UniqueName: \"kubernetes.io/projected/97047322-b35c-453c-858d-7f2bbb7c8806-kube-api-access-l5jcz\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:42:56.384952 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.384909 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97047322-b35c-453c-858d-7f2bbb7c8806-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:42:56.384952 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.384922 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-760ed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97047322-b35c-453c-858d-7f2bbb7c8806-success-200-isvc-760ed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 
18:42:56.993533 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.993493 2575 generic.go:358] "Generic (PLEG): container finished" podID="97047322-b35c-453c-858d-7f2bbb7c8806" containerID="4715c9d744301b341a49191c058966dc2d4b7c3f3783b16337f4b4f7cbaa3d4e" exitCode=0 Apr 23 18:42:56.993727 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.993534 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" event={"ID":"97047322-b35c-453c-858d-7f2bbb7c8806","Type":"ContainerDied","Data":"4715c9d744301b341a49191c058966dc2d4b7c3f3783b16337f4b4f7cbaa3d4e"} Apr 23 18:42:56.993727 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.993573 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" Apr 23 18:42:56.993727 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.993589 2575 scope.go:117] "RemoveContainer" containerID="d7d960941f5b298aca18d8be2823a946ab2233b91f62c97a11f951dd6a356fee" Apr 23 18:42:56.993727 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:56.993579 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx" event={"ID":"97047322-b35c-453c-858d-7f2bbb7c8806","Type":"ContainerDied","Data":"d72fe512150515bf4f0ef30394537f0ed1f5996e6847c346b542e015fc393eba"} Apr 23 18:42:57.008766 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:57.008745 2575 scope.go:117] "RemoveContainer" containerID="4715c9d744301b341a49191c058966dc2d4b7c3f3783b16337f4b4f7cbaa3d4e" Apr 23 18:42:57.016650 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:57.016573 2575 scope.go:117] "RemoveContainer" containerID="d7d960941f5b298aca18d8be2823a946ab2233b91f62c97a11f951dd6a356fee" Apr 23 18:42:57.017066 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:42:57.017031 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"d7d960941f5b298aca18d8be2823a946ab2233b91f62c97a11f951dd6a356fee\": container with ID starting with d7d960941f5b298aca18d8be2823a946ab2233b91f62c97a11f951dd6a356fee not found: ID does not exist" containerID="d7d960941f5b298aca18d8be2823a946ab2233b91f62c97a11f951dd6a356fee" Apr 23 18:42:57.017145 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:57.017077 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7d960941f5b298aca18d8be2823a946ab2233b91f62c97a11f951dd6a356fee"} err="failed to get container status \"d7d960941f5b298aca18d8be2823a946ab2233b91f62c97a11f951dd6a356fee\": rpc error: code = NotFound desc = could not find container \"d7d960941f5b298aca18d8be2823a946ab2233b91f62c97a11f951dd6a356fee\": container with ID starting with d7d960941f5b298aca18d8be2823a946ab2233b91f62c97a11f951dd6a356fee not found: ID does not exist" Apr 23 18:42:57.017145 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:57.017102 2575 scope.go:117] "RemoveContainer" containerID="4715c9d744301b341a49191c058966dc2d4b7c3f3783b16337f4b4f7cbaa3d4e" Apr 23 18:42:57.017394 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:42:57.017369 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4715c9d744301b341a49191c058966dc2d4b7c3f3783b16337f4b4f7cbaa3d4e\": container with ID starting with 4715c9d744301b341a49191c058966dc2d4b7c3f3783b16337f4b4f7cbaa3d4e not found: ID does not exist" containerID="4715c9d744301b341a49191c058966dc2d4b7c3f3783b16337f4b4f7cbaa3d4e" Apr 23 18:42:57.017440 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:57.017405 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4715c9d744301b341a49191c058966dc2d4b7c3f3783b16337f4b4f7cbaa3d4e"} err="failed to get container status \"4715c9d744301b341a49191c058966dc2d4b7c3f3783b16337f4b4f7cbaa3d4e\": rpc error: code = NotFound desc = could 
not find container \"4715c9d744301b341a49191c058966dc2d4b7c3f3783b16337f4b4f7cbaa3d4e\": container with ID starting with 4715c9d744301b341a49191c058966dc2d4b7c3f3783b16337f4b4f7cbaa3d4e not found: ID does not exist" Apr 23 18:42:57.018278 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:57.018261 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx"] Apr 23 18:42:57.021494 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:57.021473 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-760ed-predictor-5856fb9f74-2rclx"] Apr 23 18:42:57.061461 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:42:57.061428 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" path="/var/lib/kubelet/pods/97047322-b35c-453c-858d-7f2bbb7c8806/volumes" Apr 23 18:43:00.571113 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:00.571089 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:43:00.622368 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:00.622341 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-760ed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8dc829a-d241-4993-b315-4bb04d2cff26-error-404-isvc-760ed-kube-rbac-proxy-sar-config\") pod \"c8dc829a-d241-4993-b315-4bb04d2cff26\" (UID: \"c8dc829a-d241-4993-b315-4bb04d2cff26\") " Apr 23 18:43:00.622546 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:00.622376 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpnjg\" (UniqueName: \"kubernetes.io/projected/c8dc829a-d241-4993-b315-4bb04d2cff26-kube-api-access-tpnjg\") pod \"c8dc829a-d241-4993-b315-4bb04d2cff26\" (UID: \"c8dc829a-d241-4993-b315-4bb04d2cff26\") " Apr 23 18:43:00.622546 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:00.622408 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8dc829a-d241-4993-b315-4bb04d2cff26-proxy-tls\") pod \"c8dc829a-d241-4993-b315-4bb04d2cff26\" (UID: \"c8dc829a-d241-4993-b315-4bb04d2cff26\") " Apr 23 18:43:00.622767 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:00.622740 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8dc829a-d241-4993-b315-4bb04d2cff26-error-404-isvc-760ed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-760ed-kube-rbac-proxy-sar-config") pod "c8dc829a-d241-4993-b315-4bb04d2cff26" (UID: "c8dc829a-d241-4993-b315-4bb04d2cff26"). InnerVolumeSpecName "error-404-isvc-760ed-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:43:00.624662 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:00.624640 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8dc829a-d241-4993-b315-4bb04d2cff26-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c8dc829a-d241-4993-b315-4bb04d2cff26" (UID: "c8dc829a-d241-4993-b315-4bb04d2cff26"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:43:00.624758 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:00.624663 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8dc829a-d241-4993-b315-4bb04d2cff26-kube-api-access-tpnjg" (OuterVolumeSpecName: "kube-api-access-tpnjg") pod "c8dc829a-d241-4993-b315-4bb04d2cff26" (UID: "c8dc829a-d241-4993-b315-4bb04d2cff26"). InnerVolumeSpecName "kube-api-access-tpnjg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:43:00.723514 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:00.723439 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-760ed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8dc829a-d241-4993-b315-4bb04d2cff26-error-404-isvc-760ed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:43:00.723514 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:00.723464 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tpnjg\" (UniqueName: \"kubernetes.io/projected/c8dc829a-d241-4993-b315-4bb04d2cff26-kube-api-access-tpnjg\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:43:00.723514 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:00.723478 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8dc829a-d241-4993-b315-4bb04d2cff26-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 
18:43:01.009378 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:01.009291 2575 generic.go:358] "Generic (PLEG): container finished" podID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerID="402685e1c7c3e4b58da7a2931fe2121765c9cbcdc4e15c01cdbf58074ae86115" exitCode=0 Apr 23 18:43:01.009378 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:01.009343 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" event={"ID":"c8dc829a-d241-4993-b315-4bb04d2cff26","Type":"ContainerDied","Data":"402685e1c7c3e4b58da7a2931fe2121765c9cbcdc4e15c01cdbf58074ae86115"} Apr 23 18:43:01.009378 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:01.009369 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" event={"ID":"c8dc829a-d241-4993-b315-4bb04d2cff26","Type":"ContainerDied","Data":"c5629ef9d4544c07d4316bd640b4b4432c85c4bac0f5eeba9e2dfe3e65cc03ea"} Apr 23 18:43:01.009378 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:01.009366 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh" Apr 23 18:43:01.009378 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:01.009380 2575 scope.go:117] "RemoveContainer" containerID="be56f6fba477cbddc942cad1cf38abc7ed62cb8725b67f6abc669be570f09053" Apr 23 18:43:01.018016 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:01.017996 2575 scope.go:117] "RemoveContainer" containerID="402685e1c7c3e4b58da7a2931fe2121765c9cbcdc4e15c01cdbf58074ae86115" Apr 23 18:43:01.025503 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:01.025480 2575 scope.go:117] "RemoveContainer" containerID="be56f6fba477cbddc942cad1cf38abc7ed62cb8725b67f6abc669be570f09053" Apr 23 18:43:01.025750 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:43:01.025727 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be56f6fba477cbddc942cad1cf38abc7ed62cb8725b67f6abc669be570f09053\": container with ID starting with be56f6fba477cbddc942cad1cf38abc7ed62cb8725b67f6abc669be570f09053 not found: ID does not exist" containerID="be56f6fba477cbddc942cad1cf38abc7ed62cb8725b67f6abc669be570f09053" Apr 23 18:43:01.025796 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:01.025764 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be56f6fba477cbddc942cad1cf38abc7ed62cb8725b67f6abc669be570f09053"} err="failed to get container status \"be56f6fba477cbddc942cad1cf38abc7ed62cb8725b67f6abc669be570f09053\": rpc error: code = NotFound desc = could not find container \"be56f6fba477cbddc942cad1cf38abc7ed62cb8725b67f6abc669be570f09053\": container with ID starting with be56f6fba477cbddc942cad1cf38abc7ed62cb8725b67f6abc669be570f09053 not found: ID does not exist" Apr 23 18:43:01.025796 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:01.025790 2575 scope.go:117] "RemoveContainer" containerID="402685e1c7c3e4b58da7a2931fe2121765c9cbcdc4e15c01cdbf58074ae86115" Apr 23 
18:43:01.026062 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:43:01.026043 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"402685e1c7c3e4b58da7a2931fe2121765c9cbcdc4e15c01cdbf58074ae86115\": container with ID starting with 402685e1c7c3e4b58da7a2931fe2121765c9cbcdc4e15c01cdbf58074ae86115 not found: ID does not exist" containerID="402685e1c7c3e4b58da7a2931fe2121765c9cbcdc4e15c01cdbf58074ae86115" Apr 23 18:43:01.026122 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:01.026068 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"402685e1c7c3e4b58da7a2931fe2121765c9cbcdc4e15c01cdbf58074ae86115"} err="failed to get container status \"402685e1c7c3e4b58da7a2931fe2121765c9cbcdc4e15c01cdbf58074ae86115\": rpc error: code = NotFound desc = could not find container \"402685e1c7c3e4b58da7a2931fe2121765c9cbcdc4e15c01cdbf58074ae86115\": container with ID starting with 402685e1c7c3e4b58da7a2931fe2121765c9cbcdc4e15c01cdbf58074ae86115 not found: ID does not exist" Apr 23 18:43:01.032190 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:01.032168 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh"] Apr 23 18:43:01.036173 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:01.036153 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-760ed-predictor-5fd5f5bd9-f8hbh"] Apr 23 18:43:01.063423 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:01.063391 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" path="/var/lib/kubelet/pods/c8dc829a-d241-4993-b315-4bb04d2cff26/volumes" Apr 23 18:43:33.193730 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:33.193615 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 18:43:33.201751 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:43:33.201733 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 18:48:33.215263 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:48:33.215151 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 18:48:33.224272 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:48:33.224248 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log" Apr 23 18:50:59.173376 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:50:59.173339 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz"] Apr 23 18:50:59.173863 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:50:59.173629 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kserve-container" containerID="cri-o://374542e9218e81603b5ccd91e3bf61bebbdf0f4e05a4cc24e5c25469658ec280" gracePeriod=30 Apr 23 18:50:59.173863 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:50:59.173684 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kube-rbac-proxy" containerID="cri-o://bcffde200b93ba86129d45e5777eb6ffd7130b9c29cac0ff3a61b04df6fdfe1b" gracePeriod=30 Apr 23 
18:50:59.228445 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:50:59.228416 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh"] Apr 23 18:50:59.228704 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:50:59.228669 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" podUID="2e88b719-da61-49fb-b16f-981e4d856938" containerName="kserve-container" containerID="cri-o://875c1c45a68801a593a2d04e1802dab27ad7353c6a88b73ea7a61fad1f40c48e" gracePeriod=30 Apr 23 18:50:59.228776 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:50:59.228698 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" podUID="2e88b719-da61-49fb-b16f-981e4d856938" containerName="kube-rbac-proxy" containerID="cri-o://d86d81573592d783a551927b0851f07c2fb787d355e65bf9306dc5225f9f55e0" gracePeriod=30 Apr 23 18:50:59.429182 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:50:59.429091 2575 generic.go:358] "Generic (PLEG): container finished" podID="2e88b719-da61-49fb-b16f-981e4d856938" containerID="d86d81573592d783a551927b0851f07c2fb787d355e65bf9306dc5225f9f55e0" exitCode=2 Apr 23 18:50:59.429182 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:50:59.429157 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" event={"ID":"2e88b719-da61-49fb-b16f-981e4d856938","Type":"ContainerDied","Data":"d86d81573592d783a551927b0851f07c2fb787d355e65bf9306dc5225f9f55e0"} Apr 23 18:50:59.430735 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:50:59.430712 2575 generic.go:358] "Generic (PLEG): container finished" podID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerID="bcffde200b93ba86129d45e5777eb6ffd7130b9c29cac0ff3a61b04df6fdfe1b" exitCode=2 Apr 23 18:50:59.430860 ip-10-0-143-131 kubenswrapper[2575]: I0423 
18:50:59.430739 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" event={"ID":"181be6bb-ce95-4aa3-b958-c28a19b93cee","Type":"ContainerDied","Data":"bcffde200b93ba86129d45e5777eb6ffd7130b9c29cac0ff3a61b04df6fdfe1b"} Apr 23 18:51:02.121101 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.121074 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:51:02.211188 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.211105 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5mgf\" (UniqueName: \"kubernetes.io/projected/181be6bb-ce95-4aa3-b958-c28a19b93cee-kube-api-access-t5mgf\") pod \"181be6bb-ce95-4aa3-b958-c28a19b93cee\" (UID: \"181be6bb-ce95-4aa3-b958-c28a19b93cee\") " Apr 23 18:51:02.211188 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.211188 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-febb8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/181be6bb-ce95-4aa3-b958-c28a19b93cee-success-200-isvc-febb8-kube-rbac-proxy-sar-config\") pod \"181be6bb-ce95-4aa3-b958-c28a19b93cee\" (UID: \"181be6bb-ce95-4aa3-b958-c28a19b93cee\") " Apr 23 18:51:02.211391 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.211208 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/181be6bb-ce95-4aa3-b958-c28a19b93cee-proxy-tls\") pod \"181be6bb-ce95-4aa3-b958-c28a19b93cee\" (UID: \"181be6bb-ce95-4aa3-b958-c28a19b93cee\") " Apr 23 18:51:02.211559 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.211537 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/181be6bb-ce95-4aa3-b958-c28a19b93cee-success-200-isvc-febb8-kube-rbac-proxy-sar-config" 
(OuterVolumeSpecName: "success-200-isvc-febb8-kube-rbac-proxy-sar-config") pod "181be6bb-ce95-4aa3-b958-c28a19b93cee" (UID: "181be6bb-ce95-4aa3-b958-c28a19b93cee"). InnerVolumeSpecName "success-200-isvc-febb8-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:51:02.213332 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.213311 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/181be6bb-ce95-4aa3-b958-c28a19b93cee-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "181be6bb-ce95-4aa3-b958-c28a19b93cee" (UID: "181be6bb-ce95-4aa3-b958-c28a19b93cee"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:51:02.213332 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.213317 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/181be6bb-ce95-4aa3-b958-c28a19b93cee-kube-api-access-t5mgf" (OuterVolumeSpecName: "kube-api-access-t5mgf") pod "181be6bb-ce95-4aa3-b958-c28a19b93cee" (UID: "181be6bb-ce95-4aa3-b958-c28a19b93cee"). InnerVolumeSpecName "kube-api-access-t5mgf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:51:02.312704 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.312639 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t5mgf\" (UniqueName: \"kubernetes.io/projected/181be6bb-ce95-4aa3-b958-c28a19b93cee-kube-api-access-t5mgf\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:51:02.312704 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.312678 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-febb8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/181be6bb-ce95-4aa3-b958-c28a19b93cee-success-200-isvc-febb8-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:51:02.312704 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.312695 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/181be6bb-ce95-4aa3-b958-c28a19b93cee-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:51:02.362130 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.362105 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:51:02.413437 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.413406 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e88b719-da61-49fb-b16f-981e4d856938-proxy-tls\") pod \"2e88b719-da61-49fb-b16f-981e4d856938\" (UID: \"2e88b719-da61-49fb-b16f-981e4d856938\") " Apr 23 18:51:02.413577 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.413542 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-febb8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e88b719-da61-49fb-b16f-981e4d856938-error-404-isvc-febb8-kube-rbac-proxy-sar-config\") pod \"2e88b719-da61-49fb-b16f-981e4d856938\" (UID: \"2e88b719-da61-49fb-b16f-981e4d856938\") " Apr 23 18:51:02.413632 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.413619 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr55h\" (UniqueName: \"kubernetes.io/projected/2e88b719-da61-49fb-b16f-981e4d856938-kube-api-access-xr55h\") pod \"2e88b719-da61-49fb-b16f-981e4d856938\" (UID: \"2e88b719-da61-49fb-b16f-981e4d856938\") " Apr 23 18:51:02.413849 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.413828 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e88b719-da61-49fb-b16f-981e4d856938-error-404-isvc-febb8-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-febb8-kube-rbac-proxy-sar-config") pod "2e88b719-da61-49fb-b16f-981e4d856938" (UID: "2e88b719-da61-49fb-b16f-981e4d856938"). InnerVolumeSpecName "error-404-isvc-febb8-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:51:02.415634 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.415612 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e88b719-da61-49fb-b16f-981e4d856938-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2e88b719-da61-49fb-b16f-981e4d856938" (UID: "2e88b719-da61-49fb-b16f-981e4d856938"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:51:02.415777 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.415759 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e88b719-da61-49fb-b16f-981e4d856938-kube-api-access-xr55h" (OuterVolumeSpecName: "kube-api-access-xr55h") pod "2e88b719-da61-49fb-b16f-981e4d856938" (UID: "2e88b719-da61-49fb-b16f-981e4d856938"). InnerVolumeSpecName "kube-api-access-xr55h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:51:02.440346 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.440315 2575 generic.go:358] "Generic (PLEG): container finished" podID="2e88b719-da61-49fb-b16f-981e4d856938" containerID="875c1c45a68801a593a2d04e1802dab27ad7353c6a88b73ea7a61fad1f40c48e" exitCode=0 Apr 23 18:51:02.440496 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.440393 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" event={"ID":"2e88b719-da61-49fb-b16f-981e4d856938","Type":"ContainerDied","Data":"875c1c45a68801a593a2d04e1802dab27ad7353c6a88b73ea7a61fad1f40c48e"} Apr 23 18:51:02.440496 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.440400 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" Apr 23 18:51:02.440496 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.440430 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh" event={"ID":"2e88b719-da61-49fb-b16f-981e4d856938","Type":"ContainerDied","Data":"216eaf70d92d57538fba0d4d28fbc20130b9242784b37fd14bef809def737d19"} Apr 23 18:51:02.440496 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.440450 2575 scope.go:117] "RemoveContainer" containerID="d86d81573592d783a551927b0851f07c2fb787d355e65bf9306dc5225f9f55e0" Apr 23 18:51:02.441708 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.441684 2575 generic.go:358] "Generic (PLEG): container finished" podID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerID="374542e9218e81603b5ccd91e3bf61bebbdf0f4e05a4cc24e5c25469658ec280" exitCode=0 Apr 23 18:51:02.441809 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.441747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" event={"ID":"181be6bb-ce95-4aa3-b958-c28a19b93cee","Type":"ContainerDied","Data":"374542e9218e81603b5ccd91e3bf61bebbdf0f4e05a4cc24e5c25469658ec280"} Apr 23 18:51:02.441809 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.441777 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" Apr 23 18:51:02.441809 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.441785 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz" event={"ID":"181be6bb-ce95-4aa3-b958-c28a19b93cee","Type":"ContainerDied","Data":"fcaf78ee9189e4747a92a3ee3695ad3febff11bcc71e2752896e181097e705f5"} Apr 23 18:51:02.448473 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.448458 2575 scope.go:117] "RemoveContainer" containerID="875c1c45a68801a593a2d04e1802dab27ad7353c6a88b73ea7a61fad1f40c48e" Apr 23 18:51:02.455963 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.455947 2575 scope.go:117] "RemoveContainer" containerID="d86d81573592d783a551927b0851f07c2fb787d355e65bf9306dc5225f9f55e0" Apr 23 18:51:02.456189 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:51:02.456173 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d86d81573592d783a551927b0851f07c2fb787d355e65bf9306dc5225f9f55e0\": container with ID starting with d86d81573592d783a551927b0851f07c2fb787d355e65bf9306dc5225f9f55e0 not found: ID does not exist" containerID="d86d81573592d783a551927b0851f07c2fb787d355e65bf9306dc5225f9f55e0" Apr 23 18:51:02.456235 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.456196 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d86d81573592d783a551927b0851f07c2fb787d355e65bf9306dc5225f9f55e0"} err="failed to get container status \"d86d81573592d783a551927b0851f07c2fb787d355e65bf9306dc5225f9f55e0\": rpc error: code = NotFound desc = could not find container \"d86d81573592d783a551927b0851f07c2fb787d355e65bf9306dc5225f9f55e0\": container with ID starting with d86d81573592d783a551927b0851f07c2fb787d355e65bf9306dc5225f9f55e0 not found: ID does not exist" Apr 23 18:51:02.456235 ip-10-0-143-131 
kubenswrapper[2575]: I0423 18:51:02.456212 2575 scope.go:117] "RemoveContainer" containerID="875c1c45a68801a593a2d04e1802dab27ad7353c6a88b73ea7a61fad1f40c48e" Apr 23 18:51:02.456436 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:51:02.456422 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"875c1c45a68801a593a2d04e1802dab27ad7353c6a88b73ea7a61fad1f40c48e\": container with ID starting with 875c1c45a68801a593a2d04e1802dab27ad7353c6a88b73ea7a61fad1f40c48e not found: ID does not exist" containerID="875c1c45a68801a593a2d04e1802dab27ad7353c6a88b73ea7a61fad1f40c48e" Apr 23 18:51:02.456478 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.456439 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"875c1c45a68801a593a2d04e1802dab27ad7353c6a88b73ea7a61fad1f40c48e"} err="failed to get container status \"875c1c45a68801a593a2d04e1802dab27ad7353c6a88b73ea7a61fad1f40c48e\": rpc error: code = NotFound desc = could not find container \"875c1c45a68801a593a2d04e1802dab27ad7353c6a88b73ea7a61fad1f40c48e\": container with ID starting with 875c1c45a68801a593a2d04e1802dab27ad7353c6a88b73ea7a61fad1f40c48e not found: ID does not exist" Apr 23 18:51:02.456478 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.456453 2575 scope.go:117] "RemoveContainer" containerID="bcffde200b93ba86129d45e5777eb6ffd7130b9c29cac0ff3a61b04df6fdfe1b" Apr 23 18:51:02.464168 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.464152 2575 scope.go:117] "RemoveContainer" containerID="374542e9218e81603b5ccd91e3bf61bebbdf0f4e05a4cc24e5c25469658ec280" Apr 23 18:51:02.464836 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.464814 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh"] Apr 23 18:51:02.468061 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.468042 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-febb8-predictor-545595fd9f-5vcsh"] Apr 23 18:51:02.471562 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.471546 2575 scope.go:117] "RemoveContainer" containerID="bcffde200b93ba86129d45e5777eb6ffd7130b9c29cac0ff3a61b04df6fdfe1b" Apr 23 18:51:02.471809 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:51:02.471793 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcffde200b93ba86129d45e5777eb6ffd7130b9c29cac0ff3a61b04df6fdfe1b\": container with ID starting with bcffde200b93ba86129d45e5777eb6ffd7130b9c29cac0ff3a61b04df6fdfe1b not found: ID does not exist" containerID="bcffde200b93ba86129d45e5777eb6ffd7130b9c29cac0ff3a61b04df6fdfe1b" Apr 23 18:51:02.471856 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.471814 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcffde200b93ba86129d45e5777eb6ffd7130b9c29cac0ff3a61b04df6fdfe1b"} err="failed to get container status \"bcffde200b93ba86129d45e5777eb6ffd7130b9c29cac0ff3a61b04df6fdfe1b\": rpc error: code = NotFound desc = could not find container \"bcffde200b93ba86129d45e5777eb6ffd7130b9c29cac0ff3a61b04df6fdfe1b\": container with ID starting with bcffde200b93ba86129d45e5777eb6ffd7130b9c29cac0ff3a61b04df6fdfe1b not found: ID does not exist" Apr 23 18:51:02.471856 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.471828 2575 scope.go:117] "RemoveContainer" containerID="374542e9218e81603b5ccd91e3bf61bebbdf0f4e05a4cc24e5c25469658ec280" Apr 23 18:51:02.472073 ip-10-0-143-131 kubenswrapper[2575]: E0423 18:51:02.472055 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"374542e9218e81603b5ccd91e3bf61bebbdf0f4e05a4cc24e5c25469658ec280\": container with ID starting with 374542e9218e81603b5ccd91e3bf61bebbdf0f4e05a4cc24e5c25469658ec280 not found: ID does not exist" 
containerID="374542e9218e81603b5ccd91e3bf61bebbdf0f4e05a4cc24e5c25469658ec280" Apr 23 18:51:02.472131 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.472082 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374542e9218e81603b5ccd91e3bf61bebbdf0f4e05a4cc24e5c25469658ec280"} err="failed to get container status \"374542e9218e81603b5ccd91e3bf61bebbdf0f4e05a4cc24e5c25469658ec280\": rpc error: code = NotFound desc = could not find container \"374542e9218e81603b5ccd91e3bf61bebbdf0f4e05a4cc24e5c25469658ec280\": container with ID starting with 374542e9218e81603b5ccd91e3bf61bebbdf0f4e05a4cc24e5c25469658ec280 not found: ID does not exist" Apr 23 18:51:02.477813 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.477790 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz"] Apr 23 18:51:02.482281 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.482260 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-febb8-predictor-7b967b5c88-c75hz"] Apr 23 18:51:02.515043 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.515010 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xr55h\" (UniqueName: \"kubernetes.io/projected/2e88b719-da61-49fb-b16f-981e4d856938-kube-api-access-xr55h\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:51:02.515043 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.515036 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e88b719-da61-49fb-b16f-981e4d856938-proxy-tls\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:51:02.515043 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:02.515048 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-febb8-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/2e88b719-da61-49fb-b16f-981e4d856938-error-404-isvc-febb8-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-131.ec2.internal\" DevicePath \"\"" Apr 23 18:51:03.061126 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:03.061093 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" path="/var/lib/kubelet/pods/181be6bb-ce95-4aa3-b958-c28a19b93cee/volumes" Apr 23 18:51:03.061523 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:03.061510 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e88b719-da61-49fb-b16f-981e4d856938" path="/var/lib/kubelet/pods/2e88b719-da61-49fb-b16f-981e4d856938/volumes" Apr 23 18:51:24.931495 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931460 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8jlv4/must-gather-kf6rk"] Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931772 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e88b719-da61-49fb-b16f-981e4d856938" containerName="kube-rbac-proxy" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931782 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e88b719-da61-49fb-b16f-981e4d856938" containerName="kube-rbac-proxy" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931797 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kube-rbac-proxy" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931802 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kube-rbac-proxy" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931810 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="2e88b719-da61-49fb-b16f-981e4d856938" containerName="kserve-container" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931815 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e88b719-da61-49fb-b16f-981e4d856938" containerName="kserve-container" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931820 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kube-rbac-proxy" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931825 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kube-rbac-proxy" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931832 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kserve-container" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931837 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kserve-container" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931844 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kserve-container" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931849 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kserve-container" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931854 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kserve-container" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931858 2575 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kserve-container" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931863 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kube-rbac-proxy" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931868 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kube-rbac-proxy" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931876 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kube-rbac-proxy" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931899 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kube-rbac-proxy" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931905 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kube-rbac-proxy" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931910 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kube-rbac-proxy" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931916 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kserve-container" Apr 23 18:51:24.931915 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931922 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kserve-container" Apr 23 18:51:24.932585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931930 2575 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kserve-container" Apr 23 18:51:24.932585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931935 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kserve-container" Apr 23 18:51:24.932585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931980 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kserve-container" Apr 23 18:51:24.932585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931988 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kserve-container" Apr 23 18:51:24.932585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.931996 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kserve-container" Apr 23 18:51:24.932585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.932003 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kube-rbac-proxy" Apr 23 18:51:24.932585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.932010 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e88b719-da61-49fb-b16f-981e4d856938" containerName="kserve-container" Apr 23 18:51:24.932585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.932016 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e88b719-da61-49fb-b16f-981e4d856938" containerName="kube-rbac-proxy" Apr 23 18:51:24.932585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.932022 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4ae74b8-bf02-4534-b6aa-1a00bcd795a4" containerName="kube-rbac-proxy" Apr 23 18:51:24.932585 ip-10-0-143-131 kubenswrapper[2575]: I0423 
18:51:24.932028 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kserve-container" Apr 23 18:51:24.932585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.932032 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="97047322-b35c-453c-858d-7f2bbb7c8806" containerName="kube-rbac-proxy" Apr 23 18:51:24.932585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.932039 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8dc829a-d241-4993-b315-4bb04d2cff26" containerName="kserve-container" Apr 23 18:51:24.932585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.932045 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bff60ad1-536a-4cf2-832c-a469d7aba58b" containerName="kube-rbac-proxy" Apr 23 18:51:24.932585 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.932051 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="181be6bb-ce95-4aa3-b958-c28a19b93cee" containerName="kube-rbac-proxy" Apr 23 18:51:24.936634 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.936618 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8jlv4/must-gather-kf6rk" Apr 23 18:51:24.938652 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.938621 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8jlv4\"/\"default-dockercfg-kt7rv\"" Apr 23 18:51:24.938652 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.938635 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8jlv4\"/\"kube-root-ca.crt\"" Apr 23 18:51:24.938814 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.938635 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8jlv4\"/\"openshift-service-ca.crt\"" Apr 23 18:51:24.943086 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:24.943066 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8jlv4/must-gather-kf6rk"] Apr 23 18:51:25.009872 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:25.009839 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8r5l\" (UniqueName: \"kubernetes.io/projected/5e325fb7-3e85-4362-b39f-7151fad7965f-kube-api-access-m8r5l\") pod \"must-gather-kf6rk\" (UID: \"5e325fb7-3e85-4362-b39f-7151fad7965f\") " pod="openshift-must-gather-8jlv4/must-gather-kf6rk" Apr 23 18:51:25.010059 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:25.009915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5e325fb7-3e85-4362-b39f-7151fad7965f-must-gather-output\") pod \"must-gather-kf6rk\" (UID: \"5e325fb7-3e85-4362-b39f-7151fad7965f\") " pod="openshift-must-gather-8jlv4/must-gather-kf6rk" Apr 23 18:51:25.110766 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:25.110722 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/5e325fb7-3e85-4362-b39f-7151fad7965f-must-gather-output\") pod \"must-gather-kf6rk\" (UID: \"5e325fb7-3e85-4362-b39f-7151fad7965f\") " pod="openshift-must-gather-8jlv4/must-gather-kf6rk" Apr 23 18:51:25.110935 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:25.110799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8r5l\" (UniqueName: \"kubernetes.io/projected/5e325fb7-3e85-4362-b39f-7151fad7965f-kube-api-access-m8r5l\") pod \"must-gather-kf6rk\" (UID: \"5e325fb7-3e85-4362-b39f-7151fad7965f\") " pod="openshift-must-gather-8jlv4/must-gather-kf6rk" Apr 23 18:51:25.111084 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:25.111063 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5e325fb7-3e85-4362-b39f-7151fad7965f-must-gather-output\") pod \"must-gather-kf6rk\" (UID: \"5e325fb7-3e85-4362-b39f-7151fad7965f\") " pod="openshift-must-gather-8jlv4/must-gather-kf6rk" Apr 23 18:51:25.118862 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:25.118835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8r5l\" (UniqueName: \"kubernetes.io/projected/5e325fb7-3e85-4362-b39f-7151fad7965f-kube-api-access-m8r5l\") pod \"must-gather-kf6rk\" (UID: \"5e325fb7-3e85-4362-b39f-7151fad7965f\") " pod="openshift-must-gather-8jlv4/must-gather-kf6rk" Apr 23 18:51:25.245813 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:25.245725 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8jlv4/must-gather-kf6rk" Apr 23 18:51:25.363746 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:25.363723 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8jlv4/must-gather-kf6rk"] Apr 23 18:51:25.366165 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:51:25.366139 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e325fb7_3e85_4362_b39f_7151fad7965f.slice/crio-03d77481d92c9f393925b2f576e586b2bf2b8ff15ca850ccceb7031c1867f7e3 WatchSource:0}: Error finding container 03d77481d92c9f393925b2f576e586b2bf2b8ff15ca850ccceb7031c1867f7e3: Status 404 returned error can't find the container with id 03d77481d92c9f393925b2f576e586b2bf2b8ff15ca850ccceb7031c1867f7e3 Apr 23 18:51:25.367974 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:25.367955 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:51:25.512459 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:25.512366 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jlv4/must-gather-kf6rk" event={"ID":"5e325fb7-3e85-4362-b39f-7151fad7965f","Type":"ContainerStarted","Data":"03d77481d92c9f393925b2f576e586b2bf2b8ff15ca850ccceb7031c1867f7e3"} Apr 23 18:51:26.518069 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:26.517970 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jlv4/must-gather-kf6rk" event={"ID":"5e325fb7-3e85-4362-b39f-7151fad7965f","Type":"ContainerStarted","Data":"22d47c5aa396c200bc69107a7b6b3b1ad20ce3de9e58edd2f83ed581d1e2b22c"} Apr 23 18:51:26.518069 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:26.518017 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jlv4/must-gather-kf6rk" 
event={"ID":"5e325fb7-3e85-4362-b39f-7151fad7965f","Type":"ContainerStarted","Data":"4a121e223c54648a75be0fb560776c88f0eb7498654ba60704230dbb699310a4"} Apr 23 18:51:27.664401 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:27.664376 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-j744d_c5623e36-755b-417a-a6af-21d001c67630/global-pull-secret-syncer/0.log" Apr 23 18:51:27.799821 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:27.799787 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-q429h_bfcf2bd4-6bd8-4072-a1ed-956e23cf9972/konnectivity-agent/0.log" Apr 23 18:51:27.872082 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:27.872046 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-131.ec2.internal_79e815642fa2b5d10caaca9635cb7db2/haproxy/0.log" Apr 23 18:51:31.312913 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:31.312862 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-qrm2m_cbf092c0-733e-4d6b-a240-9d95ac93022a/cluster-monitoring-operator/0.log" Apr 23 18:51:31.433193 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:31.433167 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-667bd85767-4v446_eec30db5-c0af-4aa9-b807-89e196d3f094/metrics-server/0.log" Apr 23 18:51:31.700302 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:31.700254 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xnt4q_93846088-277d-4894-b6ef-cc87f01ad6fa/node-exporter/0.log" Apr 23 18:51:31.722702 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:31.722667 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xnt4q_93846088-277d-4894-b6ef-cc87f01ad6fa/kube-rbac-proxy/0.log" Apr 23 18:51:31.747451 ip-10-0-143-131 
kubenswrapper[2575]: I0423 18:51:31.747427 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xnt4q_93846088-277d-4894-b6ef-cc87f01ad6fa/init-textfile/0.log" Apr 23 18:51:32.126224 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:32.126122 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f55d95dbb-jfdvg_df4f2af2-d944-4a00-b116-126b71c159dc/telemeter-client/0.log" Apr 23 18:51:32.153747 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:32.153723 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f55d95dbb-jfdvg_df4f2af2-d944-4a00-b116-126b71c159dc/reload/0.log" Apr 23 18:51:32.177821 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:32.177791 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f55d95dbb-jfdvg_df4f2af2-d944-4a00-b116-126b71c159dc/kube-rbac-proxy/0.log" Apr 23 18:51:32.216492 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:32.216459 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7556c7f998-x77kb_e108b2f0-3753-4733-bedd-f9e769bbb345/thanos-query/0.log" Apr 23 18:51:32.240327 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:32.240294 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7556c7f998-x77kb_e108b2f0-3753-4733-bedd-f9e769bbb345/kube-rbac-proxy-web/0.log" Apr 23 18:51:32.265739 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:32.265706 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7556c7f998-x77kb_e108b2f0-3753-4733-bedd-f9e769bbb345/kube-rbac-proxy/0.log" Apr 23 18:51:32.294071 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:32.294046 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-7556c7f998-x77kb_e108b2f0-3753-4733-bedd-f9e769bbb345/prom-label-proxy/0.log"
Apr 23 18:51:32.324687 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:32.324661 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7556c7f998-x77kb_e108b2f0-3753-4733-bedd-f9e769bbb345/kube-rbac-proxy-rules/0.log"
Apr 23 18:51:32.354654 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:32.354608 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7556c7f998-x77kb_e108b2f0-3753-4733-bedd-f9e769bbb345/kube-rbac-proxy-metrics/0.log"
Apr 23 18:51:33.910062 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:33.910014 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/2.log"
Apr 23 18:51:33.914638 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:33.914608 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dznpd_50ae4d89-cf03-479c-935d-c2b46bb0082b/console-operator/3.log"
Apr 23 18:51:34.284028 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.284001 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84bf9f9f94-58fqq_69bbed3b-a24c-4cea-b6ad-f05dd97236eb/console/0.log"
Apr 23 18:51:34.320463 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.320432 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-zr7bt_52c4d929-5543-48fc-9138-b1149888b4e1/download-server/0.log"
Apr 23 18:51:34.502157 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.502098 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8jlv4/must-gather-kf6rk" podStartSLOduration=9.675786355 podStartE2EDuration="10.50207805s" podCreationTimestamp="2026-04-23 18:51:24 +0000 UTC" firstStartedPulling="2026-04-23 18:51:25.368079114 +0000 UTC m=+3472.914054458" lastFinishedPulling="2026-04-23 18:51:26.194370808 +0000 UTC m=+3473.740346153" observedRunningTime="2026-04-23 18:51:26.53381461 +0000 UTC m=+3474.079789976" watchObservedRunningTime="2026-04-23 18:51:34.50207805 +0000 UTC m=+3482.048053416"
Apr 23 18:51:34.502990 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.502950 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"]
Apr 23 18:51:34.507921 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.507878 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"]
Apr 23 18:51:34.508063 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.508012 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.605821 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.605739 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/28aaedd2-4330-4800-a098-69e84e6f7ab9-sys\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.605821 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.605793 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/28aaedd2-4330-4800-a098-69e84e6f7ab9-lib-modules\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.606047 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.605823 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/28aaedd2-4330-4800-a098-69e84e6f7ab9-podres\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.606047 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.605922 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/28aaedd2-4330-4800-a098-69e84e6f7ab9-proc\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.606047 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.605983 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzqmc\" (UniqueName: \"kubernetes.io/projected/28aaedd2-4330-4800-a098-69e84e6f7ab9-kube-api-access-jzqmc\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.707240 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.707207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/28aaedd2-4330-4800-a098-69e84e6f7ab9-lib-modules\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.707394 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.707254 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/28aaedd2-4330-4800-a098-69e84e6f7ab9-podres\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.707394 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.707291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/28aaedd2-4330-4800-a098-69e84e6f7ab9-proc\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.707394 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.707328 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzqmc\" (UniqueName: \"kubernetes.io/projected/28aaedd2-4330-4800-a098-69e84e6f7ab9-kube-api-access-jzqmc\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.707394 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.707390 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/28aaedd2-4330-4800-a098-69e84e6f7ab9-sys\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.707539 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.707409 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/28aaedd2-4330-4800-a098-69e84e6f7ab9-lib-modules\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.707539 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.707423 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/28aaedd2-4330-4800-a098-69e84e6f7ab9-proc\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.707539 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.707447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/28aaedd2-4330-4800-a098-69e84e6f7ab9-podres\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.707539 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.707461 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/28aaedd2-4330-4800-a098-69e84e6f7ab9-sys\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.714647 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.714615 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzqmc\" (UniqueName: \"kubernetes.io/projected/28aaedd2-4330-4800-a098-69e84e6f7ab9-kube-api-access-jzqmc\") pod \"perf-node-gather-daemonset-x57x8\" (UID: \"28aaedd2-4330-4800-a098-69e84e6f7ab9\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.818552 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.818518 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:34.954264 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:34.954203 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"]
Apr 23 18:51:34.956532 ip-10-0-143-131 kubenswrapper[2575]: W0423 18:51:34.956502 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod28aaedd2_4330_4800_a098_69e84e6f7ab9.slice/crio-3992ae7403425e2351831dbd769331c4da07f3529e278bb9643fb4fffa384449 WatchSource:0}: Error finding container 3992ae7403425e2351831dbd769331c4da07f3529e278bb9643fb4fffa384449: Status 404 returned error can't find the container with id 3992ae7403425e2351831dbd769331c4da07f3529e278bb9643fb4fffa384449
Apr 23 18:51:35.454750 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:35.454722 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-swk4j_a57787a9-765e-4e66-b8bd-e8c50eaf8977/dns/0.log"
Apr 23 18:51:35.478788 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:35.478760 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-swk4j_a57787a9-765e-4e66-b8bd-e8c50eaf8977/kube-rbac-proxy/0.log"
Apr 23 18:51:35.542282 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:35.542251 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-svgk4_9bb70f31-0e60-414a-ac60-8535df8b1ed1/dns-node-resolver/0.log"
Apr 23 18:51:35.554357 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:35.554329 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8" event={"ID":"28aaedd2-4330-4800-a098-69e84e6f7ab9","Type":"ContainerStarted","Data":"b9f2613be386b2bc8ed44fa6aba88fe759f2c19459f5b38491a5abd0767dd69b"}
Apr 23 18:51:35.554357 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:35.554360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8" event={"ID":"28aaedd2-4330-4800-a098-69e84e6f7ab9","Type":"ContainerStarted","Data":"3992ae7403425e2351831dbd769331c4da07f3529e278bb9643fb4fffa384449"}
Apr 23 18:51:35.554530 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:35.554497 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:35.569834 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:35.569782 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8" podStartSLOduration=1.569765036 podStartE2EDuration="1.569765036s" podCreationTimestamp="2026-04-23 18:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:51:35.568871773 +0000 UTC m=+3483.114847138" watchObservedRunningTime="2026-04-23 18:51:35.569765036 +0000 UTC m=+3483.115740404"
Apr 23 18:51:35.954088 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:35.954061 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-c45cc5f59-s6pc2_59da388f-0cf0-4d97-8756-b51b61e9316c/registry/0.log"
Apr 23 18:51:36.019444 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:36.019417 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vnvg4_69d66352-58de-4bf6-88d8-b5603ccbe8af/node-ca/0.log"
Apr 23 18:51:36.698349 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:36.698324 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79996f959d-wvgtd_9ae6b02c-1ffb-4822-950c-d4782e47731b/router/0.log"
Apr 23 18:51:37.073022 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:37.072953 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tld98_b6a9e60b-54f2-4766-83e3-2d825df0b4f0/serve-healthcheck-canary/0.log"
Apr 23 18:51:37.427558 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:37.427530 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-zmmnp_ffa5d0d2-4b4f-4472-b414-6aefb709735c/insights-operator/0.log"
Apr 23 18:51:37.430220 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:37.430199 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-zmmnp_ffa5d0d2-4b4f-4472-b414-6aefb709735c/insights-operator/1.log"
Apr 23 18:51:37.450768 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:37.450744 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5fjbp_5761c70c-3274-4099-8b36-5bfb025c9803/kube-rbac-proxy/0.log"
Apr 23 18:51:37.470605 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:37.470584 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5fjbp_5761c70c-3274-4099-8b36-5bfb025c9803/exporter/0.log"
Apr 23 18:51:37.495238 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:37.495215 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5fjbp_5761c70c-3274-4099-8b36-5bfb025c9803/extractor/0.log"
Apr 23 18:51:41.022709 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.022632 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-cdcd68c9d-hg6cm"]
Apr 23 18:51:41.029616 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.029593 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.035099 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.035070 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cdcd68c9d-hg6cm"]
Apr 23 18:51:41.068698 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.068663 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf897951-81b2-467e-ab20-c0b24b56c17b-trusted-ca-bundle\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.068867 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.068710 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nppxm\" (UniqueName: \"kubernetes.io/projected/bf897951-81b2-467e-ab20-c0b24b56c17b-kube-api-access-nppxm\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.068867 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.068740 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf897951-81b2-467e-ab20-c0b24b56c17b-console-config\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.069080 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.068907 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf897951-81b2-467e-ab20-c0b24b56c17b-console-oauth-config\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.069080 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.068977 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf897951-81b2-467e-ab20-c0b24b56c17b-console-serving-cert\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.069080 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.069013 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf897951-81b2-467e-ab20-c0b24b56c17b-service-ca\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.069080 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.069041 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf897951-81b2-467e-ab20-c0b24b56c17b-oauth-serving-cert\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.171707 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.171664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf897951-81b2-467e-ab20-c0b24b56c17b-trusted-ca-bundle\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.171958 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.171939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nppxm\" (UniqueName: \"kubernetes.io/projected/bf897951-81b2-467e-ab20-c0b24b56c17b-kube-api-access-nppxm\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.172021 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.171981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf897951-81b2-467e-ab20-c0b24b56c17b-console-config\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.172075 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.172039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf897951-81b2-467e-ab20-c0b24b56c17b-console-oauth-config\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.172126 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.172103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf897951-81b2-467e-ab20-c0b24b56c17b-console-serving-cert\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.172179 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.172139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf897951-81b2-467e-ab20-c0b24b56c17b-service-ca\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.172179 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.172167 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf897951-81b2-467e-ab20-c0b24b56c17b-oauth-serving-cert\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.172836 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.172807 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf897951-81b2-467e-ab20-c0b24b56c17b-trusted-ca-bundle\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.173030 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.172872 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf897951-81b2-467e-ab20-c0b24b56c17b-oauth-serving-cert\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.173030 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.172831 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf897951-81b2-467e-ab20-c0b24b56c17b-console-config\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.173498 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.173472 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf897951-81b2-467e-ab20-c0b24b56c17b-service-ca\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.174727 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.174704 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf897951-81b2-467e-ab20-c0b24b56c17b-console-oauth-config\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.174823 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.174804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf897951-81b2-467e-ab20-c0b24b56c17b-console-serving-cert\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.180149 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.180129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nppxm\" (UniqueName: \"kubernetes.io/projected/bf897951-81b2-467e-ab20-c0b24b56c17b-kube-api-access-nppxm\") pod \"console-cdcd68c9d-hg6cm\" (UID: \"bf897951-81b2-467e-ab20-c0b24b56c17b\") " pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.341061 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.340532 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cdcd68c9d-hg6cm"
Apr 23 18:51:41.482918 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.482873 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cdcd68c9d-hg6cm"]
Apr 23 18:51:41.569166 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.569140 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-x57x8"
Apr 23 18:51:41.576201 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.576177 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cdcd68c9d-hg6cm" event={"ID":"bf897951-81b2-467e-ab20-c0b24b56c17b","Type":"ContainerStarted","Data":"24af2605d9b6c1a36b488fb881e85c5347c167fbb7eb7c65bf16104356b03616"}
Apr 23 18:51:41.576309 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.576211 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cdcd68c9d-hg6cm" event={"ID":"bf897951-81b2-467e-ab20-c0b24b56c17b","Type":"ContainerStarted","Data":"102e30e0043b3611b63a52d437086481d6009f134b9bb91de6a6f05f8c893f85"}
Apr 23 18:51:41.624833 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:41.624714 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cdcd68c9d-hg6cm" podStartSLOduration=0.624695259 podStartE2EDuration="624.695259ms" podCreationTimestamp="2026-04-23 18:51:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:51:41.62336216 +0000 UTC m=+3489.169337525" watchObservedRunningTime="2026-04-23 18:51:41.624695259 +0000 UTC m=+3489.170670626"
Apr 23 18:51:44.206567 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:44.206538 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-6wwq7_600a1b41-9d90-4293-9c00-69bd81f7363a/migrator/0.log"
Apr 23 18:51:44.232102 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:44.232028 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-6wwq7_600a1b41-9d90-4293-9c00-69bd81f7363a/graceful-termination/0.log"
Apr 23 18:51:44.591170 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:44.591103 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-k5xtc_c4825f9f-c326-4675-901b-635a9bb75ddc/kube-storage-version-migrator-operator/1.log"
Apr 23 18:51:44.592071 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:44.592054 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-k5xtc_c4825f9f-c326-4675-901b-635a9bb75ddc/kube-storage-version-migrator-operator/0.log"
Apr 23 18:51:45.751541 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:45.751512 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-59dv8_ea1db72a-418a-4e3b-89f7-818e445eed4f/kube-multus-additional-cni-plugins/0.log"
Apr 23 18:51:45.775089 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:45.775065 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-59dv8_ea1db72a-418a-4e3b-89f7-818e445eed4f/egress-router-binary-copy/0.log"
Apr 23 18:51:45.799463 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:45.799435 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-59dv8_ea1db72a-418a-4e3b-89f7-818e445eed4f/cni-plugins/0.log"
Apr 23 18:51:45.823384 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:45.823363 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-59dv8_ea1db72a-418a-4e3b-89f7-818e445eed4f/bond-cni-plugin/0.log"
Apr 23 18:51:45.845223 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:45.845201 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-59dv8_ea1db72a-418a-4e3b-89f7-818e445eed4f/routeoverride-cni/0.log"
Apr 23 18:51:45.866166 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:45.866145 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-59dv8_ea1db72a-418a-4e3b-89f7-818e445eed4f/whereabouts-cni-bincopy/0.log"
Apr 23 18:51:45.890290 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:45.890261 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-59dv8_ea1db72a-418a-4e3b-89f7-818e445eed4f/whereabouts-cni/0.log"
Apr 23 18:51:46.075775 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:46.075745 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tk5p8_33d2ad31-e97b-4649-8936-e50045eda195/kube-multus/0.log"
Apr 23 18:51:46.096014 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:46.095988 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6lhps_6201ae7f-dbb7-4347-a698-89a65766225e/network-metrics-daemon/0.log"
Apr 23 18:51:46.116034 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:46.116017 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6lhps_6201ae7f-dbb7-4347-a698-89a65766225e/kube-rbac-proxy/0.log"
Apr 23 18:51:46.922787 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:46.922755 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldzqx_e61666b2-6500-4374-a876-375fa31848c7/ovn-controller/0.log"
Apr 23 18:51:46.958677 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:46.958648 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldzqx_e61666b2-6500-4374-a876-375fa31848c7/ovn-acl-logging/0.log"
Apr 23 18:51:46.980332 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:46.980302 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldzqx_e61666b2-6500-4374-a876-375fa31848c7/kube-rbac-proxy-node/0.log"
Apr 23 18:51:47.000615 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:47.000590 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldzqx_e61666b2-6500-4374-a876-375fa31848c7/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 18:51:47.026532 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:47.026502 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldzqx_e61666b2-6500-4374-a876-375fa31848c7/northd/0.log"
Apr 23 18:51:47.062504 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:47.062477 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldzqx_e61666b2-6500-4374-a876-375fa31848c7/nbdb/0.log"
Apr 23 18:51:47.105174 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:47.105143 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldzqx_e61666b2-6500-4374-a876-375fa31848c7/sbdb/0.log"
Apr 23 18:51:47.223358 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:47.223294 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldzqx_e61666b2-6500-4374-a876-375fa31848c7/ovnkube-controller/0.log"
Apr 23 18:51:48.883739 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:48.883715 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bc6lb_51e88714-7c29-490d-b1b7-79b96331f10c/network-check-target-container/0.log"
Apr 23 18:51:49.806447 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:49.806419 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-n2jj4_617aedf6-9b98-49f4-9aaa-9499484faf5f/iptables-alerter/0.log"
Apr 23 18:51:50.494327 ip-10-0-143-131 kubenswrapper[2575]: I0423 18:51:50.494300 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-vvgm9_e107839a-5af3-4754-a936-a2da378bc464/tuned/0.log"