Apr 24 21:16:17.462700 ip-10-0-128-21 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:16:17.935435 ip-10-0-128-21 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:17.935435 ip-10-0-128-21 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:16:17.935435 ip-10-0-128-21 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:17.935435 ip-10-0-128-21 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:16:17.935435 ip-10-0-128-21 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
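The deprecation warnings above all point at the kubelet config file named by --config. A minimal sketch of how those flags could be carried in a KubeletConfiguration instead; the field values below are illustrative assumptions, not taken from this host:

```yaml
# Hypothetical /etc/kubernetes/kubelet.conf migration of the deprecated flags
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:                                            # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionSoft:                                              # per the warning, eviction settings
  memory.available: 500Mi                                  # supersede --minimum-container-ttl-duration
evictionSoftGracePeriod:
  memory.available: 1m30s
```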
Apr 24 21:16:17.939100 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.938995 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:16:17.944228 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944210 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:17.944228 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944227 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944232 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944235 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944239 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944242 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944245 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944248 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944250 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944253 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944257 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944260 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944263 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944266 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944269 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944272 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944275 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944278 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944281 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944283 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944292 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:17.944301 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944295 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944298 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944301 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944303 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944306 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944309 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944312 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944315 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944318 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944320 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944323 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944327 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944329 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944332 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944336 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944340 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944343 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944346 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944349 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:17.944786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944365 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944370 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944373 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944376 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944379 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944382 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944385 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944387 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944390 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944392 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944396 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944398 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944401 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944403 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944406 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944410 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944413 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944415 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944418 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944421 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:17.945250 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944424 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944426 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944429 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944431 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944435 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944438 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944443 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944445 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944448 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944450 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944453 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944455 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944458 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944460 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944463 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944466 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944468 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944471 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944473 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:17.945739 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944475 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944478 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944480 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944483 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944486 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944488 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944491 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944877 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944882 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944885 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944888 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944890 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944893 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944896 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944899 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944901 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944904 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944907 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944910 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:17.946180 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944916 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944921 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944925 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944928 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944931 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944934 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944936 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944939 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944941 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944944 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944947 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944950 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944953 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944955 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944958 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944961 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944964 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944966 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944969 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944972 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:17.946684 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944974 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944978 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944981 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944983 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944986 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.944990 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945003 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945007 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945009 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945012 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945014 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945017 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945019 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945022 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945025 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945027 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945030 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945032 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945035 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945038 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:17.947172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945041 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945043 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945046 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945048 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945050 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945053 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945056 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945058 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945060 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945063 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945065 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945068 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945070 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945073 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945075 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945078 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945082 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945086 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945088 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:17.947656 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945091 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945093 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945095 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945098 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945100 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945103 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945105 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945108 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945111 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945113 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945115 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945122 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945125 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945127 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.945130 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946430 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946444 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946452 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946457 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946462 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946465 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:16:17.948145 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946469 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946474 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946478 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946481 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946485 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946489 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946492 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946495 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946498 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946501 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946504 2573 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946506 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946509 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946514 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946516 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946519 2573 flags.go:64] FLAG: --config-dir=""
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946522 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946526 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946530 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946533 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946537 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946540 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946543 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946546 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:16:17.948670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946549 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946552 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946555 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946559 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946562 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946565 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946568 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946571 2573 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946574 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946579 2573 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946582 2573 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946586 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946589 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946591 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946596 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946598 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946602 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946604 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946607 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946610 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946613 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946615 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946619 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946621 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946624 2573 flags.go:64] FLAG: --feature-gates=""
Apr 24 21:16:17.949228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946628 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946631 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946634 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946638 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946641 2573 flags.go:64] FLAG: --healthz-port="10248"
Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946644 2573 flags.go:64] FLAG: --help="false"
Apr 24 21:16:17.949909
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946648 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-128-21.ec2.internal" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946651 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946654 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946657 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946660 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946664 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946667 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946670 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946673 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946675 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946678 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946681 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946684 2573 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946687 2573 flags.go:64] 
FLAG: --kube-reserved-cgroup="" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946690 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946693 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946696 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946699 2573 flags.go:64] FLAG: --lock-file="" Apr 24 21:16:17.949909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946702 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946705 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946708 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946713 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946716 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946719 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946721 2573 flags.go:64] FLAG: --logging-format="text" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946724 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946728 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946730 2573 flags.go:64] FLAG: --manifest-url="" Apr 24 21:16:17.950496 ip-10-0-128-21 
kubenswrapper[2573]: I0424 21:16:17.946733 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946738 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946741 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946746 2573 flags.go:64] FLAG: --max-pods="110" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946749 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946752 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946755 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946758 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946761 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946764 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946767 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946775 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946778 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946781 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:16:17.950496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946784 2573 
flags.go:64] FLAG: --pod-cidr="" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946787 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946793 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946797 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946800 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946803 2573 flags.go:64] FLAG: --port="10250" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946806 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946809 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-080dfee6f43559ae7" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946812 2573 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946815 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946818 2573 flags.go:64] FLAG: --register-node="true" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946821 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946824 2573 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946828 2573 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946831 2573 flags.go:64] FLAG: --registry-qps="5" Apr 24 
21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946834 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946836 2573 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946840 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946843 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946846 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946848 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946852 2573 flags.go:64] FLAG: --runonce="false" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946855 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946858 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946861 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:16:17.951058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946864 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946867 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946870 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946874 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946877 2573 flags.go:64] 
FLAG: --storage-driver-password="root" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946880 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946883 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946885 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946888 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946891 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946894 2573 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946897 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946902 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946905 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946908 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946912 2573 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946915 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946918 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946920 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: 
I0424 21:16:17.946923 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946927 2573 flags.go:64] FLAG: --v="2" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946932 2573 flags.go:64] FLAG: --version="false" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946936 2573 flags.go:64] FLAG: --vmodule="" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946940 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.946943 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:16:17.951704 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947040 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947043 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947046 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947049 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947055 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947058 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947061 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947063 2573 feature_gate.go:328] unrecognized feature gate: 
NetworkDiagnosticsConfig Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947066 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947068 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947071 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947073 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947076 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947078 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947081 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947084 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947086 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947089 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947091 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947094 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:16:17.952345 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947096 2573 feature_gate.go:328] unrecognized feature 
gate: GCPCustomAPIEndpoints Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947100 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947102 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947105 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947107 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947110 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947113 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947115 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947118 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947120 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947123 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947125 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947128 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947130 2573 feature_gate.go:328] 
unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947133 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947135 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947140 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947143 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947145 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947148 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:16:17.952893 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947150 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947152 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947157 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947159 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947162 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947165 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947167 2573 
feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947170 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947172 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947175 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947177 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947180 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947182 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947185 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947188 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947190 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947193 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947195 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947198 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:16:17.953407 
ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947200 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:16:17.953407 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947202 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947205 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947207 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947209 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947212 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947214 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947216 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947219 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947223 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947226 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947229 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947231 2573 
feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947234 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947236 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947241 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947243 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947245 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947249 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947253 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:16:17.953876 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947256 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:16:17.954334 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947259 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:16:17.954334 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947262 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:16:17.954334 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947264 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:16:17.954334 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947267 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:16:17.954334 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947269 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:16:17.954334 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.947273 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:16:17.954334 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.947282 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:17.954650 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.954627 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:16:17.954683 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.954651 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:16:17.954713 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954700 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:17.954713 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954706 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:17.954713 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954709 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:17.954713 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954713 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954716 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954720 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954723 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954726 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954728 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954732 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954735 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954737 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954740 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954744 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954747 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954749 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954752 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954754 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954757 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954759 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954762 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954764 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954766 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:17.954807 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954769 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954772 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954775 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954777 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954779 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954782 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954785 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954788 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954790 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954793 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954795 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954798 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954801 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954803 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954807 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954810 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954813 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954816 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954819 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954822 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:17.955314 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954824 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954827 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954830 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954833 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954836 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954838 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954841 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954843 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954846 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954848 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954850 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954853 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954856 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954858 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954861 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954863 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954866 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954870 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954874 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:17.955830 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954876 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954879 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954882 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954884 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954886 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954889 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954891 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954895 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954898 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954900 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954903 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954905 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954908 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954910 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954912 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954915 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954917 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954920 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954924 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:17.956284 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954927 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954930 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954932 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954935 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.954937 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.954942 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955041 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955046 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955049 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955052 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955055 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955057 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955060 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955063 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955066 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955068 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:17.956773 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955071 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955073 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955076 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955078 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955081 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955085 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955089 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955092 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955095 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955098 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955101 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955104 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955106 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955109 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955112 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955115 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955118 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955121 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955123 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:17.957176 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955126 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955128 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955131 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955134 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955136 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955139 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955141 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955144 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955146 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955149 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955151 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955153 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955156 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955158 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955160 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955163 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955166 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955168 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955171 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955174 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:17.957648 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955176 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955179 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955182 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955185 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955188 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955190 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955193 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955195 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955198 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955201 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955203 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955205 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955208 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955210 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955213 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955215 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955218 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955220 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955222 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:17.958148 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955225 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955228 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955230 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955233 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955235 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955238 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955240 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955243 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955245 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955248 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955250 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955253 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955255 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955258 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955260 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955263 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955265 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:17.958625 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:17.955267 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:17.959039 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.955272 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:17.959039 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.956003 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:16:17.959039 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.957984 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:16:17.959039 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.958933 2573 server.go:1019] "Starting client certificate rotation"
Apr 24 21:16:17.959158 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.959027 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:16:17.959158 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.959072 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:16:17.987979 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.987954 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:16:17.995856 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:17.995829 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:16:18.010724 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.010698 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:16:18.017655 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.017631 2573 log.go:25] "Validated CRI v1 image API"
Apr 24 21:16:18.019340 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.019309 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:16:18.019830 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.019811 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:16:18.023996 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.023964 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 d1655536-c2b9-4d1c-8ea6-8e77e4b0b861:/dev/nvme0n1p4 d7c89fa8-0500-4dc2-a439-a8add1fb13b5:/dev/nvme0n1p3]
Apr 24 21:16:18.024073 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.023995 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:16:18.030465 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.030326 2573 manager.go:217] Machine: {Timestamp:2026-04-24 21:16:18.028479064 +0000 UTC m=+0.434320507 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3201073 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d38e79c2467d80dab08283c78b0f8 SystemUUID:ec2d38e7-9c24-67d8-0dab-08283c78b0f8 BootID:8dfcc75f-7eb3-4042-87cc-d9b99a1e3cdb Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a4:e3:9b:4d:23 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a4:e3:9b:4d:23 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:0b:75:30:29:35 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:16:18.030465 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.030457 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:16:18.030584 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.030548 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:16:18.032582 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.032556 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:16:18.032736 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.032584 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-21.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:16:18.032782 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.032743 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:16:18.032782 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.032752 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:16:18.032782 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.032766 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:16:18.033496 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.033484 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:16:18.035017 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.035005 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:16:18.035311 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.035302 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 21:16:18.038090 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.038078 2573 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 21:16:18.038139 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.038096 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 21:16:18.038139 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.038114 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 21:16:18.038139 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.038123 2573 kubelet.go:397] "Adding apiserver pod source"
Apr 24 21:16:18.038211 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.038140 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 21:16:18.039487 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.039474 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:16:18.039554 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.039491 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:16:18.042874 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.042860 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 21:16:18.047626 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.047519 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 21:16:18.049726 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.049629 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 21:16:18.049819 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.049773 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 21:16:18.049819 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.049798 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 21:16:18.049908 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.049825 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 21:16:18.050031 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.050013 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 21:16:18.050085 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.050038 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 21:16:18.050085 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.050048 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 21:16:18.050085 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.050057 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 21:16:18.050085 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.050070 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 21:16:18.050085 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.050079 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 21:16:18.050269 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.050108 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 21:16:18.050269 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.050125 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 21:16:18.051979 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.051965 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 21:16:18.052043 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.051983 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 21:16:18.052722 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.052696 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-21.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 21:16:18.052722 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.052692 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 21:16:18.055819 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.055804 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 21:16:18.055900 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.055843 2573 server.go:1295] "Started kubelet"
Apr 24 21:16:18.055983 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.055938 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 21:16:18.056036 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.055986 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 21:16:18.056084 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.056057 2573 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 21:16:18.056816 ip-10-0-128-21 systemd[1]: Started Kubernetes Kubelet.
Apr 24 21:16:18.057870 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.057766 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 21:16:18.058653 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.058635 2573 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 21:16:18.064069 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.064051 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 21:16:18.064500 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.064479 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-21.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:16:18.064594 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.064574 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 21:16:18.065379 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.065351 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 21:16:18.065379 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.065366 2573 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 21:16:18.065508 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.065385 2573 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 21:16:18.065508 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.065485 2573 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 21:16:18.065508 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.065490 2573 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 21:16:18.065679 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.065662 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-21.ec2.internal\" not found"
Apr 24 21:16:18.065877 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.064492 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-21.ec2.internal.18a9678bc9a985b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-21.ec2.internal,UID:ip-10-0-128-21.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-21.ec2.internal,},FirstTimestamp:2026-04-24 21:16:18.055816632 +0000 UTC m=+0.461658075,LastTimestamp:2026-04-24 21:16:18.055816632 +0000 UTC m=+0.461658075,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-21.ec2.internal,}"
Apr 24 21:16:18.066866 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.066852 2573 factory.go:55] Registering systemd factory
Apr 24 21:16:18.066946 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.066885 2573 factory.go:223] Registration of the systemd container factory successfully
Apr 24 21:16:18.067134 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.067120 2573 factory.go:153] Registering CRI-O factory
Apr 24 21:16:18.067185 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.067137 2573 factory.go:223] Registration of the crio container factory successfully
Apr 24 21:16:18.067228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.067183 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 21:16:18.067228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.067206 2573 factory.go:103] Registering Raw factory
Apr 24 21:16:18.067228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.067219 2573 manager.go:1196] Started watching for new ooms in manager
Apr 24 21:16:18.067393 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.067264 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cjvrt"
Apr 24 21:16:18.067554 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.067528 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-21.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 24 21:16:18.067641 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.067627 2573 manager.go:319] Starting recovery of all containers
Apr 24 21:16:18.067914 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.067887 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 21:16:18.067914 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.067885 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 21:16:18.073231 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.073174 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 21:16:18.077021 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.076827 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cjvrt"
Apr 24 21:16:18.080802 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.080785 2573 manager.go:324] Recovery completed
Apr 24 21:16:18.085968 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.085948 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:18.088540 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.088521 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:18.088631 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.088551 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:18.088631 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.088582 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:18.089131 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.089119 2573 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 21:16:18.089131 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.089130 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 21:16:18.089198 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.089146 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:16:18.090553 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.090483 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-21.ec2.internal.18a9678bcb9cd18e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-21.ec2.internal,UID:ip-10-0-128-21.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-21.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-21.ec2.internal,},FirstTimestamp:2026-04-24 21:16:18.08853851 +0000 UTC m=+0.494379952,LastTimestamp:2026-04-24 21:16:18.08853851 +0000 UTC m=+0.494379952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-21.ec2.internal,}"
Apr 24 21:16:18.091459 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.091447 2573 policy_none.go:49] "None policy: Start"
Apr 24 21:16:18.091499 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.091462 2573 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 21:16:18.091499 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.091472 2573 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 21:16:18.135129 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.135111 2573 manager.go:341] "Starting Device Plugin manager"
Apr 24 21:16:18.142124 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.135150 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 21:16:18.142124 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.135165 2573 server.go:85] "Starting device plugin registration server"
Apr 24 21:16:18.142124 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.135570 2573 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 21:16:18.142124 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.135583 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 21:16:18.142124 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.135685 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 21:16:18.142124 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.135939 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 21:16:18.142124 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.135950 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 21:16:18.142124 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.136375 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 21:16:18.142124 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.136409 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-21.ec2.internal\" not found"
Apr 24 21:16:18.205812 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.205730 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 21:16:18.205812 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.205764 2573 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 21:16:18.205812 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.205784 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 21:16:18.205812 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.205791 2573 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 21:16:18.206068 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.205823 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 21:16:18.208499 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.208477 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:18.236736 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.236706 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:18.238025 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.238009 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:18.238084 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.238042 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:18.238084 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.238058 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:18.238145 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.238088 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.246994 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.246975 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.247047 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.247000 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-21.ec2.internal\": node \"ip-10-0-128-21.ec2.internal\" not found"
Apr 24 21:16:18.267530 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.267506 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-21.ec2.internal\" not found"
Apr 24 21:16:18.306407 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.306343 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-128-21.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal"]
Apr 24 21:16:18.306473 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.306457 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:18.308086 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.308070 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:18.308143 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.308104 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:18.308143 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.308118 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:18.309790 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.309776 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:18.309917 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.309900 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.309952 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.309939 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:18.310597 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.310578 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:18.310648 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.310611 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:18.310648 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.310624 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:18.310648 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.310634 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:18.310751 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.310658 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:18.310751 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.310668 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:18.311828 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.311813 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.311920 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.311839 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:18.312526 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.312510 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:18.312590 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.312538 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:18.312590 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.312552 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:18.326998 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.326968 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-21.ec2.internal\" not found" node="ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.330896 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.330878 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-21.ec2.internal\" not found" node="ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.366606 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.366571 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fd8ac9133f3c42502b0cb4b65d236ac9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal\" (UID: \"fd8ac9133f3c42502b0cb4b65d236ac9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.366719 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.366624 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd8ac9133f3c42502b0cb4b65d236ac9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal\" (UID: \"fd8ac9133f3c42502b0cb4b65d236ac9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.366719 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.366650 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46129ef0396a1fbd001318ae09f161a9-config\") pod \"kube-apiserver-proxy-ip-10-0-128-21.ec2.internal\" (UID: \"46129ef0396a1fbd001318ae09f161a9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.367601 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.367574 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-21.ec2.internal\" not found"
Apr 24 21:16:18.467537 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.467478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fd8ac9133f3c42502b0cb4b65d236ac9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal\" (UID: \"fd8ac9133f3c42502b0cb4b65d236ac9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.467537 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.467510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd8ac9133f3c42502b0cb4b65d236ac9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal\" (UID: \"fd8ac9133f3c42502b0cb4b65d236ac9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.467537 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.467529 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46129ef0396a1fbd001318ae09f161a9-config\") pod \"kube-apiserver-proxy-ip-10-0-128-21.ec2.internal\" (UID: \"46129ef0396a1fbd001318ae09f161a9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.467669 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.467572 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fd8ac9133f3c42502b0cb4b65d236ac9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal\" (UID: \"fd8ac9133f3c42502b0cb4b65d236ac9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.467669 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.467624 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-21.ec2.internal\" not found"
Apr 24 21:16:18.467669 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.467632 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd8ac9133f3c42502b0cb4b65d236ac9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal\" (UID: \"fd8ac9133f3c42502b0cb4b65d236ac9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.467669 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.467659 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46129ef0396a1fbd001318ae09f161a9-config\") pod \"kube-apiserver-proxy-ip-10-0-128-21.ec2.internal\" (UID: \"46129ef0396a1fbd001318ae09f161a9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.568188 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.568152 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-21.ec2.internal\" not found"
Apr 24 21:16:18.629460 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.629433 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.633931 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.633908 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:18.668738 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.668686 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-21.ec2.internal\" not found"
Apr 24 21:16:18.769063 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.769035 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-21.ec2.internal\" not found"
Apr 24 21:16:18.869504 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.869466 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-21.ec2.internal\" not found"
Apr 24 21:16:18.958820 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.958782 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:16:18.959433 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:18.958952 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:16:18.970220 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:18.970189 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-21.ec2.internal\" not found"
Apr 24 21:16:19.064943 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.064919 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:16:19.070287 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:19.070260 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-21.ec2.internal\" not found"
Apr 24 21:16:19.079629 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.079603 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:19.080323 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.080292 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:11:18 +0000 UTC" deadline="2027-10-30 10:57:27.55724226 +0000 UTC"
Apr 24 21:16:19.080396 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.080323 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13285h41m8.476921492s"
Apr 24 21:16:19.082028 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.082007 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:16:19.103664 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.103636 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:19.106172 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.106154 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qrq5c"
Apr 24 21:16:19.112106 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.112081 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qrq5c"
Apr 24 21:16:19.160673 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:19.160628 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd8ac9133f3c42502b0cb4b65d236ac9.slice/crio-78653563fd1772bc60cce67d450f8cb321403ce8e1d6384c932b2a16c20a17ac WatchSource:0}: Error finding container 78653563fd1772bc60cce67d450f8cb321403ce8e1d6384c932b2a16c20a17ac: Status 404 returned error can't find the container with id 78653563fd1772bc60cce67d450f8cb321403ce8e1d6384c932b2a16c20a17ac
Apr 24 21:16:19.165192 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.165142 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:19.165253 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.165221 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:16:19.180102 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.180078 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:16:19.182198 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.182176 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-21.ec2.internal"
Apr 24 21:16:19.191873 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.191854 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:16:19.209009 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.208935 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="kube-system/kube-apiserver-proxy-ip-10-0-128-21.ec2.internal" event={"ID":"46129ef0396a1fbd001318ae09f161a9","Type":"ContainerStarted","Data":"808b1afbdd26918ca280c6b0aeb6e2bd0b2e572c861ad7488438131024bf3b74"} Apr 24 21:16:19.209879 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.209857 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal" event={"ID":"fd8ac9133f3c42502b0cb4b65d236ac9","Type":"ContainerStarted","Data":"78653563fd1772bc60cce67d450f8cb321403ce8e1d6384c932b2a16c20a17ac"} Apr 24 21:16:19.623167 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.623116 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:19.867749 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:19.867704 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:20.039285 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.039252 2573 apiserver.go:52] "Watching apiserver" Apr 24 21:16:20.047390 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.047350 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:16:20.047730 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.047704 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-mp2nj","openshift-ovn-kubernetes/ovnkube-node-rz2jk","openshift-dns/node-resolver-cstvp","openshift-image-registry/node-ca-9crbq","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal","openshift-multus/multus-additional-cni-plugins-cph25","openshift-multus/network-metrics-daemon-h5m79","openshift-network-diagnostics/network-check-target-tzpnt","openshift-network-operator/iptables-alerter-m7gwx","kube-system/konnectivity-agent-ln7wc","kube-system/kube-apiserver-proxy-ip-10-0-128-21.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw","openshift-cluster-node-tuning-operator/tuned-5pbst"] Apr 24 21:16:20.049877 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.049856 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:20.049980 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:20.049944 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4" Apr 24 21:16:20.051107 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.051087 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.052174 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.052156 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-cstvp" Apr 24 21:16:20.053265 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.053237 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:16:20.053374 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.053335 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9crbq" Apr 24 21:16:20.054261 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.054239 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:16:20.054261 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.054254 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:16:20.054426 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.054287 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:16:20.054426 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.054307 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vhv4z\"" Apr 24 21:16:20.054426 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.054243 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:16:20.054569 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.054536 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:16:20.054620 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.054594 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 
21:16:20.054702 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.054685 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:16:20.054783 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.054768 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wxxgs\"" Apr 24 21:16:20.055444 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.055424 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:16:20.055528 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.055449 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qxxrn\"" Apr 24 21:16:20.055528 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.055467 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:16:20.055528 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.055512 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:16:20.056435 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.056414 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.057820 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.057805 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.057911 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.057883 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt" Apr 24 21:16:20.057985 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:20.057954 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4" Apr 24 21:16:20.058338 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.058298 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:16:20.058338 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.058318 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:16:20.058532 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.058507 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vddq6\"" Apr 24 21:16:20.058640 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.058589 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:16:20.058640 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.058615 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:16:20.059168 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.059152 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:16:20.059424 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.059404 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-m7gwx" Apr 24 21:16:20.059869 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.059850 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:16:20.059962 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.059935 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-st5tl\"" Apr 24 21:16:20.061062 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.061042 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ln7wc" Apr 24 21:16:20.061207 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.061187 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:20.061539 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.061517 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hqqq9\"" Apr 24 21:16:20.061622 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.061604 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:20.061738 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.061660 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:16:20.062555 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.062533 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.062941 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.062923 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:16:20.063174 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.063159 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:16:20.063774 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.063679 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-d98d6\"" Apr 24 21:16:20.066186 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.064334 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.066186 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.064337 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rcvqf\"" Apr 24 21:16:20.066186 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.065254 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:16:20.066186 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.065905 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:16:20.066186 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.066131 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:16:20.066870 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.066572 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:20.067495 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.067311 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:16:20.067495 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.067445 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8ldq2\"" Apr 24 21:16:20.067635 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.067501 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:20.075815 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.075774 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-run-ovn\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.075926 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.075828 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-tmp\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.075926 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.075854 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-kubelet\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.075926 ip-10-0-128-21 
kubenswrapper[2573]: I0424 21:16:20.075877 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-run-netns\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.075926 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.075900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-run-systemd\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.075926 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.075922 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-os-release\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.076164 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.075948 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbtkj\" (UniqueName: \"kubernetes.io/projected/9835b143-e40c-4455-9924-5824b457a60a-kube-api-access-bbtkj\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.076164 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.075972 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-run-openvswitch\") pod \"ovnkube-node-rz2jk\" (UID: 
\"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.076164 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.075997 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/239caad5-0402-47f0-8e15-7f5d02343638-cni-binary-copy\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.076164 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076020 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/239caad5-0402-47f0-8e15-7f5d02343638-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.076164 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076092 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m424d\" (UniqueName: \"kubernetes.io/projected/239caad5-0402-47f0-8e15-7f5d02343638-kube-api-access-m424d\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.076164 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-cnibin\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.076164 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076162 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-var-lib-kubelet\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.076512 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076185 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-hostroot\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.076512 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076206 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-multus-conf-dir\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.076512 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-lib-modules\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.076512 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076252 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-multus-socket-dir-parent\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.076512 ip-10-0-128-21 kubenswrapper[2573]: I0424 
21:16:20.076288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/31fbbb71-5394-4f60-8de2-cc5dc970ab35-hosts-file\") pod \"node-resolver-cstvp\" (UID: \"31fbbb71-5394-4f60-8de2-cc5dc970ab35\") " pod="openshift-dns/node-resolver-cstvp" Apr 24 21:16:20.076512 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076327 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea-konnectivity-ca\") pod \"konnectivity-agent-ln7wc\" (UID: \"7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea\") " pod="kube-system/konnectivity-agent-ln7wc" Apr 24 21:16:20.076512 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076348 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-sysconfig\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.076512 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076409 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/abb55075-ce71-45a8-8ef8-400976104389-env-overrides\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.076512 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076442 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz5wn\" (UniqueName: \"kubernetes.io/projected/88a73d58-a99e-49c1-9821-a06593a8b35e-kube-api-access-wz5wn\") pod \"node-ca-9crbq\" (UID: \"88a73d58-a99e-49c1-9821-a06593a8b35e\") " 
pod="openshift-image-registry/node-ca-9crbq" Apr 24 21:16:20.076512 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076474 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-sys\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.076512 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076499 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-slash\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076518 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-cni-bin\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076553 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076593 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/abb55075-ce71-45a8-8ef8-400976104389-ovnkube-config\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076655 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l88d6\" (UniqueName: \"kubernetes.io/projected/abb55075-ce71-45a8-8ef8-400976104389-kube-api-access-l88d6\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076711 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea-agent-certs\") pod \"konnectivity-agent-ln7wc\" (UID: \"7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea\") " pod="kube-system/konnectivity-agent-ln7wc" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076755 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-var-lib-kubelet\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076777 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-log-socket\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076798 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-run-k8s-cni-cncf-io\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076824 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd5pw\" (UniqueName: \"kubernetes.io/projected/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-kube-api-access-rd5pw\") pod \"network-metrics-daemon-h5m79\" (UID: \"a6ad0fc1-fbd1-4133-8616-3b950995f8e4\") " pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076846 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-systemd-units\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076869 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-etc-openvswitch\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076912 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/239caad5-0402-47f0-8e15-7f5d02343638-cnibin\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " 
pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076943 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-cni-binary-copy\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.076981 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-run-multus-certs\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.077016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077006 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-node-log\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077041 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/880ca20b-7732-4709-9f0a-9013465ca003-host-slash\") pod \"iptables-alerter-m7gwx\" (UID: \"880ca20b-7732-4709-9f0a-9013465ca003\") " pod="openshift-network-operator/iptables-alerter-m7gwx" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077064 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs\") 
pod \"network-metrics-daemon-h5m79\" (UID: \"a6ad0fc1-fbd1-4133-8616-3b950995f8e4\") " pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077089 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-device-dir\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077125 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-modprobe-d\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077149 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xfw9\" (UniqueName: \"kubernetes.io/projected/31fbbb71-5394-4f60-8de2-cc5dc970ab35-kube-api-access-2xfw9\") pod \"node-resolver-cstvp\" (UID: \"31fbbb71-5394-4f60-8de2-cc5dc970ab35\") " pod="openshift-dns/node-resolver-cstvp" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077171 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-etc-kubernetes\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077195 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-sysctl-conf\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077225 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-run\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077259 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/239caad5-0402-47f0-8e15-7f5d02343638-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077286 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-multus-daemon-config\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077310 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg9bj\" (UniqueName: \"kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj\") pod \"network-check-target-tzpnt\" (UID: \"89ab8923-5f3a-4535-9d3f-e72f739904d4\") " pod="openshift-network-diagnostics/network-check-target-tzpnt" 
Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077337 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsvwr\" (UniqueName: \"kubernetes.io/projected/880ca20b-7732-4709-9f0a-9013465ca003-kube-api-access-xsvwr\") pod \"iptables-alerter-m7gwx\" (UID: \"880ca20b-7732-4709-9f0a-9013465ca003\") " pod="openshift-network-operator/iptables-alerter-m7gwx" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077370 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88a73d58-a99e-49c1-9821-a06593a8b35e-host\") pod \"node-ca-9crbq\" (UID: \"88a73d58-a99e-49c1-9821-a06593a8b35e\") " pod="openshift-image-registry/node-ca-9crbq" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077389 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-sysctl-d\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077403 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-systemd\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.077706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077416 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-host\") pod \"tuned-5pbst\" (UID: 
\"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077447 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-tuned\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077465 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-system-cni-dir\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077480 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-var-lib-cni-multus\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077500 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077539 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-registration-dir\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077582 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzv8n\" (UniqueName: \"kubernetes.io/projected/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-kube-api-access-dzv8n\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-run-ovn-kubernetes\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/abb55075-ce71-45a8-8ef8-400976104389-ovn-node-metrics-cert\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077649 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/abb55075-ce71-45a8-8ef8-400976104389-ovnkube-script-lib\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.078439 
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077663 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/239caad5-0402-47f0-8e15-7f5d02343638-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077688 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-run-netns\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077741 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/88a73d58-a99e-49c1-9821-a06593a8b35e-serviceca\") pod \"node-ca-9crbq\" (UID: \"88a73d58-a99e-49c1-9821-a06593a8b35e\") " pod="openshift-image-registry/node-ca-9crbq" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077757 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-cni-netd\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077775 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31fbbb71-5394-4f60-8de2-cc5dc970ab35-tmp-dir\") pod \"node-resolver-cstvp\" (UID: 
\"31fbbb71-5394-4f60-8de2-cc5dc970ab35\") " pod="openshift-dns/node-resolver-cstvp" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077797 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/239caad5-0402-47f0-8e15-7f5d02343638-system-cni-dir\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.078439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077829 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/880ca20b-7732-4709-9f0a-9013465ca003-iptables-alerter-script\") pod \"iptables-alerter-m7gwx\" (UID: \"880ca20b-7732-4709-9f0a-9013465ca003\") " pod="openshift-network-operator/iptables-alerter-m7gwx" Apr 24 21:16:20.079139 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077864 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-etc-selinux\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.079139 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077894 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-sys-fs\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.079139 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077910 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-var-lib-cni-bin\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.079139 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077924 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjzf2\" (UniqueName: \"kubernetes.io/projected/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-kube-api-access-rjzf2\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.079139 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077937 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-socket-dir\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.079139 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077954 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-kubernetes\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.079139 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.077985 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-var-lib-openvswitch\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.079139 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.078011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/239caad5-0402-47f0-8e15-7f5d02343638-os-release\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.079139 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.078039 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-multus-cni-dir\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.112800 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.112742 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:11:19 +0000 UTC" deadline="2027-11-02 07:22:33.732936313 +0000 UTC" Apr 24 21:16:20.112800 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.112772 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13354h6m13.6201679s" Apr 24 21:16:20.178520 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178491 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-etc-openvswitch\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.178701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178531 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/239caad5-0402-47f0-8e15-7f5d02343638-cnibin\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.178701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178585 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/239caad5-0402-47f0-8e15-7f5d02343638-cnibin\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.178701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-cni-binary-copy\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.178701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-run-multus-certs\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.178701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178630 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-etc-openvswitch\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.178701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178650 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-node-log\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.178701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178673 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/880ca20b-7732-4709-9f0a-9013465ca003-host-slash\") pod \"iptables-alerter-m7gwx\" (UID: \"880ca20b-7732-4709-9f0a-9013465ca003\") " pod="openshift-network-operator/iptables-alerter-m7gwx" Apr 24 21:16:20.178701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178675 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-run-multus-certs\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.178701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178691 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs\") pod \"network-metrics-daemon-h5m79\" (UID: \"a6ad0fc1-fbd1-4133-8616-3b950995f8e4\") " pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:20.178701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178700 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-node-log\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178708 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-device-dir\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178727 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/880ca20b-7732-4709-9f0a-9013465ca003-host-slash\") pod \"iptables-alerter-m7gwx\" (UID: \"880ca20b-7732-4709-9f0a-9013465ca003\") " pod="openshift-network-operator/iptables-alerter-m7gwx" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178730 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-modprobe-d\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-device-dir\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xfw9\" (UniqueName: \"kubernetes.io/projected/31fbbb71-5394-4f60-8de2-cc5dc970ab35-kube-api-access-2xfw9\") pod \"node-resolver-cstvp\" (UID: \"31fbbb71-5394-4f60-8de2-cc5dc970ab35\") " pod="openshift-dns/node-resolver-cstvp" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178852 2573 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-etc-kubernetes\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:20.178857 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178854 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-modprobe-d\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178880 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-sysctl-conf\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178908 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-etc-kubernetes\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178908 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-run\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " 
pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178953 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-run\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.178993 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/239caad5-0402-47f0-8e15-7f5d02343638-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179045 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-sysctl-conf\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-multus-daemon-config\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179088 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9bj\" (UniqueName: \"kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj\") pod 
\"network-check-target-tzpnt\" (UID: \"89ab8923-5f3a-4535-9d3f-e72f739904d4\") " pod="openshift-network-diagnostics/network-check-target-tzpnt" Apr 24 21:16:20.179114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179116 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xsvwr\" (UniqueName: \"kubernetes.io/projected/880ca20b-7732-4709-9f0a-9013465ca003-kube-api-access-xsvwr\") pod \"iptables-alerter-m7gwx\" (UID: \"880ca20b-7732-4709-9f0a-9013465ca003\") " pod="openshift-network-operator/iptables-alerter-m7gwx" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:20.179152 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs podName:a6ad0fc1-fbd1-4133-8616-3b950995f8e4 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:20.679103781 +0000 UTC m=+3.084945228 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs") pod "network-metrics-daemon-h5m79" (UID: "a6ad0fc1-fbd1-4133-8616-3b950995f8e4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179221 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88a73d58-a99e-49c1-9821-a06593a8b35e-host\") pod \"node-ca-9crbq\" (UID: \"88a73d58-a99e-49c1-9821-a06593a8b35e\") " pod="openshift-image-registry/node-ca-9crbq" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179316 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-sysctl-d\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " 
pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179416 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-cni-binary-copy\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-systemd\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179561 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88a73d58-a99e-49c1-9821-a06593a8b35e-host\") pod \"node-ca-9crbq\" (UID: \"88a73d58-a99e-49c1-9821-a06593a8b35e\") " pod="openshift-image-registry/node-ca-9crbq" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179608 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-host\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179629 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-systemd\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.179898 
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-tuned\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179661 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-multus-daemon-config\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-sysctl-d\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179704 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-system-cni-dir\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-var-lib-cni-multus\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 
21:16:20.179745 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/239caad5-0402-47f0-8e15-7f5d02343638-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-host\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.179898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179799 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-var-lib-cni-multus\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179840 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-registration-dir\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179870 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzv8n\" (UniqueName: \"kubernetes.io/projected/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-kube-api-access-dzv8n\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179851 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-system-cni-dir\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179891 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-registration-dir\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179899 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-run-ovn-kubernetes\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179942 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/abb55075-ce71-45a8-8ef8-400976104389-ovn-node-metrics-cert\") pod 
\"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179973 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-run-ovn-kubernetes\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/abb55075-ce71-45a8-8ef8-400976104389-ovnkube-script-lib\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180010 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/239caad5-0402-47f0-8e15-7f5d02343638-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180192 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/239caad5-0402-47f0-8e15-7f5d02343638-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180226 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-run-netns\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/88a73d58-a99e-49c1-9821-a06593a8b35e-serviceca\") pod \"node-ca-9crbq\" (UID: \"88a73d58-a99e-49c1-9821-a06593a8b35e\") " pod="openshift-image-registry/node-ca-9crbq" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180292 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-cni-netd\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180306 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-run-netns\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180316 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31fbbb71-5394-4f60-8de2-cc5dc970ab35-tmp-dir\") pod \"node-resolver-cstvp\" (UID: \"31fbbb71-5394-4f60-8de2-cc5dc970ab35\") " pod="openshift-dns/node-resolver-cstvp" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180250 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:16:20.180728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180370 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/239caad5-0402-47f0-8e15-7f5d02343638-system-cni-dir\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.179884 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180403 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/880ca20b-7732-4709-9f0a-9013465ca003-iptables-alerter-script\") pod \"iptables-alerter-m7gwx\" (UID: \"880ca20b-7732-4709-9f0a-9013465ca003\") " pod="openshift-network-operator/iptables-alerter-m7gwx" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180429 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-etc-selinux\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180447 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-cni-netd\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180454 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-sys-fs\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180482 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-var-lib-cni-bin\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180490 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/239caad5-0402-47f0-8e15-7f5d02343638-system-cni-dir\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180507 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjzf2\" (UniqueName: \"kubernetes.io/projected/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-kube-api-access-rjzf2\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180552 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-socket-dir\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180576 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-kubernetes\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180610 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-var-lib-openvswitch\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180648 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/239caad5-0402-47f0-8e15-7f5d02343638-os-release\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180675 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-multus-cni-dir\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180702 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-run-ovn\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180727 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-tmp\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180733 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/88a73d58-a99e-49c1-9821-a06593a8b35e-serviceca\") pod \"node-ca-9crbq\" (UID: \"88a73d58-a99e-49c1-9821-a06593a8b35e\") " pod="openshift-image-registry/node-ca-9crbq" Apr 24 21:16:20.181484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-kubelet\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180783 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-run-netns\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180806 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-run-systemd\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180847 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-socket-dir\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180856 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-run-systemd\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180887 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-var-lib-cni-bin\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180887 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-sys-fs\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180901 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-kubernetes\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180910 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-os-release\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180913 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-run-ovn\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbtkj\" (UniqueName: \"kubernetes.io/projected/9835b143-e40c-4455-9924-5824b457a60a-kube-api-access-bbtkj\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.180974 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-run-openvswitch\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181000 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/239caad5-0402-47f0-8e15-7f5d02343638-cni-binary-copy\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181049 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/239caad5-0402-47f0-8e15-7f5d02343638-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181036 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/239caad5-0402-47f0-8e15-7f5d02343638-os-release\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181076 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m424d\" (UniqueName: \"kubernetes.io/projected/239caad5-0402-47f0-8e15-7f5d02343638-kube-api-access-m424d\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181105 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-multus-cni-dir\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.182233 ip-10-0-128-21 kubenswrapper[2573]: 
I0424 21:16:20.181123 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-cnibin\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181148 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-var-lib-kubelet\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-run-netns\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181155 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-kubelet\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-hostroot\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181191 2573 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-run-openvswitch\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181204 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-hostroot\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181208 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-multus-conf-dir\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181253 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-multus-conf-dir\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181324 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-os-release\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-cnibin\") pod 
\"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181468 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31fbbb71-5394-4f60-8de2-cc5dc970ab35-tmp-dir\") pod \"node-resolver-cstvp\" (UID: \"31fbbb71-5394-4f60-8de2-cc5dc970ab35\") " pod="openshift-dns/node-resolver-cstvp" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181537 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-var-lib-kubelet\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181599 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9835b143-e40c-4455-9924-5824b457a60a-etc-selinux\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-var-lib-openvswitch\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181691 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/880ca20b-7732-4709-9f0a-9013465ca003-iptables-alerter-script\") pod \"iptables-alerter-m7gwx\" 
(UID: \"880ca20b-7732-4709-9f0a-9013465ca003\") " pod="openshift-network-operator/iptables-alerter-m7gwx" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181747 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-lib-modules\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181804 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-multus-socket-dir-parent\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.182970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/31fbbb71-5394-4f60-8de2-cc5dc970ab35-hosts-file\") pod \"node-resolver-cstvp\" (UID: \"31fbbb71-5394-4f60-8de2-cc5dc970ab35\") " pod="openshift-dns/node-resolver-cstvp" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181875 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-multus-socket-dir-parent\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181906 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea-konnectivity-ca\") pod \"konnectivity-agent-ln7wc\" (UID: 
\"7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea\") " pod="kube-system/konnectivity-agent-ln7wc" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181928 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-sysconfig\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181943 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-lib-modules\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181950 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/abb55075-ce71-45a8-8ef8-400976104389-env-overrides\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.181989 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wz5wn\" (UniqueName: \"kubernetes.io/projected/88a73d58-a99e-49c1-9821-a06593a8b35e-kube-api-access-wz5wn\") pod \"node-ca-9crbq\" (UID: \"88a73d58-a99e-49c1-9821-a06593a8b35e\") " pod="openshift-image-registry/node-ca-9crbq" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-sys\") pod \"tuned-5pbst\" (UID: 
\"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182036 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-slash\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182103 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-cni-bin\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182113 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/abb55075-ce71-45a8-8ef8-400976104389-ovnkube-script-lib\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182157 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182201 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/abb55075-ce71-45a8-8ef8-400976104389-ovnkube-config\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182244 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l88d6\" (UniqueName: \"kubernetes.io/projected/abb55075-ce71-45a8-8ef8-400976104389-kube-api-access-l88d6\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182268 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea-agent-certs\") pod \"konnectivity-agent-ln7wc\" (UID: \"7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea\") " pod="kube-system/konnectivity-agent-ln7wc" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-var-lib-kubelet\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182323 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-log-socket\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.183767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182371 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-run-k8s-cni-cncf-io\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182408 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rd5pw\" (UniqueName: \"kubernetes.io/projected/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-kube-api-access-rd5pw\") pod \"network-metrics-daemon-h5m79\" (UID: \"a6ad0fc1-fbd1-4133-8616-3b950995f8e4\") " pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-systemd-units\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182438 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/abb55075-ce71-45a8-8ef8-400976104389-env-overrides\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182504 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/239caad5-0402-47f0-8e15-7f5d02343638-cni-binary-copy\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182538 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-slash\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182552 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-sys\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182580 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-host-cni-bin\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182044 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/31fbbb71-5394-4f60-8de2-cc5dc970ab35-hosts-file\") pod \"node-resolver-cstvp\" (UID: \"31fbbb71-5394-4f60-8de2-cc5dc970ab35\") " pod="openshift-dns/node-resolver-cstvp" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182768 2573 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea-konnectivity-ca\") pod \"konnectivity-agent-ln7wc\" (UID: \"7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea\") " pod="kube-system/konnectivity-agent-ln7wc" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182794 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-sysconfig\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182854 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-var-lib-kubelet\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.182982 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-systemd-units\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.183124 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-host-run-k8s-cni-cncf-io\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.183172 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/abb55075-ce71-45a8-8ef8-400976104389-log-socket\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.183403 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/abb55075-ce71-45a8-8ef8-400976104389-ovnkube-config\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.184432 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.183628 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/239caad5-0402-47f0-8e15-7f5d02343638-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.185058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.183898 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-etc-tuned\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.185058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.183985 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-tmp\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.185191 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.185142 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/abb55075-ce71-45a8-8ef8-400976104389-ovn-node-metrics-cert\") pod \"ovnkube-node-rz2jk\" (UID: \"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.185811 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.185786 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea-agent-certs\") pod \"konnectivity-agent-ln7wc\" (UID: \"7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea\") " pod="kube-system/konnectivity-agent-ln7wc" Apr 24 21:16:20.190951 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.190918 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsvwr\" (UniqueName: \"kubernetes.io/projected/880ca20b-7732-4709-9f0a-9013465ca003-kube-api-access-xsvwr\") pod \"iptables-alerter-m7gwx\" (UID: \"880ca20b-7732-4709-9f0a-9013465ca003\") " pod="openshift-network-operator/iptables-alerter-m7gwx" Apr 24 21:16:20.192977 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.192950 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfw9\" (UniqueName: \"kubernetes.io/projected/31fbbb71-5394-4f60-8de2-cc5dc970ab35-kube-api-access-2xfw9\") pod \"node-resolver-cstvp\" (UID: \"31fbbb71-5394-4f60-8de2-cc5dc970ab35\") " pod="openshift-dns/node-resolver-cstvp" Apr 24 21:16:20.193102 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.193081 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzv8n\" (UniqueName: \"kubernetes.io/projected/e7c79dc8-944a-4f71-8545-a3c37de6cdc2-kube-api-access-dzv8n\") pod \"tuned-5pbst\" (UID: \"e7c79dc8-944a-4f71-8545-a3c37de6cdc2\") " pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.193832 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:20.193402 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: 
object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:20.193832 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:20.193426 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:20.193832 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.193432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd5pw\" (UniqueName: \"kubernetes.io/projected/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-kube-api-access-rd5pw\") pod \"network-metrics-daemon-h5m79\" (UID: \"a6ad0fc1-fbd1-4133-8616-3b950995f8e4\") " pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:20.193832 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:20.193457 2573 projected.go:194] Error preparing data for projected volume kube-api-access-jg9bj for pod openshift-network-diagnostics/network-check-target-tzpnt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:20.193832 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:20.193543 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj podName:89ab8923-5f3a-4535-9d3f-e72f739904d4 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:20.693525184 +0000 UTC m=+3.099366628 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jg9bj" (UniqueName: "kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj") pod "network-check-target-tzpnt" (UID: "89ab8923-5f3a-4535-9d3f-e72f739904d4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:20.193832 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.193777 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbtkj\" (UniqueName: \"kubernetes.io/projected/9835b143-e40c-4455-9924-5824b457a60a-kube-api-access-bbtkj\") pod \"aws-ebs-csi-driver-node-q8ldw\" (UID: \"9835b143-e40c-4455-9924-5824b457a60a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.194196 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.194055 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz5wn\" (UniqueName: \"kubernetes.io/projected/88a73d58-a99e-49c1-9821-a06593a8b35e-kube-api-access-wz5wn\") pod \"node-ca-9crbq\" (UID: \"88a73d58-a99e-49c1-9821-a06593a8b35e\") " pod="openshift-image-registry/node-ca-9crbq" Apr 24 21:16:20.195476 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.195452 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m424d\" (UniqueName: \"kubernetes.io/projected/239caad5-0402-47f0-8e15-7f5d02343638-kube-api-access-m424d\") pod \"multus-additional-cni-plugins-cph25\" (UID: \"239caad5-0402-47f0-8e15-7f5d02343638\") " pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.195669 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.195648 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l88d6\" (UniqueName: \"kubernetes.io/projected/abb55075-ce71-45a8-8ef8-400976104389-kube-api-access-l88d6\") pod \"ovnkube-node-rz2jk\" (UID: 
\"abb55075-ce71-45a8-8ef8-400976104389\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.195904 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.195867 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjzf2\" (UniqueName: \"kubernetes.io/projected/804bc0fd-469c-45c8-8ece-8dbbfdb0705e-kube-api-access-rjzf2\") pod \"multus-mp2nj\" (UID: \"804bc0fd-469c-45c8-8ece-8dbbfdb0705e\") " pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.366419 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.366330 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:16:20.371266 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.371234 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cstvp" Apr 24 21:16:20.380050 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.380021 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9crbq" Apr 24 21:16:20.385024 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.384977 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cph25" Apr 24 21:16:20.391769 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.391726 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mp2nj" Apr 24 21:16:20.398798 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.398770 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m7gwx" Apr 24 21:16:20.405497 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.405468 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-ln7wc" Apr 24 21:16:20.412250 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.412224 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" Apr 24 21:16:20.417008 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.416968 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5pbst" Apr 24 21:16:20.687034 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.686939 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs\") pod \"network-metrics-daemon-h5m79\" (UID: \"a6ad0fc1-fbd1-4133-8616-3b950995f8e4\") " pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:20.687186 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:20.687113 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:20.687251 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:20.687191 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs podName:a6ad0fc1-fbd1-4133-8616-3b950995f8e4 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:21.687170384 +0000 UTC m=+4.093011827 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs") pod "network-metrics-daemon-h5m79" (UID: "a6ad0fc1-fbd1-4133-8616-3b950995f8e4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:20.787641 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:20.787599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9bj\" (UniqueName: \"kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj\") pod \"network-check-target-tzpnt\" (UID: \"89ab8923-5f3a-4535-9d3f-e72f739904d4\") " pod="openshift-network-diagnostics/network-check-target-tzpnt" Apr 24 21:16:20.787852 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:20.787754 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:20.787852 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:20.787774 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:20.787852 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:20.787803 2573 projected.go:194] Error preparing data for projected volume kube-api-access-jg9bj for pod openshift-network-diagnostics/network-check-target-tzpnt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:20.788006 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:20.787861 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj podName:89ab8923-5f3a-4535-9d3f-e72f739904d4 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:21.787846563 +0000 UTC m=+4.193688012 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jg9bj" (UniqueName: "kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj") pod "network-check-target-tzpnt" (UID: "89ab8923-5f3a-4535-9d3f-e72f739904d4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:20.812330 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:20.812297 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9835b143_e40c_4455_9924_5824b457a60a.slice/crio-783588d49fae65a0c2c9cfd321176606474c7d1098ee29dd951017516b308f6f WatchSource:0}: Error finding container 783588d49fae65a0c2c9cfd321176606474c7d1098ee29dd951017516b308f6f: Status 404 returned error can't find the container with id 783588d49fae65a0c2c9cfd321176606474c7d1098ee29dd951017516b308f6f Apr 24 21:16:20.818313 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:20.818287 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabb55075_ce71_45a8_8ef8_400976104389.slice/crio-41eb5bff22b8b80f50c9bb325775271fc43e76611b07271b78ef3fdb0304f607 WatchSource:0}: Error finding container 41eb5bff22b8b80f50c9bb325775271fc43e76611b07271b78ef3fdb0304f607: Status 404 returned error can't find the container with id 41eb5bff22b8b80f50c9bb325775271fc43e76611b07271b78ef3fdb0304f607 Apr 24 21:16:20.819679 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:20.819653 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7660fe75_2b1f_42c3_8bcf_b3fcc97a90ea.slice/crio-1f5bb3f54a70c1c91fe1e2149f0dcfaf2b276a4015ce194facfecb081fee4dec WatchSource:0}: Error finding container 
1f5bb3f54a70c1c91fe1e2149f0dcfaf2b276a4015ce194facfecb081fee4dec: Status 404 returned error can't find the container with id 1f5bb3f54a70c1c91fe1e2149f0dcfaf2b276a4015ce194facfecb081fee4dec Apr 24 21:16:20.820417 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:20.820394 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod239caad5_0402_47f0_8e15_7f5d02343638.slice/crio-0678030de14e72136ae71b8e323b66b5f6b41f5cfd4e3bdb49f448fe7b9cf0b1 WatchSource:0}: Error finding container 0678030de14e72136ae71b8e323b66b5f6b41f5cfd4e3bdb49f448fe7b9cf0b1: Status 404 returned error can't find the container with id 0678030de14e72136ae71b8e323b66b5f6b41f5cfd4e3bdb49f448fe7b9cf0b1 Apr 24 21:16:20.821881 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:20.821793 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804bc0fd_469c_45c8_8ece_8dbbfdb0705e.slice/crio-ed86836e971e952801f0f04bc08183035ca1a2fce96aededed37397029962ca5 WatchSource:0}: Error finding container ed86836e971e952801f0f04bc08183035ca1a2fce96aededed37397029962ca5: Status 404 returned error can't find the container with id ed86836e971e952801f0f04bc08183035ca1a2fce96aededed37397029962ca5 Apr 24 21:16:20.822682 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:20.822628 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod880ca20b_7732_4709_9f0a_9013465ca003.slice/crio-90f85388510f6682555fdfec07c35a41351083657bc5b838127ae9909ed1d3b3 WatchSource:0}: Error finding container 90f85388510f6682555fdfec07c35a41351083657bc5b838127ae9909ed1d3b3: Status 404 returned error can't find the container with id 90f85388510f6682555fdfec07c35a41351083657bc5b838127ae9909ed1d3b3 Apr 24 21:16:21.113698 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.113430 2573 certificate_manager.go:715] "Certificate rotation 
deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:11:19 +0000 UTC" deadline="2027-12-30 22:34:45.252732786 +0000 UTC" Apr 24 21:16:21.113698 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.113668 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14761h18m24.139069468s" Apr 24 21:16:21.207160 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.206675 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:21.207160 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:21.206809 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4" Apr 24 21:16:21.223964 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.223814 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" event={"ID":"9835b143-e40c-4455-9924-5824b457a60a","Type":"ContainerStarted","Data":"783588d49fae65a0c2c9cfd321176606474c7d1098ee29dd951017516b308f6f"} Apr 24 21:16:21.230895 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.230820 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cstvp" event={"ID":"31fbbb71-5394-4f60-8de2-cc5dc970ab35","Type":"ContainerStarted","Data":"efab7011cfbfbb07864b3b0e22c3583893aff5d4ea33f43bc1e1cd31fdb8e7f8"} Apr 24 21:16:21.242343 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.242206 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9crbq" 
event={"ID":"88a73d58-a99e-49c1-9821-a06593a8b35e","Type":"ContainerStarted","Data":"8bb5638bcadfa31ac4d15b1497b26a95bba4de8b5f19efe48a09db19bfb6bb8a"}
Apr 24 21:16:21.248200 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.247584 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m7gwx" event={"ID":"880ca20b-7732-4709-9f0a-9013465ca003","Type":"ContainerStarted","Data":"90f85388510f6682555fdfec07c35a41351083657bc5b838127ae9909ed1d3b3"}
Apr 24 21:16:21.256985 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.256609 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-21.ec2.internal" event={"ID":"46129ef0396a1fbd001318ae09f161a9","Type":"ContainerStarted","Data":"427fc79aec22a1926cb791a0deb46ba1f5c4ac48b4f679ff17a5966274abd5f2"}
Apr 24 21:16:21.259400 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.259346 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5pbst" event={"ID":"e7c79dc8-944a-4f71-8545-a3c37de6cdc2","Type":"ContainerStarted","Data":"2b5d467df1b0fa44ba7af1eba5535e0e861dd5627e7b67c4a3d4213f88661688"}
Apr 24 21:16:21.263104 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.263046 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mp2nj" event={"ID":"804bc0fd-469c-45c8-8ece-8dbbfdb0705e","Type":"ContainerStarted","Data":"ed86836e971e952801f0f04bc08183035ca1a2fce96aededed37397029962ca5"}
Apr 24 21:16:21.265029 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.264932 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cph25" event={"ID":"239caad5-0402-47f0-8e15-7f5d02343638","Type":"ContainerStarted","Data":"0678030de14e72136ae71b8e323b66b5f6b41f5cfd4e3bdb49f448fe7b9cf0b1"}
Apr 24 21:16:21.269947 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.269286 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ln7wc" event={"ID":"7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea","Type":"ContainerStarted","Data":"1f5bb3f54a70c1c91fe1e2149f0dcfaf2b276a4015ce194facfecb081fee4dec"}
Apr 24 21:16:21.277032 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.276830 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" event={"ID":"abb55075-ce71-45a8-8ef8-400976104389","Type":"ContainerStarted","Data":"41eb5bff22b8b80f50c9bb325775271fc43e76611b07271b78ef3fdb0304f607"}
Apr 24 21:16:21.695852 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.695180 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs\") pod \"network-metrics-daemon-h5m79\" (UID: \"a6ad0fc1-fbd1-4133-8616-3b950995f8e4\") " pod="openshift-multus/network-metrics-daemon-h5m79"
Apr 24 21:16:21.695852 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:21.695380 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:21.695852 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:21.695446 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs podName:a6ad0fc1-fbd1-4133-8616-3b950995f8e4 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:23.695426507 +0000 UTC m=+6.101267959 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs") pod "network-metrics-daemon-h5m79" (UID: "a6ad0fc1-fbd1-4133-8616-3b950995f8e4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:21.796681 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:21.796624 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9bj\" (UniqueName: \"kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj\") pod \"network-check-target-tzpnt\" (UID: \"89ab8923-5f3a-4535-9d3f-e72f739904d4\") " pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:21.796852 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:21.796789 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:21.796852 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:21.796811 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:21.796852 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:21.796824 2573 projected.go:194] Error preparing data for projected volume kube-api-access-jg9bj for pod openshift-network-diagnostics/network-check-target-tzpnt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:21.797032 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:21.796885 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj podName:89ab8923-5f3a-4535-9d3f-e72f739904d4 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:23.796864363 +0000 UTC m=+6.202705799 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jg9bj" (UniqueName: "kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj") pod "network-check-target-tzpnt" (UID: "89ab8923-5f3a-4535-9d3f-e72f739904d4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:22.208025 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:22.207507 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:22.208025 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:22.207628 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4"
Apr 24 21:16:22.295477 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:22.294630 2573 generic.go:358] "Generic (PLEG): container finished" podID="fd8ac9133f3c42502b0cb4b65d236ac9" containerID="a2a5d223bf0e7c3552dd0ff39ddcb3c83d560c790cbcd681de7f2ef8a574a1dd" exitCode=0
Apr 24 21:16:22.295477 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:22.295419 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal" event={"ID":"fd8ac9133f3c42502b0cb4b65d236ac9","Type":"ContainerDied","Data":"a2a5d223bf0e7c3552dd0ff39ddcb3c83d560c790cbcd681de7f2ef8a574a1dd"}
Apr 24 21:16:22.312548 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:22.312494 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-21.ec2.internal" podStartSLOduration=3.312472682 podStartE2EDuration="3.312472682s" podCreationTimestamp="2026-04-24 21:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:21.271711289 +0000 UTC m=+3.677552741" watchObservedRunningTime="2026-04-24 21:16:22.312472682 +0000 UTC m=+4.718314135"
Apr 24 21:16:23.046840 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.046807 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-sbxgn"]
Apr 24 21:16:23.049210 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.049183 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:23.049348 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:23.049274 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e"
Apr 24 21:16:23.106505 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.106463 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38a98561-29f3-47af-9151-b0d0095b287e-kubelet-config\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:23.106684 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.106538 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/38a98561-29f3-47af-9151-b0d0095b287e-dbus\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:23.106684 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.106562 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:23.206728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.206687 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79"
Apr 24 21:16:23.206903 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:23.206844 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4"
Apr 24 21:16:23.207433 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.207407 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38a98561-29f3-47af-9151-b0d0095b287e-kubelet-config\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:23.207531 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.207469 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/38a98561-29f3-47af-9151-b0d0095b287e-dbus\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:23.207531 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.207485 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38a98561-29f3-47af-9151-b0d0095b287e-kubelet-config\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:23.207531 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.207513 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:23.207712 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.207626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/38a98561-29f3-47af-9151-b0d0095b287e-dbus\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:23.207712 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:23.207631 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:23.207712 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:23.207693 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret podName:38a98561-29f3-47af-9151-b0d0095b287e nodeName:}" failed. No retries permitted until 2026-04-24 21:16:23.707673431 +0000 UTC m=+6.113514864 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret") pod "global-pull-secret-syncer-sbxgn" (UID: "38a98561-29f3-47af-9151-b0d0095b287e") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:23.302769 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.302686 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal" event={"ID":"fd8ac9133f3c42502b0cb4b65d236ac9","Type":"ContainerStarted","Data":"98a1677f3806b1c42ea3c9c92b06d4cb2157123a3f39f08b7a445e768684d843"}
Apr 24 21:16:23.713656 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.712848 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs\") pod \"network-metrics-daemon-h5m79\" (UID: \"a6ad0fc1-fbd1-4133-8616-3b950995f8e4\") " pod="openshift-multus/network-metrics-daemon-h5m79"
Apr 24 21:16:23.713656 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.712934 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:23.713656 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:23.713069 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:23.713656 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:23.713128 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret podName:38a98561-29f3-47af-9151-b0d0095b287e nodeName:}" failed. No retries permitted until 2026-04-24 21:16:24.713111073 +0000 UTC m=+7.118952516 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret") pod "global-pull-secret-syncer-sbxgn" (UID: "38a98561-29f3-47af-9151-b0d0095b287e") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:23.713656 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:23.713528 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:23.713656 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:23.713588 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs podName:a6ad0fc1-fbd1-4133-8616-3b950995f8e4 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:27.713571966 +0000 UTC m=+10.119413403 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs") pod "network-metrics-daemon-h5m79" (UID: "a6ad0fc1-fbd1-4133-8616-3b950995f8e4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:23.813827 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:23.813790 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9bj\" (UniqueName: \"kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj\") pod \"network-check-target-tzpnt\" (UID: \"89ab8923-5f3a-4535-9d3f-e72f739904d4\") " pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:23.814033 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:23.814015 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:23.814116 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:23.814043 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:23.814116 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:23.814056 2573 projected.go:194] Error preparing data for projected volume kube-api-access-jg9bj for pod openshift-network-diagnostics/network-check-target-tzpnt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:23.814228 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:23.814123 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj podName:89ab8923-5f3a-4535-9d3f-e72f739904d4 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:27.814103606 +0000 UTC m=+10.219945041 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jg9bj" (UniqueName: "kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj") pod "network-check-target-tzpnt" (UID: "89ab8923-5f3a-4535-9d3f-e72f739904d4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:24.206947 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:24.206910 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:24.207117 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:24.207053 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e"
Apr 24 21:16:24.208978 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:24.208950 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:24.209123 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:24.209076 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4"
Apr 24 21:16:24.722473 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:24.722003 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:24.722473 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:24.722139 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:24.722473 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:24.722189 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret podName:38a98561-29f3-47af-9151-b0d0095b287e nodeName:}" failed. No retries permitted until 2026-04-24 21:16:26.722175099 +0000 UTC m=+9.128016541 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret") pod "global-pull-secret-syncer-sbxgn" (UID: "38a98561-29f3-47af-9151-b0d0095b287e") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:25.206499 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:25.206419 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79"
Apr 24 21:16:25.206654 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:25.206569 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4"
Apr 24 21:16:26.206828 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:26.206785 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:26.207287 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:26.206911 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4"
Apr 24 21:16:26.207287 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:26.206954 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:26.207287 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:26.207061 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e"
Apr 24 21:16:26.739164 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:26.738612 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:26.739164 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:26.738757 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:26.739164 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:26.738816 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret podName:38a98561-29f3-47af-9151-b0d0095b287e nodeName:}" failed. No retries permitted until 2026-04-24 21:16:30.738799052 +0000 UTC m=+13.144640495 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret") pod "global-pull-secret-syncer-sbxgn" (UID: "38a98561-29f3-47af-9151-b0d0095b287e") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:27.206327 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:27.206090 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79"
Apr 24 21:16:27.206327 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:27.206237 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4"
Apr 24 21:16:27.747717 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:27.747673 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs\") pod \"network-metrics-daemon-h5m79\" (UID: \"a6ad0fc1-fbd1-4133-8616-3b950995f8e4\") " pod="openshift-multus/network-metrics-daemon-h5m79"
Apr 24 21:16:27.748215 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:27.747907 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:27.748215 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:27.747984 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs podName:a6ad0fc1-fbd1-4133-8616-3b950995f8e4 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:35.747964562 +0000 UTC m=+18.153805999 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs") pod "network-metrics-daemon-h5m79" (UID: "a6ad0fc1-fbd1-4133-8616-3b950995f8e4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:27.848997 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:27.848930 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9bj\" (UniqueName: \"kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj\") pod \"network-check-target-tzpnt\" (UID: \"89ab8923-5f3a-4535-9d3f-e72f739904d4\") " pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:27.849188 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:27.849124 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:27.849188 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:27.849151 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:27.849188 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:27.849164 2573 projected.go:194] Error preparing data for projected volume kube-api-access-jg9bj for pod openshift-network-diagnostics/network-check-target-tzpnt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:27.849340 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:27.849222 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj podName:89ab8923-5f3a-4535-9d3f-e72f739904d4 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:35.849202657 +0000 UTC m=+18.255044087 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jg9bj" (UniqueName: "kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj") pod "network-check-target-tzpnt" (UID: "89ab8923-5f3a-4535-9d3f-e72f739904d4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:28.207760 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:28.207710 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:28.207966 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:28.207865 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e"
Apr 24 21:16:28.207966 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:28.207952 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:28.208096 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:28.208072 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4"
Apr 24 21:16:29.206621 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:29.206580 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79"
Apr 24 21:16:29.207066 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:29.206755 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4"
Apr 24 21:16:30.206303 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:30.206262 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:30.206510 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:30.206279 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:30.206510 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:30.206398 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4"
Apr 24 21:16:30.206510 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:30.206458 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e"
Apr 24 21:16:30.773635 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:30.773604 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:30.774111 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:30.773747 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:30.774111 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:30.773801 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret podName:38a98561-29f3-47af-9151-b0d0095b287e nodeName:}" failed. No retries permitted until 2026-04-24 21:16:38.773787755 +0000 UTC m=+21.179629184 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret") pod "global-pull-secret-syncer-sbxgn" (UID: "38a98561-29f3-47af-9151-b0d0095b287e") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:31.206390 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:31.206293 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79"
Apr 24 21:16:31.206557 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:31.206447 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4"
Apr 24 21:16:32.206263 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:32.206222 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:32.206749 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:32.206222 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:32.206749 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:32.206327 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4"
Apr 24 21:16:32.206749 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:32.206473 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e"
Apr 24 21:16:33.206722 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:33.206690 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79"
Apr 24 21:16:33.207075 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:33.206797 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4"
Apr 24 21:16:34.206238 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:34.206201 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:34.206431 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:34.206339 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e"
Apr 24 21:16:34.206431 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:34.206400 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:34.206613 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:34.206514 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4"
Apr 24 21:16:35.207004 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:35.206970 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79"
Apr 24 21:16:35.207529 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:35.207091 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4" Apr 24 21:16:35.807830 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:35.807788 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs\") pod \"network-metrics-daemon-h5m79\" (UID: \"a6ad0fc1-fbd1-4133-8616-3b950995f8e4\") " pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:35.808114 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:35.807964 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:35.808114 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:35.808031 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs podName:a6ad0fc1-fbd1-4133-8616-3b950995f8e4 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.808014831 +0000 UTC m=+34.213856266 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs") pod "network-metrics-daemon-h5m79" (UID: "a6ad0fc1-fbd1-4133-8616-3b950995f8e4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:35.908694 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:35.908655 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9bj\" (UniqueName: \"kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj\") pod \"network-check-target-tzpnt\" (UID: \"89ab8923-5f3a-4535-9d3f-e72f739904d4\") " pod="openshift-network-diagnostics/network-check-target-tzpnt" Apr 24 21:16:35.908876 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:35.908851 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:35.908943 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:35.908882 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:35.908943 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:35.908897 2573 projected.go:194] Error preparing data for projected volume kube-api-access-jg9bj for pod openshift-network-diagnostics/network-check-target-tzpnt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:35.909029 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:35.908962 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj podName:89ab8923-5f3a-4535-9d3f-e72f739904d4 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:51.9089423 +0000 UTC m=+34.314783743 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jg9bj" (UniqueName: "kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj") pod "network-check-target-tzpnt" (UID: "89ab8923-5f3a-4535-9d3f-e72f739904d4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:36.206190 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:36.206086 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt" Apr 24 21:16:36.206376 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:36.206235 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn" Apr 24 21:16:36.206376 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:36.206231 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4" Apr 24 21:16:36.206376 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:36.206320 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e" Apr 24 21:16:37.206372 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:37.206320 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:37.206800 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:37.206486 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4" Apr 24 21:16:38.207393 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.207190 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt" Apr 24 21:16:38.207941 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.207277 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn" Apr 24 21:16:38.207941 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:38.207486 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4" Apr 24 21:16:38.207941 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:38.207582 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e" Apr 24 21:16:38.328805 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.328711 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cstvp" event={"ID":"31fbbb71-5394-4f60-8de2-cc5dc970ab35","Type":"ContainerStarted","Data":"32f9b140b4f896aa6fb9221b8623c4259802453e13f61d23310581dd2cb9f8f0"} Apr 24 21:16:38.330036 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.330005 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9crbq" event={"ID":"88a73d58-a99e-49c1-9821-a06593a8b35e","Type":"ContainerStarted","Data":"1903f3b522db4275a9f99ab9f3a8c2ae795006c5f81c07ffe2f2d69c6efa4daa"} Apr 24 21:16:38.331206 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.331173 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5pbst" event={"ID":"e7c79dc8-944a-4f71-8545-a3c37de6cdc2","Type":"ContainerStarted","Data":"2dc1044b140009657ce85853f266735cedb51d46adb89679478d318e584856e6"} Apr 24 21:16:38.332477 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.332451 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mp2nj" event={"ID":"804bc0fd-469c-45c8-8ece-8dbbfdb0705e","Type":"ContainerStarted","Data":"83570b3484aba6389f11e7decbee1f757f7566c6ba1e0b3ef3be2de2311b6bd1"} Apr 24 21:16:38.333825 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.333803 
2573 generic.go:358] "Generic (PLEG): container finished" podID="239caad5-0402-47f0-8e15-7f5d02343638" containerID="5708b2adf578db9193adca794c6f614d2b7b6d5a22a2e8863c5edc1eb01737ec" exitCode=0 Apr 24 21:16:38.333924 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.333859 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cph25" event={"ID":"239caad5-0402-47f0-8e15-7f5d02343638","Type":"ContainerDied","Data":"5708b2adf578db9193adca794c6f614d2b7b6d5a22a2e8863c5edc1eb01737ec"} Apr 24 21:16:38.335253 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.335224 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ln7wc" event={"ID":"7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea","Type":"ContainerStarted","Data":"7510e3252f668a2f86a0cce70bea495b828066ead8097ed17fc4a34166ff0594"} Apr 24 21:16:38.337769 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.337750 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" event={"ID":"9835b143-e40c-4455-9924-5824b457a60a","Type":"ContainerStarted","Data":"bc879aaef862824a77813a9735b2cb25e3aa2f3515cda095126f3c0bbc853daf"} Apr 24 21:16:38.341686 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.341650 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cstvp" podStartSLOduration=3.327641301 podStartE2EDuration="20.341640555s" podCreationTimestamp="2026-04-24 21:16:18 +0000 UTC" firstStartedPulling="2026-04-24 21:16:20.827497154 +0000 UTC m=+3.233338592" lastFinishedPulling="2026-04-24 21:16:37.841496397 +0000 UTC m=+20.247337846" observedRunningTime="2026-04-24 21:16:38.340888945 +0000 UTC m=+20.746730396" watchObservedRunningTime="2026-04-24 21:16:38.341640555 +0000 UTC m=+20.747482005" Apr 24 21:16:38.341777 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.341742 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-21.ec2.internal" podStartSLOduration=19.341738552 podStartE2EDuration="19.341738552s" podCreationTimestamp="2026-04-24 21:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:23.325998571 +0000 UTC m=+5.731840023" watchObservedRunningTime="2026-04-24 21:16:38.341738552 +0000 UTC m=+20.747579999" Apr 24 21:16:38.352883 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.352836 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9crbq" podStartSLOduration=3.337562331 podStartE2EDuration="20.352824306s" podCreationTimestamp="2026-04-24 21:16:18 +0000 UTC" firstStartedPulling="2026-04-24 21:16:20.826051838 +0000 UTC m=+3.231893276" lastFinishedPulling="2026-04-24 21:16:37.841313809 +0000 UTC m=+20.247155251" observedRunningTime="2026-04-24 21:16:38.352547541 +0000 UTC m=+20.758388991" watchObservedRunningTime="2026-04-24 21:16:38.352824306 +0000 UTC m=+20.758665756" Apr 24 21:16:38.386859 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.386799 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ln7wc" podStartSLOduration=11.514393397 podStartE2EDuration="20.386782898s" podCreationTimestamp="2026-04-24 21:16:18 +0000 UTC" firstStartedPulling="2026-04-24 21:16:20.821518435 +0000 UTC m=+3.227359868" lastFinishedPulling="2026-04-24 21:16:29.693907925 +0000 UTC m=+12.099749369" observedRunningTime="2026-04-24 21:16:38.386667795 +0000 UTC m=+20.792509247" watchObservedRunningTime="2026-04-24 21:16:38.386782898 +0000 UTC m=+20.792624353" Apr 24 21:16:38.418631 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.418520 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5pbst" podStartSLOduration=3.40372234 
podStartE2EDuration="20.418505891s" podCreationTimestamp="2026-04-24 21:16:18 +0000 UTC" firstStartedPulling="2026-04-24 21:16:20.828667545 +0000 UTC m=+3.234508988" lastFinishedPulling="2026-04-24 21:16:37.843451102 +0000 UTC m=+20.249292539" observedRunningTime="2026-04-24 21:16:38.418338749 +0000 UTC m=+20.824180199" watchObservedRunningTime="2026-04-24 21:16:38.418505891 +0000 UTC m=+20.824347342" Apr 24 21:16:38.418631 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.418606 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mp2nj" podStartSLOduration=3.388904185 podStartE2EDuration="20.418602586s" podCreationTimestamp="2026-04-24 21:16:18 +0000 UTC" firstStartedPulling="2026-04-24 21:16:20.823758085 +0000 UTC m=+3.229599521" lastFinishedPulling="2026-04-24 21:16:37.853456489 +0000 UTC m=+20.259297922" observedRunningTime="2026-04-24 21:16:38.404163415 +0000 UTC m=+20.810004865" watchObservedRunningTime="2026-04-24 21:16:38.418602586 +0000 UTC m=+20.824444036" Apr 24 21:16:38.833088 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:38.832822 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn" Apr 24 21:16:38.833281 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:38.832983 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:38.833281 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:38.833241 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret podName:38a98561-29f3-47af-9151-b0d0095b287e nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:54.833216873 +0000 UTC m=+37.239058325 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret") pod "global-pull-secret-syncer-sbxgn" (UID: "38a98561-29f3-47af-9151-b0d0095b287e") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:39.143042 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:39.143015 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:16:39.146048 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:39.145955 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:16:39.143033771Z","UUID":"a2e3486b-3d26-4f9b-82ff-6fb0a89abb1f","Handler":null,"Name":"","Endpoint":""} Apr 24 21:16:39.148373 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:39.148338 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:16:39.148469 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:39.148384 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:16:39.206099 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:39.206060 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:39.206260 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:39.206214 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4" Apr 24 21:16:39.343452 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:39.343416 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" event={"ID":"abb55075-ce71-45a8-8ef8-400976104389","Type":"ContainerStarted","Data":"65cfc3e5f747add28cda5e5e2cdffb805996ff8f21a10ee8e5b5c905eba6fa44"} Apr 24 21:16:39.344211 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:39.343459 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" event={"ID":"abb55075-ce71-45a8-8ef8-400976104389","Type":"ContainerStarted","Data":"2f8a20d96120968a9cf0be663bc49b64ed5490c598ce6a7f74c3c14b377dd1b9"} Apr 24 21:16:39.344211 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:39.343475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" event={"ID":"abb55075-ce71-45a8-8ef8-400976104389","Type":"ContainerStarted","Data":"fc387bb3b9f278e3dc7136a5211ed83395417b3966ee4ecc9a1790676710b283"} Apr 24 21:16:39.344211 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:39.343487 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" event={"ID":"abb55075-ce71-45a8-8ef8-400976104389","Type":"ContainerStarted","Data":"016b7eaa350ddf17e40a0636c83c40366d7cf5deba28db9de41da6f886031607"} Apr 24 21:16:39.344211 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:39.343500 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" event={"ID":"abb55075-ce71-45a8-8ef8-400976104389","Type":"ContainerStarted","Data":"e45e1ac7cf3e6e2152c0cb9a3a873a69cab68ace45d14cb01ac3e2a4baf34f7e"} Apr 24 21:16:39.344211 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:39.343512 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" event={"ID":"abb55075-ce71-45a8-8ef8-400976104389","Type":"ContainerStarted","Data":"1ad08a238163b31d464dc1c3ca412ac12476838bd6062c65415a17a037d3f7c9"} Apr 24 21:16:39.346310 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:39.346263 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" event={"ID":"9835b143-e40c-4455-9924-5824b457a60a","Type":"ContainerStarted","Data":"463a8723ce47897d950b13c538bee92387de6e440aef0aa9d334354ef2731f18"} Apr 24 21:16:40.206419 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:40.206383 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt" Apr 24 21:16:40.206666 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:40.206431 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn" Apr 24 21:16:40.206666 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:40.206519 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4" Apr 24 21:16:40.206666 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:40.206642 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e" Apr 24 21:16:40.350534 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:40.350434 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" event={"ID":"9835b143-e40c-4455-9924-5824b457a60a","Type":"ContainerStarted","Data":"696ab13cc19b2a1d3c28a4dd345172528f670e75b66a8b4132abea674b4235bf"} Apr 24 21:16:40.351901 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:40.351867 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m7gwx" event={"ID":"880ca20b-7732-4709-9f0a-9013465ca003","Type":"ContainerStarted","Data":"ecf221b9e16cea4e99acf45447fb7da25110d2d4a515b61fe4124772f74b5c24"} Apr 24 21:16:40.379382 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:40.379308 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8ldw" podStartSLOduration=3.143156922 podStartE2EDuration="22.379288891s" podCreationTimestamp="2026-04-24 21:16:18 +0000 UTC" firstStartedPulling="2026-04-24 21:16:20.814712722 +0000 UTC m=+3.220554152" lastFinishedPulling="2026-04-24 21:16:40.050844678 +0000 UTC m=+22.456686121" observedRunningTime="2026-04-24 21:16:40.379275734 +0000 UTC m=+22.785117197" watchObservedRunningTime="2026-04-24 21:16:40.379288891 +0000 UTC m=+22.785130343" Apr 24 21:16:40.393716 ip-10-0-128-21 kubenswrapper[2573]: 
I0424 21:16:40.393655 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-m7gwx" podStartSLOduration=5.714975321 podStartE2EDuration="22.393640745s" podCreationTimestamp="2026-04-24 21:16:18 +0000 UTC" firstStartedPulling="2026-04-24 21:16:20.824882638 +0000 UTC m=+3.230724068" lastFinishedPulling="2026-04-24 21:16:37.503548059 +0000 UTC m=+19.909389492" observedRunningTime="2026-04-24 21:16:40.39315591 +0000 UTC m=+22.798997361" watchObservedRunningTime="2026-04-24 21:16:40.393640745 +0000 UTC m=+22.799482195" Apr 24 21:16:41.206280 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:41.206242 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:41.206477 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:41.206387 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4" Apr 24 21:16:41.591106 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:41.591047 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ln7wc" Apr 24 21:16:41.591937 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:41.591915 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ln7wc" Apr 24 21:16:42.207152 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:42.206961 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn" Apr 24 21:16:42.207324 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:42.206971 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt" Apr 24 21:16:42.207324 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:42.207239 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e" Apr 24 21:16:42.210611 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:42.207804 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4" Apr 24 21:16:42.358650 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:42.358612 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" event={"ID":"abb55075-ce71-45a8-8ef8-400976104389","Type":"ContainerStarted","Data":"3a5fbd88cf5aebb650b78a2a4b242a876f026e6b5c9ab3db5dc6a6937a5819b0"} Apr 24 21:16:43.206563 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:43.206532 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:43.207073 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:43.206637 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4" Apr 24 21:16:43.307812 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:43.307777 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ln7wc" Apr 24 21:16:43.308511 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:43.308493 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ln7wc" Apr 24 21:16:43.361411 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:43.361375 2573 generic.go:358] "Generic (PLEG): container finished" podID="239caad5-0402-47f0-8e15-7f5d02343638" containerID="1232f949bc184fdd9dd4473d9dc5677e43e8ba85e625dc6b58494b3dcd6debbc" exitCode=0 Apr 24 21:16:43.361563 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:43.361444 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cph25" event={"ID":"239caad5-0402-47f0-8e15-7f5d02343638","Type":"ContainerDied","Data":"1232f949bc184fdd9dd4473d9dc5677e43e8ba85e625dc6b58494b3dcd6debbc"} Apr 24 21:16:44.206622 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:44.206596 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:44.207089 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:44.206720 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4"
Apr 24 21:16:44.207089 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:44.206782 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:44.207089 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:44.206894 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e"
Apr 24 21:16:44.366278 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:44.366243 2573 generic.go:358] "Generic (PLEG): container finished" podID="239caad5-0402-47f0-8e15-7f5d02343638" containerID="6dc982a6c8bf2834324f07340cd3dc56d0fcb46f8bb1b501ab62fe91b8c5a72e" exitCode=0
Apr 24 21:16:44.366406 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:44.366337 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cph25" event={"ID":"239caad5-0402-47f0-8e15-7f5d02343638","Type":"ContainerDied","Data":"6dc982a6c8bf2834324f07340cd3dc56d0fcb46f8bb1b501ab62fe91b8c5a72e"}
Apr 24 21:16:45.206978 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:45.206801 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79"
Apr 24 21:16:45.207301 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:45.207070 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4"
Apr 24 21:16:45.370325 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:45.370290 2573 generic.go:358] "Generic (PLEG): container finished" podID="239caad5-0402-47f0-8e15-7f5d02343638" containerID="53482802b7cd50737f2192929b7f2e087e9b30fa5d5b26526a5c150cbc361871" exitCode=0
Apr 24 21:16:45.370506 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:45.370390 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cph25" event={"ID":"239caad5-0402-47f0-8e15-7f5d02343638","Type":"ContainerDied","Data":"53482802b7cd50737f2192929b7f2e087e9b30fa5d5b26526a5c150cbc361871"}
Apr 24 21:16:45.373503 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:45.373473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" event={"ID":"abb55075-ce71-45a8-8ef8-400976104389","Type":"ContainerStarted","Data":"4d3e4f5bed3cd127dc780e04653af573262daad616fc8997261aaf093e44d6df"}
Apr 24 21:16:45.373803 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:45.373787 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk"
Apr 24 21:16:45.373852 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:45.373813 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk"
Apr 24 21:16:45.388012 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:45.387985 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk"
Apr 24 21:16:45.431568 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:45.431518 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" podStartSLOduration=10.056776907 podStartE2EDuration="27.431502957s" podCreationTimestamp="2026-04-24 21:16:18 +0000 UTC" firstStartedPulling="2026-04-24 21:16:20.820319479 +0000 UTC m=+3.226160916" lastFinishedPulling="2026-04-24 21:16:38.195045497 +0000 UTC m=+20.600886966" observedRunningTime="2026-04-24 21:16:45.430480947 +0000 UTC m=+27.836322397" watchObservedRunningTime="2026-04-24 21:16:45.431502957 +0000 UTC m=+27.837344407"
Apr 24 21:16:46.206263 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:46.206219 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:46.206468 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:46.206272 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:46.206468 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:46.206379 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e"
Apr 24 21:16:46.206468 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:46.206433 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4"
Apr 24 21:16:46.337101 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:46.337069 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-sbxgn"]
Apr 24 21:16:46.340913 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:46.340878 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tzpnt"]
Apr 24 21:16:46.343932 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:46.343901 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h5m79"]
Apr 24 21:16:46.344081 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:46.344022 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79"
Apr 24 21:16:46.344268 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:46.344240 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4"
Apr 24 21:16:46.375583 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:46.375499 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:46.375783 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:46.375629 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e"
Apr 24 21:16:46.375908 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:46.375496 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:46.376534 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:46.376504 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4"
Apr 24 21:16:46.376652 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:46.376552 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk"
Apr 24 21:16:46.395113 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:46.395042 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk"
Apr 24 21:16:48.207531 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:48.207304 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:48.208019 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:48.207422 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79"
Apr 24 21:16:48.208019 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:48.207615 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e"
Apr 24 21:16:48.208019 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:48.207747 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4"
Apr 24 21:16:48.208019 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:48.207446 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:48.208019 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:48.207831 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4"
Apr 24 21:16:50.206058 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.206016 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79"
Apr 24 21:16:50.206625 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.206077 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:16:50.206625 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:50.206167 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5m79" podUID="a6ad0fc1-fbd1-4133-8616-3b950995f8e4"
Apr 24 21:16:50.206625 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:50.206236 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzpnt" podUID="89ab8923-5f3a-4535-9d3f-e72f739904d4"
Apr 24 21:16:50.206625 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.206253 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn"
Apr 24 21:16:50.206625 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:50.206334 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sbxgn" podUID="38a98561-29f3-47af-9151-b0d0095b287e"
Apr 24 21:16:50.884745 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.884667 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-21.ec2.internal" event="NodeReady"
Apr 24 21:16:50.884918 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.884810 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 21:16:50.924083 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.924045 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-74fb58c7f4-9dgzg"]
Apr 24 21:16:50.960984 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.960377 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6dcc796c9-ngqbf"]
Apr 24 21:16:50.981181 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.981150 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp"]
Apr 24 21:16:50.981386 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.981243 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-74fb58c7f4-9dgzg"
Apr 24 21:16:50.981386 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.981280 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf"
Apr 24 21:16:50.984607 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.984428 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-frq8p\""
Apr 24 21:16:50.984757 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.984606 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 24 21:16:50.985211 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.985137 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 24 21:16:50.985766 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.985453 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 24 21:16:50.986120 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.985889 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 21:16:50.986226 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.986176 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 24 21:16:50.987052 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.986284 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 21:16:50.987052 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.986340 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 24 21:16:50.987052 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.986380 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jgvbr\""
Apr 24 21:16:50.987052 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.986452 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 21:16:50.987761 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.987726 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 21:16:50.993145 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:50.993120 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 21:16:51.004493 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.004472 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lh2r9"]
Apr 24 21:16:51.004661 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.004642 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp"
Apr 24 21:16:51.007700 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.007538 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 24 21:16:51.007700 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.007560 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 24 21:16:51.007942 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.007849 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-2p2ck\""
Apr 24 21:16:51.024590 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.024546 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg"
Apr 24 21:16:51.024771 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.024605 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-default-certificate\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg"
Apr 24 21:16:51.024771 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.024642 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-stats-auth\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg"
Apr 24 21:16:51.024771 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.024688 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg"
Apr 24 21:16:51.024894 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.024818 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vncwv\" (UniqueName: \"kubernetes.io/projected/5f412e6f-9e0c-44f5-b798-012969c57865-kube-api-access-vncwv\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg"
Apr 24 21:16:51.026286 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.026259 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dbr6c"]
Apr 24 21:16:51.026443 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.026425 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lh2r9"
Apr 24 21:16:51.028914 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.028889 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:16:51.029169 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.029146 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-ttclj\""
Apr 24 21:16:51.029278 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.029194 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 21:16:51.050553 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.050524 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx"]
Apr 24 21:16:51.050727 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.050695 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-dbr6c"
Apr 24 21:16:51.055812 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.055787 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 21:16:51.057180 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.056923 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 24 21:16:51.057180 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.056999 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 21:16:51.057670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.057456 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-7gghz\""
Apr 24 21:16:51.057737 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.057694 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 24 21:16:51.061437 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.061419 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 24 21:16:51.077674 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.077646 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv"]
Apr 24 21:16:51.077821 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.077805 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx"
Apr 24 21:16:51.081627 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.081600 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:16:51.081751 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.081676 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-mkvnq\""
Apr 24 21:16:51.081817 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.081784 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 24 21:16:51.081817 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.081793 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 24 21:16:51.096738 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.096713 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-t99mx"]
Apr 24 21:16:51.096866 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.096852 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv"
Apr 24 21:16:51.099259 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.099240 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-x5wbn\""
Apr 24 21:16:51.099412 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.099268 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:16:51.099412 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.099248 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 24 21:16:51.099412 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.099269 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 24 21:16:51.099412 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.099244 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 24 21:16:51.125920 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.125887 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/707d4a93-e9f1-4763-bb75-86589c7e8b18-installation-pull-secrets\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf"
Apr 24 21:16:51.125920 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.125923 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6jmdp\" (UID: \"fe0af93f-e6da-459a-b345-6cf8c4bcff2f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp"
Apr 24 21:16:51.126107 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.125940 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/707d4a93-e9f1-4763-bb75-86589c7e8b18-trusted-ca\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf"
Apr 24 21:16:51.126107 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.125976 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6jmdp\" (UID: \"fe0af93f-e6da-459a-b345-6cf8c4bcff2f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp"
Apr 24 21:16:51.126107 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.125994 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/707d4a93-e9f1-4763-bb75-86589c7e8b18-image-registry-private-configuration\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf"
Apr 24 21:16:51.126107 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126015 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/36a65b6d-1c50-425c-911a-eb5c1059cd95-snapshots\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c"
Apr 24 21:16:51.126107 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46fpf\" (UniqueName: \"kubernetes.io/projected/36a65b6d-1c50-425c-911a-eb5c1059cd95-kube-api-access-46fpf\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c"
Apr 24 21:16:51.126107 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126062 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg"
Apr 24 21:16:51.126107 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126085 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb6gk\" (UniqueName: \"kubernetes.io/projected/62a47369-6a4f-4ac0-ae3b-559fb4cadc0d-kube-api-access-qb6gk\") pod \"volume-data-source-validator-7c6cbb6c87-lh2r9\" (UID: \"62a47369-6a4f-4ac0-ae3b-559fb4cadc0d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lh2r9"
Apr 24 21:16:51.126316 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126117 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-bound-sa-token\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf"
Apr 24 21:16:51.126316 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126145 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2bn5\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-kube-api-access-m2bn5\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf"
Apr 24 21:16:51.126316 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.126211 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle podName:5f412e6f-9e0c-44f5-b798-012969c57865 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.626196639 +0000 UTC m=+34.032038084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle") pod "router-default-74fb58c7f4-9dgzg" (UID: "5f412e6f-9e0c-44f5-b798-012969c57865") : configmap references non-existent config key: service-ca.crt
Apr 24 21:16:51.126316 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126229 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vncwv\" (UniqueName: \"kubernetes.io/projected/5f412e6f-9e0c-44f5-b798-012969c57865-kube-api-access-vncwv\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg"
Apr 24 21:16:51.126316 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126253 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/36a65b6d-1c50-425c-911a-eb5c1059cd95-tmp\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c"
Apr 24 21:16:51.126316 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126278 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36a65b6d-1c50-425c-911a-eb5c1059cd95-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c"
Apr 24 21:16:51.126316 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126302 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36a65b6d-1c50-425c-911a-eb5c1059cd95-service-ca-bundle\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c"
Apr 24 21:16:51.126568 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126334 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/707d4a93-e9f1-4763-bb75-86589c7e8b18-ca-trust-extracted\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf"
Apr 24 21:16:51.126568 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126434 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg"
Apr 24 21:16:51.126568 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126462 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf"
Apr 24 21:16:51.126568 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-default-certificate\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg"
Apr 24 21:16:51.126568 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126507 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-stats-auth\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg"
Apr 24 21:16:51.126568 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126525 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36a65b6d-1c50-425c-911a-eb5c1059cd95-serving-cert\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c"
Apr 24 21:16:51.126769 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.126582 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 21:16:51.126769 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.126583 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-certificates\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf"
Apr 24 21:16:51.126769 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.126639 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs podName:5f412e6f-9e0c-44f5-b798-012969c57865 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.626626909 +0000 UTC m=+34.032468338 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs") pod "router-default-74fb58c7f4-9dgzg" (UID: "5f412e6f-9e0c-44f5-b798-012969c57865") : secret "router-metrics-certs-default" not found
Apr 24 21:16:51.127764 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.127744 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh"]
Apr 24 21:16:51.127927 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.127907 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:16:51.130822 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.130799 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 24 21:16:51.130822 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.130816 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 24 21:16:51.130979 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.130865 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-stats-auth\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:16:51.130979 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.130921 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-default-certificate\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:16:51.131081 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.131039 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:51.131081 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.131063 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:51.131342 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.131325 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-ps9tv\"" Apr 24 21:16:51.137242 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.137190 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 24 21:16:51.140696 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.140672 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vncwv\" (UniqueName: \"kubernetes.io/projected/5f412e6f-9e0c-44f5-b798-012969c57865-kube-api-access-vncwv\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:16:51.148239 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.148218 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk"] Apr 24 21:16:51.148390 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.148374 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" Apr 24 21:16:51.152967 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.152933 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 21:16:51.153210 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.153192 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:51.153311 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.153291 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-wt8f6\"" Apr 24 21:16:51.153515 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.153496 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 21:16:51.153622 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.153604 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:51.170080 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.170052 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp"] Apr 24 21:16:51.170080 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.170077 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-74fb58c7f4-9dgzg"] Apr 24 21:16:51.170080 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.170088 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mctfb"] Apr 24 21:16:51.170277 ip-10-0-128-21 
kubenswrapper[2573]: I0424 21:16:51.170199 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:16:51.172658 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.172637 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-cvtf7\"" Apr 24 21:16:51.172781 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.172684 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:16:51.172781 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.172722 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:16:51.173551 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.173532 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 21:16:51.173643 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.173572 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 21:16:51.192375 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.192333 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lh2r9"] Apr 24 21:16:51.192375 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.192379 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv"] Apr 24 21:16:51.192652 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.192393 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dbr6c"] Apr 24 21:16:51.192652 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.192407 2573 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx"] Apr 24 21:16:51.192652 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.192418 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh"] Apr 24 21:16:51.192652 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.192428 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-t99mx"] Apr 24 21:16:51.192652 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.192439 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6dcc796c9-ngqbf"] Apr 24 21:16:51.192652 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.192452 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk"] Apr 24 21:16:51.192652 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.192464 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-m4spp"] Apr 24 21:16:51.192652 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.192468 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mctfb" Apr 24 21:16:51.195171 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.195139 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:16:51.195295 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.195276 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-6k4xl\"" Apr 24 21:16:51.196176 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.196151 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:16:51.209241 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.209217 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mctfb"] Apr 24 21:16:51.209241 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.209242 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m4spp"] Apr 24 21:16:51.209834 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.209257 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nvpkh"] Apr 24 21:16:51.209834 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.209369 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m4spp" Apr 24 21:16:51.213911 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.213871 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:16:51.213911 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.213889 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:16:51.214084 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.213951 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:16:51.214249 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.214223 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-znhx4\"" Apr 24 21:16:51.221717 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.221697 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nvpkh"] Apr 24 21:16:51.221848 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.221835 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:51.224049 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.224027 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:16:51.224151 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.224031 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:16:51.224151 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.224034 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-86llq\"" Apr 24 21:16:51.227012 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.226990 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c5dn\" (UniqueName: \"kubernetes.io/projected/f9353274-ce1e-479b-a277-0a36a39b6fb2-kube-api-access-6c5dn\") pod \"console-operator-9d4b6777b-t99mx\" (UID: \"f9353274-ce1e-479b-a277-0a36a39b6fb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:16:51.227677 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.227575 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2bn5\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-kube-api-access-m2bn5\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.227677 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.227608 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36a65b6d-1c50-425c-911a-eb5c1059cd95-serving-cert\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c" Apr 
24 21:16:51.227677 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.227638 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/36a65b6d-1c50-425c-911a-eb5c1059cd95-tmp\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c" Apr 24 21:16:51.227677 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.227656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36a65b6d-1c50-425c-911a-eb5c1059cd95-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c" Apr 24 21:16:51.228002 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.227696 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9353274-ce1e-479b-a277-0a36a39b6fb2-config\") pod \"console-operator-9d4b6777b-t99mx\" (UID: \"f9353274-ce1e-479b-a277-0a36a39b6fb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:16:51.228002 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.227722 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9353274-ce1e-479b-a277-0a36a39b6fb2-serving-cert\") pod \"console-operator-9d4b6777b-t99mx\" (UID: \"f9353274-ce1e-479b-a277-0a36a39b6fb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:16:51.228002 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.227764 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d1b2baba-a138-4778-ad36-d2c72cf4b2d6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2fhhh\" (UID: \"d1b2baba-a138-4778-ad36-d2c72cf4b2d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" Apr 24 21:16:51.228002 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.227831 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-certificates\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.228002 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.227863 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58v6s\" (UniqueName: \"kubernetes.io/projected/16c39428-4288-4a12-9c01-4c9d16b18faa-kube-api-access-58v6s\") pod \"service-ca-operator-d6fc45fc5-w27kv\" (UID: \"16c39428-4288-4a12-9c01-4c9d16b18faa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" Apr 24 21:16:51.228002 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.227898 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16c39428-4288-4a12-9c01-4c9d16b18faa-serving-cert\") pod \"service-ca-operator-d6fc45fc5-w27kv\" (UID: \"16c39428-4288-4a12-9c01-4c9d16b18faa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" Apr 24 21:16:51.228002 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.227926 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16c39428-4288-4a12-9c01-4c9d16b18faa-config\") pod \"service-ca-operator-d6fc45fc5-w27kv\" (UID: 
\"16c39428-4288-4a12-9c01-4c9d16b18faa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" Apr 24 21:16:51.228002 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.227953 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/36a65b6d-1c50-425c-911a-eb5c1059cd95-snapshots\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c" Apr 24 21:16:51.228456 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228038 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46fpf\" (UniqueName: \"kubernetes.io/projected/36a65b6d-1c50-425c-911a-eb5c1059cd95-kube-api-access-46fpf\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c" Apr 24 21:16:51.228456 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228076 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1b2baba-a138-4778-ad36-d2c72cf4b2d6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-2fhhh\" (UID: \"d1b2baba-a138-4778-ad36-d2c72cf4b2d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" Apr 24 21:16:51.228456 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228077 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/36a65b6d-1c50-425c-911a-eb5c1059cd95-tmp\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c" Apr 24 21:16:51.228456 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228398 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qb6gk\" (UniqueName: \"kubernetes.io/projected/62a47369-6a4f-4ac0-ae3b-559fb4cadc0d-kube-api-access-qb6gk\") pod \"volume-data-source-validator-7c6cbb6c87-lh2r9\" (UID: \"62a47369-6a4f-4ac0-ae3b-559fb4cadc0d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lh2r9" Apr 24 21:16:51.228456 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228436 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9353274-ce1e-479b-a277-0a36a39b6fb2-trusted-ca\") pod \"console-operator-9d4b6777b-t99mx\" (UID: \"f9353274-ce1e-479b-a277-0a36a39b6fb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:16:51.228695 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228468 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/707d4a93-e9f1-4763-bb75-86589c7e8b18-image-registry-private-configuration\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.228695 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228479 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-certificates\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.228695 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36a65b6d-1c50-425c-911a-eb5c1059cd95-service-ca-bundle\") pod 
\"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c" Apr 24 21:16:51.228695 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228533 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/707d4a93-e9f1-4763-bb75-86589c7e8b18-ca-trust-extracted\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.228695 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.228695 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228612 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/707d4a93-e9f1-4763-bb75-86589c7e8b18-installation-pull-secrets\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.228695 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228645 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36a65b6d-1c50-425c-911a-eb5c1059cd95-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c" Apr 24 21:16:51.228695 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228661 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bcvkx\" (UID: \"0dfa190e-0f34-44c5-a71e-ed3a9f7939de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" Apr 24 21:16:51.229083 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228703 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch9sw\" (UniqueName: \"kubernetes.io/projected/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-kube-api-access-ch9sw\") pod \"cluster-samples-operator-6dc5bdb6b4-bcvkx\" (UID: \"0dfa190e-0f34-44c5-a71e-ed3a9f7939de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" Apr 24 21:16:51.229083 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228746 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6jmdp\" (UID: \"fe0af93f-e6da-459a-b345-6cf8c4bcff2f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp" Apr 24 21:16:51.229083 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.228750 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/36a65b6d-1c50-425c-911a-eb5c1059cd95-snapshots\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c" Apr 24 21:16:51.229083 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.228761 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:51.229083 ip-10-0-128-21 
kubenswrapper[2573]: I0424 21:16:51.228773 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/707d4a93-e9f1-4763-bb75-86589c7e8b18-trusted-ca\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.229083 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.228778 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6dcc796c9-ngqbf: secret "image-registry-tls" not found Apr 24 21:16:51.229083 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.229015 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:16:51.229083 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.229073 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/707d4a93-e9f1-4763-bb75-86589c7e8b18-ca-trust-extracted\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.229457 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.229094 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert podName:fe0af93f-e6da-459a-b345-6cf8c4bcff2f nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.729074671 +0000 UTC m=+34.134916116 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6jmdp" (UID: "fe0af93f-e6da-459a-b345-6cf8c4bcff2f") : secret "networking-console-plugin-cert" not found Apr 24 21:16:51.229457 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.229138 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6jmdp\" (UID: \"fe0af93f-e6da-459a-b345-6cf8c4bcff2f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp" Apr 24 21:16:51.229457 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.229172 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgbv5\" (UniqueName: \"kubernetes.io/projected/d1b2baba-a138-4778-ad36-d2c72cf4b2d6-kube-api-access-sgbv5\") pod \"kube-storage-version-migrator-operator-6769c5d45-2fhhh\" (UID: \"d1b2baba-a138-4778-ad36-d2c72cf4b2d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" Apr 24 21:16:51.229457 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.229220 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-bound-sa-token\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.229457 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.229288 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36a65b6d-1c50-425c-911a-eb5c1059cd95-service-ca-bundle\") 
pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c" Apr 24 21:16:51.229457 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.229309 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls podName:707d4a93-e9f1-4763-bb75-86589c7e8b18 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.729295682 +0000 UTC m=+34.135137150 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls") pod "image-registry-6dcc796c9-ngqbf" (UID: "707d4a93-e9f1-4763-bb75-86589c7e8b18") : secret "image-registry-tls" not found Apr 24 21:16:51.229975 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.229954 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6jmdp\" (UID: \"fe0af93f-e6da-459a-b345-6cf8c4bcff2f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp" Apr 24 21:16:51.230159 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.230137 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/707d4a93-e9f1-4763-bb75-86589c7e8b18-trusted-ca\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.230416 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.230400 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36a65b6d-1c50-425c-911a-eb5c1059cd95-serving-cert\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: 
\"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c" Apr 24 21:16:51.231169 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.231149 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/707d4a93-e9f1-4763-bb75-86589c7e8b18-image-registry-private-configuration\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.231382 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.231346 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/707d4a93-e9f1-4763-bb75-86589c7e8b18-installation-pull-secrets\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.237282 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.237264 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2bn5\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-kube-api-access-m2bn5\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.239292 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.239273 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46fpf\" (UniqueName: \"kubernetes.io/projected/36a65b6d-1c50-425c-911a-eb5c1059cd95-kube-api-access-46fpf\") pod \"insights-operator-585dfdc468-dbr6c\" (UID: \"36a65b6d-1c50-425c-911a-eb5c1059cd95\") " pod="openshift-insights/insights-operator-585dfdc468-dbr6c" Apr 24 21:16:51.239705 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.239674 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qb6gk\" (UniqueName: \"kubernetes.io/projected/62a47369-6a4f-4ac0-ae3b-559fb4cadc0d-kube-api-access-qb6gk\") pod \"volume-data-source-validator-7c6cbb6c87-lh2r9\" (UID: \"62a47369-6a4f-4ac0-ae3b-559fb4cadc0d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lh2r9" Apr 24 21:16:51.239827 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.239766 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-bound-sa-token\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.329970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.329932 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6c5dn\" (UniqueName: \"kubernetes.io/projected/f9353274-ce1e-479b-a277-0a36a39b6fb2-kube-api-access-6c5dn\") pod \"console-operator-9d4b6777b-t99mx\" (UID: \"f9353274-ce1e-479b-a277-0a36a39b6fb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:16:51.330133 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.329990 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz2jj\" (UniqueName: \"kubernetes.io/projected/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-kube-api-access-hz2jj\") pod \"ingress-canary-m4spp\" (UID: \"78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77\") " pod="openshift-ingress-canary/ingress-canary-m4spp" Apr 24 21:16:51.330133 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330028 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9353274-ce1e-479b-a277-0a36a39b6fb2-config\") pod \"console-operator-9d4b6777b-t99mx\" (UID: \"f9353274-ce1e-479b-a277-0a36a39b6fb2\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:16:51.330133 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330049 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9353274-ce1e-479b-a277-0a36a39b6fb2-serving-cert\") pod \"console-operator-9d4b6777b-t99mx\" (UID: \"f9353274-ce1e-479b-a277-0a36a39b6fb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:16:51.330133 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330076 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1b2baba-a138-4778-ad36-d2c72cf4b2d6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2fhhh\" (UID: \"d1b2baba-a138-4778-ad36-d2c72cf4b2d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" Apr 24 21:16:51.330272 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330133 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wmgh\" (UniqueName: \"kubernetes.io/projected/e7d157a6-5982-4a38-b8d0-15d88309963a-kube-api-access-7wmgh\") pod \"network-check-source-8894fc9bd-mctfb\" (UID: \"e7d157a6-5982-4a38-b8d0-15d88309963a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mctfb" Apr 24 21:16:51.330272 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330171 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m87ss\" (UniqueName: \"kubernetes.io/projected/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-kube-api-access-m87ss\") pod \"cluster-monitoring-operator-75587bd455-47dvk\" (UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:16:51.330272 ip-10-0-128-21 kubenswrapper[2573]: 
I0424 21:16:51.330214 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58v6s\" (UniqueName: \"kubernetes.io/projected/16c39428-4288-4a12-9c01-4c9d16b18faa-kube-api-access-58v6s\") pod \"service-ca-operator-d6fc45fc5-w27kv\" (UID: \"16c39428-4288-4a12-9c01-4c9d16b18faa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" Apr 24 21:16:51.330272 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330243 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16c39428-4288-4a12-9c01-4c9d16b18faa-serving-cert\") pod \"service-ca-operator-d6fc45fc5-w27kv\" (UID: \"16c39428-4288-4a12-9c01-4c9d16b18faa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" Apr 24 21:16:51.330272 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330266 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert\") pod \"ingress-canary-m4spp\" (UID: \"78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77\") " pod="openshift-ingress-canary/ingress-canary-m4spp" Apr 24 21:16:51.330539 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330288 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16c39428-4288-4a12-9c01-4c9d16b18faa-config\") pod \"service-ca-operator-d6fc45fc5-w27kv\" (UID: \"16c39428-4288-4a12-9c01-4c9d16b18faa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" Apr 24 21:16:51.330539 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1b2baba-a138-4778-ad36-d2c72cf4b2d6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-2fhhh\" (UID: 
\"d1b2baba-a138-4778-ad36-d2c72cf4b2d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" Apr 24 21:16:51.330539 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330338 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42e99775-4de5-4bed-b01a-a3218d41d996-config-volume\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:51.330539 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9353274-ce1e-479b-a277-0a36a39b6fb2-trusted-ca\") pod \"console-operator-9d4b6777b-t99mx\" (UID: \"f9353274-ce1e-479b-a277-0a36a39b6fb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:16:51.330539 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330428 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/42e99775-4de5-4bed-b01a-a3218d41d996-tmp-dir\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:51.330539 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:51.330813 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330756 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bcvkx\" (UID: \"0dfa190e-0f34-44c5-a71e-ed3a9f7939de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" Apr 24 21:16:51.330813 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330796 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47dvk\" (UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:16:51.330908 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330831 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ch9sw\" (UniqueName: \"kubernetes.io/projected/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-kube-api-access-ch9sw\") pod \"cluster-samples-operator-6dc5bdb6b4-bcvkx\" (UID: \"0dfa190e-0f34-44c5-a71e-ed3a9f7939de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" Apr 24 21:16:51.330908 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330847 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9353274-ce1e-479b-a277-0a36a39b6fb2-config\") pod \"console-operator-9d4b6777b-t99mx\" (UID: \"f9353274-ce1e-479b-a277-0a36a39b6fb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:16:51.330908 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330871 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-47dvk\" 
(UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:16:51.330908 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330904 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf9mv\" (UniqueName: \"kubernetes.io/projected/42e99775-4de5-4bed-b01a-a3218d41d996-kube-api-access-xf9mv\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:51.331156 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.330940 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgbv5\" (UniqueName: \"kubernetes.io/projected/d1b2baba-a138-4778-ad36-d2c72cf4b2d6-kube-api-access-sgbv5\") pod \"kube-storage-version-migrator-operator-6769c5d45-2fhhh\" (UID: \"d1b2baba-a138-4778-ad36-d2c72cf4b2d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" Apr 24 21:16:51.331156 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.330954 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:16:51.331156 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.331010 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls podName:0dfa190e-0f34-44c5-a71e-ed3a9f7939de nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.830993014 +0000 UTC m=+34.236834462 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bcvkx" (UID: "0dfa190e-0f34-44c5-a71e-ed3a9f7939de") : secret "samples-operator-tls" not found Apr 24 21:16:51.331156 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.331031 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1b2baba-a138-4778-ad36-d2c72cf4b2d6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-2fhhh\" (UID: \"d1b2baba-a138-4778-ad36-d2c72cf4b2d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" Apr 24 21:16:51.331156 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.331143 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16c39428-4288-4a12-9c01-4c9d16b18faa-config\") pod \"service-ca-operator-d6fc45fc5-w27kv\" (UID: \"16c39428-4288-4a12-9c01-4c9d16b18faa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" Apr 24 21:16:51.331406 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.331305 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9353274-ce1e-479b-a277-0a36a39b6fb2-trusted-ca\") pod \"console-operator-9d4b6777b-t99mx\" (UID: \"f9353274-ce1e-479b-a277-0a36a39b6fb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:16:51.333059 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.333038 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16c39428-4288-4a12-9c01-4c9d16b18faa-serving-cert\") pod \"service-ca-operator-d6fc45fc5-w27kv\" (UID: \"16c39428-4288-4a12-9c01-4c9d16b18faa\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" Apr 24 21:16:51.333160 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.333043 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1b2baba-a138-4778-ad36-d2c72cf4b2d6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2fhhh\" (UID: \"d1b2baba-a138-4778-ad36-d2c72cf4b2d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" Apr 24 21:16:51.333160 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.333096 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9353274-ce1e-479b-a277-0a36a39b6fb2-serving-cert\") pod \"console-operator-9d4b6777b-t99mx\" (UID: \"f9353274-ce1e-479b-a277-0a36a39b6fb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:16:51.337449 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.337430 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lh2r9" Apr 24 21:16:51.339456 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.339431 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgbv5\" (UniqueName: \"kubernetes.io/projected/d1b2baba-a138-4778-ad36-d2c72cf4b2d6-kube-api-access-sgbv5\") pod \"kube-storage-version-migrator-operator-6769c5d45-2fhhh\" (UID: \"d1b2baba-a138-4778-ad36-d2c72cf4b2d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" Apr 24 21:16:51.339966 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.339947 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c5dn\" (UniqueName: \"kubernetes.io/projected/f9353274-ce1e-479b-a277-0a36a39b6fb2-kube-api-access-6c5dn\") pod \"console-operator-9d4b6777b-t99mx\" (UID: \"f9353274-ce1e-479b-a277-0a36a39b6fb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:16:51.340180 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.340159 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58v6s\" (UniqueName: \"kubernetes.io/projected/16c39428-4288-4a12-9c01-4c9d16b18faa-kube-api-access-58v6s\") pod \"service-ca-operator-d6fc45fc5-w27kv\" (UID: \"16c39428-4288-4a12-9c01-4c9d16b18faa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" Apr 24 21:16:51.340967 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.340950 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch9sw\" (UniqueName: \"kubernetes.io/projected/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-kube-api-access-ch9sw\") pod \"cluster-samples-operator-6dc5bdb6b4-bcvkx\" (UID: \"0dfa190e-0f34-44c5-a71e-ed3a9f7939de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" Apr 24 21:16:51.360851 
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.360819 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-dbr6c" Apr 24 21:16:51.406010 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.405921 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" Apr 24 21:16:51.431925 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.431893 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m87ss\" (UniqueName: \"kubernetes.io/projected/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-kube-api-access-m87ss\") pod \"cluster-monitoring-operator-75587bd455-47dvk\" (UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:16:51.432070 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.431937 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert\") pod \"ingress-canary-m4spp\" (UID: \"78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77\") " pod="openshift-ingress-canary/ingress-canary-m4spp" Apr 24 21:16:51.432070 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.431969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42e99775-4de5-4bed-b01a-a3218d41d996-config-volume\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:51.432070 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.432008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/42e99775-4de5-4bed-b01a-a3218d41d996-tmp-dir\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " 
pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:51.432070 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.432031 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:51.432284 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.432102 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47dvk\" (UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:16:51.432284 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.432142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-47dvk\" (UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:16:51.432284 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.432166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xf9mv\" (UniqueName: \"kubernetes.io/projected/42e99775-4de5-4bed-b01a-a3218d41d996-kube-api-access-xf9mv\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:51.432284 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.432235 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz2jj\" (UniqueName: 
\"kubernetes.io/projected/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-kube-api-access-hz2jj\") pod \"ingress-canary-m4spp\" (UID: \"78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77\") " pod="openshift-ingress-canary/ingress-canary-m4spp" Apr 24 21:16:51.432492 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.432286 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wmgh\" (UniqueName: \"kubernetes.io/projected/e7d157a6-5982-4a38-b8d0-15d88309963a-kube-api-access-7wmgh\") pod \"network-check-source-8894fc9bd-mctfb\" (UID: \"e7d157a6-5982-4a38-b8d0-15d88309963a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mctfb" Apr 24 21:16:51.432492 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.432419 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:51.432492 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.432491 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls podName:42e99775-4de5-4bed-b01a-a3218d41d996 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.932471396 +0000 UTC m=+34.338312841 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls") pod "dns-default-nvpkh" (UID: "42e99775-4de5-4bed-b01a-a3218d41d996") : secret "dns-default-metrics-tls" not found Apr 24 21:16:51.432639 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.432560 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:51.432639 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.432624 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert podName:78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.932609959 +0000 UTC m=+34.338451406 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert") pod "ingress-canary-m4spp" (UID: "78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77") : secret "canary-serving-cert" not found Apr 24 21:16:51.432737 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.432685 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:51.432737 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.432720 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls podName:32f8c25b-fb1f-4a40-b2ee-4f7db45184f1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.932709583 +0000 UTC m=+34.338551017 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-47dvk" (UID: "32f8c25b-fb1f-4a40-b2ee-4f7db45184f1") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:51.432834 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.432809 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/42e99775-4de5-4bed-b01a-a3218d41d996-tmp-dir\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:51.433179 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.433159 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42e99775-4de5-4bed-b01a-a3218d41d996-config-volume\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:51.433248 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.433177 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-47dvk\" (UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:16:51.443554 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.443484 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf9mv\" (UniqueName: \"kubernetes.io/projected/42e99775-4de5-4bed-b01a-a3218d41d996-kube-api-access-xf9mv\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:51.444670 ip-10-0-128-21 kubenswrapper[2573]: I0424 
21:16:51.444641 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wmgh\" (UniqueName: \"kubernetes.io/projected/e7d157a6-5982-4a38-b8d0-15d88309963a-kube-api-access-7wmgh\") pod \"network-check-source-8894fc9bd-mctfb\" (UID: \"e7d157a6-5982-4a38-b8d0-15d88309963a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mctfb" Apr 24 21:16:51.444796 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.444712 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz2jj\" (UniqueName: \"kubernetes.io/projected/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-kube-api-access-hz2jj\") pod \"ingress-canary-m4spp\" (UID: \"78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77\") " pod="openshift-ingress-canary/ingress-canary-m4spp" Apr 24 21:16:51.445150 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.445128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m87ss\" (UniqueName: \"kubernetes.io/projected/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-kube-api-access-m87ss\") pod \"cluster-monitoring-operator-75587bd455-47dvk\" (UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:16:51.448371 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.448326 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:16:51.456727 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.456705 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" Apr 24 21:16:51.502322 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.501564 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mctfb" Apr 24 21:16:51.635066 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.634281 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:16:51.635066 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.634401 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:16:51.635066 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.634589 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle podName:5f412e6f-9e0c-44f5-b798-012969c57865 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:52.634569969 +0000 UTC m=+35.040411404 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle") pod "router-default-74fb58c7f4-9dgzg" (UID: "5f412e6f-9e0c-44f5-b798-012969c57865") : configmap references non-existent config key: service-ca.crt Apr 24 21:16:51.635066 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.634729 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:16:51.635066 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.634786 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs podName:5f412e6f-9e0c-44f5-b798-012969c57865 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:52.634768908 +0000 UTC m=+35.040610360 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs") pod "router-default-74fb58c7f4-9dgzg" (UID: "5f412e6f-9e0c-44f5-b798-012969c57865") : secret "router-metrics-certs-default" not found Apr 24 21:16:51.640329 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.640273 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv"] Apr 24 21:16:51.652644 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:51.652606 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16c39428_4288_4a12_9c01_4c9d16b18faa.slice/crio-2201509ec6fe1c9c9c1b3d96b7703067a9b1e08f254e05a383b7a6d158a7aaed WatchSource:0}: Error finding container 2201509ec6fe1c9c9c1b3d96b7703067a9b1e08f254e05a383b7a6d158a7aaed: Status 404 returned error can't find the container with id 2201509ec6fe1c9c9c1b3d96b7703067a9b1e08f254e05a383b7a6d158a7aaed Apr 24 21:16:51.663848 ip-10-0-128-21 
kubenswrapper[2573]: I0424 21:16:51.663822 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dbr6c"] Apr 24 21:16:51.668952 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.668906 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lh2r9"] Apr 24 21:16:51.673572 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:51.673474 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36a65b6d_1c50_425c_911a_eb5c1059cd95.slice/crio-27e2aea47c65e373b81bbe1635fea6d295b7b1796aba7f43a875ded69b6831ce WatchSource:0}: Error finding container 27e2aea47c65e373b81bbe1635fea6d295b7b1796aba7f43a875ded69b6831ce: Status 404 returned error can't find the container with id 27e2aea47c65e373b81bbe1635fea6d295b7b1796aba7f43a875ded69b6831ce Apr 24 21:16:51.676795 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:51.676775 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62a47369_6a4f_4ac0_ae3b_559fb4cadc0d.slice/crio-4f62df3b53d026f5f3dcff3f5dbca3183adccea7977d403d50401016aac33d66 WatchSource:0}: Error finding container 4f62df3b53d026f5f3dcff3f5dbca3183adccea7977d403d50401016aac33d66: Status 404 returned error can't find the container with id 4f62df3b53d026f5f3dcff3f5dbca3183adccea7977d403d50401016aac33d66 Apr 24 21:16:51.693050 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.693021 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh"] Apr 24 21:16:51.698004 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.697969 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-t99mx"] Apr 24 21:16:51.700897 ip-10-0-128-21 
kubenswrapper[2573]: W0424 21:16:51.700870 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9353274_ce1e_479b_a277_0a36a39b6fb2.slice/crio-4cafb64122d3add2eb6bb8c421b8ac616325bd28479129859f9d42a55acc850d WatchSource:0}: Error finding container 4cafb64122d3add2eb6bb8c421b8ac616325bd28479129859f9d42a55acc850d: Status 404 returned error can't find the container with id 4cafb64122d3add2eb6bb8c421b8ac616325bd28479129859f9d42a55acc850d Apr 24 21:16:51.720421 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.720341 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mctfb"] Apr 24 21:16:51.735040 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.735013 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6jmdp\" (UID: \"fe0af93f-e6da-459a-b345-6cf8c4bcff2f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp" Apr 24 21:16:51.735174 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.735159 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:16:51.735208 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.735188 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:51.735244 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.735220 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert podName:fe0af93f-e6da-459a-b345-6cf8c4bcff2f nodeName:}" failed. No retries permitted until 2026-04-24 21:16:52.735200432 +0000 UTC m=+35.141041869 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6jmdp" (UID: "fe0af93f-e6da-459a-b345-6cf8c4bcff2f") : secret "networking-console-plugin-cert" not found Apr 24 21:16:51.735284 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.735265 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:51.735284 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.735277 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6dcc796c9-ngqbf: secret "image-registry-tls" not found Apr 24 21:16:51.735342 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.735316 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls podName:707d4a93-e9f1-4763-bb75-86589c7e8b18 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:52.735305773 +0000 UTC m=+35.141147215 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls") pod "image-registry-6dcc796c9-ngqbf" (UID: "707d4a93-e9f1-4763-bb75-86589c7e8b18") : secret "image-registry-tls" not found Apr 24 21:16:51.835785 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.835757 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bcvkx\" (UID: \"0dfa190e-0f34-44c5-a71e-ed3a9f7939de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" Apr 24 21:16:51.835915 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.835850 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs\") pod \"network-metrics-daemon-h5m79\" (UID: \"a6ad0fc1-fbd1-4133-8616-3b950995f8e4\") " pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:51.835971 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.835911 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:16:51.836018 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.835979 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls podName:0dfa190e-0f34-44c5-a71e-ed3a9f7939de nodeName:}" failed. No retries permitted until 2026-04-24 21:16:52.835964088 +0000 UTC m=+35.241805539 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bcvkx" (UID: "0dfa190e-0f34-44c5-a71e-ed3a9f7939de") : secret "samples-operator-tls" not found Apr 24 21:16:51.836074 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.836008 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:51.836114 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.836080 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs podName:a6ad0fc1-fbd1-4133-8616-3b950995f8e4 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:23.836061953 +0000 UTC m=+66.241903395 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs") pod "network-metrics-daemon-h5m79" (UID: "a6ad0fc1-fbd1-4133-8616-3b950995f8e4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:51.936618 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.936525 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:51.936799 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.936676 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47dvk\" (UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:16:51.936799 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.936741 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9bj\" (UniqueName: \"kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj\") pod \"network-check-target-tzpnt\" (UID: \"89ab8923-5f3a-4535-9d3f-e72f739904d4\") " pod="openshift-network-diagnostics/network-check-target-tzpnt" Apr 24 21:16:51.936799 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.936753 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:51.936944 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.936822 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:51.936944 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.936834 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls podName:42e99775-4de5-4bed-b01a-a3218d41d996 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:52.936812701 +0000 UTC m=+35.342654146 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls") pod "dns-default-nvpkh" (UID: "42e99775-4de5-4bed-b01a-a3218d41d996") : secret "dns-default-metrics-tls" not found Apr 24 21:16:51.936944 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.936878 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls podName:32f8c25b-fb1f-4a40-b2ee-4f7db45184f1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:52.936865086 +0000 UTC m=+35.342706516 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-47dvk" (UID: "32f8c25b-fb1f-4a40-b2ee-4f7db45184f1") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:51.936944 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.936922 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert\") pod \"ingress-canary-m4spp\" (UID: \"78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77\") " pod="openshift-ingress-canary/ingress-canary-m4spp" Apr 24 21:16:51.937126 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.937025 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:51.937126 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:51.937060 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert podName:78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:52.937050477 +0000 UTC m=+35.342891906 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert") pod "ingress-canary-m4spp" (UID: "78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77") : secret "canary-serving-cert" not found Apr 24 21:16:51.940261 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:51.940237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg9bj\" (UniqueName: \"kubernetes.io/projected/89ab8923-5f3a-4535-9d3f-e72f739904d4-kube-api-access-jg9bj\") pod \"network-check-target-tzpnt\" (UID: \"89ab8923-5f3a-4535-9d3f-e72f739904d4\") " pod="openshift-network-diagnostics/network-check-target-tzpnt" Apr 24 21:16:52.211210 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.211123 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn" Apr 24 21:16:52.211718 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.211243 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:16:52.211718 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.211273 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt" Apr 24 21:16:52.218225 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.218078 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:16:52.218415 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.218242 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:16:52.218415 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.218249 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jphgp\"" Apr 24 21:16:52.218415 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.218330 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w24qg\"" Apr 24 21:16:52.240426 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.240405 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzpnt" Apr 24 21:16:52.387609 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.387544 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lh2r9" event={"ID":"62a47369-6a4f-4ac0-ae3b-559fb4cadc0d","Type":"ContainerStarted","Data":"4f62df3b53d026f5f3dcff3f5dbca3183adccea7977d403d50401016aac33d66"} Apr 24 21:16:52.389299 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.389245 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mctfb" event={"ID":"e7d157a6-5982-4a38-b8d0-15d88309963a","Type":"ContainerStarted","Data":"01c380ef04391f05e15834ce2f54582c682f203e52628e6c4886ebddf4ee02e2"} Apr 24 21:16:52.391564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.391507 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" event={"ID":"d1b2baba-a138-4778-ad36-d2c72cf4b2d6","Type":"ContainerStarted","Data":"a03877a5501c8579f3bc55099732503e3e299acb4124a84a9c35918538b27192"} Apr 24 21:16:52.393295 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.393267 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" event={"ID":"16c39428-4288-4a12-9c01-4c9d16b18faa","Type":"ContainerStarted","Data":"2201509ec6fe1c9c9c1b3d96b7703067a9b1e08f254e05a383b7a6d158a7aaed"} Apr 24 21:16:52.398544 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.397434 2573 generic.go:358] "Generic (PLEG): container finished" podID="239caad5-0402-47f0-8e15-7f5d02343638" containerID="1ab7036bdc316d916eeec9899a2a101aa816a60d6c4d65bca4c308340e3c5460" exitCode=0 Apr 24 21:16:52.398544 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.397528 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-cph25" event={"ID":"239caad5-0402-47f0-8e15-7f5d02343638","Type":"ContainerDied","Data":"1ab7036bdc316d916eeec9899a2a101aa816a60d6c4d65bca4c308340e3c5460"} Apr 24 21:16:52.402153 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.402088 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tzpnt"] Apr 24 21:16:52.402773 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.402668 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" event={"ID":"f9353274-ce1e-479b-a277-0a36a39b6fb2","Type":"ContainerStarted","Data":"4cafb64122d3add2eb6bb8c421b8ac616325bd28479129859f9d42a55acc850d"} Apr 24 21:16:52.407247 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.405856 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dbr6c" event={"ID":"36a65b6d-1c50-425c-911a-eb5c1059cd95","Type":"ContainerStarted","Data":"27e2aea47c65e373b81bbe1635fea6d295b7b1796aba7f43a875ded69b6831ce"} Apr 24 21:16:52.644933 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.644826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:16:52.644933 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.644921 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:16:52.645063 ip-10-0-128-21 
kubenswrapper[2573]: E0424 21:16:52.645025 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:16:52.645112 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.645101 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs podName:5f412e6f-9e0c-44f5-b798-012969c57865 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:54.645078998 +0000 UTC m=+37.050920443 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs") pod "router-default-74fb58c7f4-9dgzg" (UID: "5f412e6f-9e0c-44f5-b798-012969c57865") : secret "router-metrics-certs-default" not found Apr 24 21:16:52.645170 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.645163 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle podName:5f412e6f-9e0c-44f5-b798-012969c57865 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:54.645146195 +0000 UTC m=+37.050987641 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle") pod "router-default-74fb58c7f4-9dgzg" (UID: "5f412e6f-9e0c-44f5-b798-012969c57865") : configmap references non-existent config key: service-ca.crt Apr 24 21:16:52.746884 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.746618 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:52.747063 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.746932 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6jmdp\" (UID: \"fe0af93f-e6da-459a-b345-6cf8c4bcff2f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp" Apr 24 21:16:52.747181 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.747092 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:16:52.747181 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.747156 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert podName:fe0af93f-e6da-459a-b345-6cf8c4bcff2f nodeName:}" failed. No retries permitted until 2026-04-24 21:16:54.747135977 +0000 UTC m=+37.152977418 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6jmdp" (UID: "fe0af93f-e6da-459a-b345-6cf8c4bcff2f") : secret "networking-console-plugin-cert" not found Apr 24 21:16:52.747602 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.747584 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:52.747690 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.747606 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6dcc796c9-ngqbf: secret "image-registry-tls" not found Apr 24 21:16:52.747690 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.747651 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls podName:707d4a93-e9f1-4763-bb75-86589c7e8b18 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:54.747635226 +0000 UTC m=+37.153476672 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls") pod "image-registry-6dcc796c9-ngqbf" (UID: "707d4a93-e9f1-4763-bb75-86589c7e8b18") : secret "image-registry-tls" not found Apr 24 21:16:52.848317 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.848278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bcvkx\" (UID: \"0dfa190e-0f34-44c5-a71e-ed3a9f7939de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" Apr 24 21:16:52.848516 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.848455 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:16:52.848577 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.848518 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls podName:0dfa190e-0f34-44c5-a71e-ed3a9f7939de nodeName:}" failed. No retries permitted until 2026-04-24 21:16:54.848500211 +0000 UTC m=+37.254341644 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bcvkx" (UID: "0dfa190e-0f34-44c5-a71e-ed3a9f7939de") : secret "samples-operator-tls" not found Apr 24 21:16:52.949666 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.949634 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47dvk\" (UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:16:52.949838 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.949728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert\") pod \"ingress-canary-m4spp\" (UID: \"78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77\") " pod="openshift-ingress-canary/ingress-canary-m4spp" Apr 24 21:16:52.949838 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:52.949755 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:52.949956 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.949857 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:52.949956 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.949903 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls 
podName:42e99775-4de5-4bed-b01a-a3218d41d996 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:54.949889565 +0000 UTC m=+37.355730994 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls") pod "dns-default-nvpkh" (UID: "42e99775-4de5-4bed-b01a-a3218d41d996") : secret "dns-default-metrics-tls" not found Apr 24 21:16:52.950105 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.950089 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:52.950178 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.950106 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:52.950178 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.950136 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert podName:78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:54.950123085 +0000 UTC m=+37.355964514 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert") pod "ingress-canary-m4spp" (UID: "78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77") : secret "canary-serving-cert" not found Apr 24 21:16:52.950178 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:52.950153 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls podName:32f8c25b-fb1f-4a40-b2ee-4f7db45184f1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:54.950143692 +0000 UTC m=+37.355985123 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-47dvk" (UID: "32f8c25b-fb1f-4a40-b2ee-4f7db45184f1") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:53.422094 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:53.422011 2573 generic.go:358] "Generic (PLEG): container finished" podID="239caad5-0402-47f0-8e15-7f5d02343638" containerID="e7ea0c8d411a6bb94157bd1acb9e140c97da89f18584cead598143745a9ed09d" exitCode=0 Apr 24 21:16:53.422546 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:53.422118 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cph25" event={"ID":"239caad5-0402-47f0-8e15-7f5d02343638","Type":"ContainerDied","Data":"e7ea0c8d411a6bb94157bd1acb9e140c97da89f18584cead598143745a9ed09d"} Apr 24 21:16:53.426428 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:53.426395 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tzpnt" event={"ID":"89ab8923-5f3a-4535-9d3f-e72f739904d4","Type":"ContainerStarted","Data":"35bbbdb557b904ca858d400e1b348b425b4b31b0e5b2bd94d9678c2061be0508"} Apr 24 21:16:54.671445 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:54.671401 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:16:54.671919 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:54.671521 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle\") 
pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:16:54.671919 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.671565 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:16:54.671919 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.671658 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs podName:5f412e6f-9e0c-44f5-b798-012969c57865 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:58.671636415 +0000 UTC m=+41.077477863 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs") pod "router-default-74fb58c7f4-9dgzg" (UID: "5f412e6f-9e0c-44f5-b798-012969c57865") : secret "router-metrics-certs-default" not found Apr 24 21:16:54.671919 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.671691 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle podName:5f412e6f-9e0c-44f5-b798-012969c57865 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:58.671674989 +0000 UTC m=+41.077516458 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle") pod "router-default-74fb58c7f4-9dgzg" (UID: "5f412e6f-9e0c-44f5-b798-012969c57865") : configmap references non-existent config key: service-ca.crt Apr 24 21:16:54.772144 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:54.772108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:54.772326 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:54.772181 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6jmdp\" (UID: \"fe0af93f-e6da-459a-b345-6cf8c4bcff2f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp" Apr 24 21:16:54.772326 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.772292 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:16:54.772488 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.772295 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:54.772488 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.772383 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert podName:fe0af93f-e6da-459a-b345-6cf8c4bcff2f nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:58.7723471 +0000 UTC m=+41.178188551 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6jmdp" (UID: "fe0af93f-e6da-459a-b345-6cf8c4bcff2f") : secret "networking-console-plugin-cert" not found Apr 24 21:16:54.772488 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.772388 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6dcc796c9-ngqbf: secret "image-registry-tls" not found Apr 24 21:16:54.772488 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.772452 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls podName:707d4a93-e9f1-4763-bb75-86589c7e8b18 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:58.772431708 +0000 UTC m=+41.178273141 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls") pod "image-registry-6dcc796c9-ngqbf" (UID: "707d4a93-e9f1-4763-bb75-86589c7e8b18") : secret "image-registry-tls" not found Apr 24 21:16:54.873764 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:54.873729 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn" Apr 24 21:16:54.873953 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:54.873935 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bcvkx\" (UID: \"0dfa190e-0f34-44c5-a71e-ed3a9f7939de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" Apr 24 21:16:54.874104 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.874064 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:16:54.874167 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.874151 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls podName:0dfa190e-0f34-44c5-a71e-ed3a9f7939de nodeName:}" failed. No retries permitted until 2026-04-24 21:16:58.874129005 +0000 UTC m=+41.279970449 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bcvkx" (UID: "0dfa190e-0f34-44c5-a71e-ed3a9f7939de") : secret "samples-operator-tls" not found Apr 24 21:16:54.877371 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:54.877328 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38a98561-29f3-47af-9151-b0d0095b287e-original-pull-secret\") pod \"global-pull-secret-syncer-sbxgn\" (UID: \"38a98561-29f3-47af-9151-b0d0095b287e\") " pod="kube-system/global-pull-secret-syncer-sbxgn" Apr 24 21:16:54.925172 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:54.925089 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sbxgn" Apr 24 21:16:54.975228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:54.975189 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert\") pod \"ingress-canary-m4spp\" (UID: \"78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77\") " pod="openshift-ingress-canary/ingress-canary-m4spp" Apr 24 21:16:54.975447 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:54.975247 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:54.975447 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:54.975313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-47dvk\" (UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:16:54.975447 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.975386 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:54.975447 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.975438 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:54.975624 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.975482 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:54.975624 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.975506 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert podName:78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:58.975444342 +0000 UTC m=+41.381285799 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert") pod "ingress-canary-m4spp" (UID: "78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77") : secret "canary-serving-cert" not found Apr 24 21:16:54.975624 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.975531 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls podName:32f8c25b-fb1f-4a40-b2ee-4f7db45184f1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:58.975515036 +0000 UTC m=+41.381356480 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-47dvk" (UID: "32f8c25b-fb1f-4a40-b2ee-4f7db45184f1") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:54.975624 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:54.975549 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls podName:42e99775-4de5-4bed-b01a-a3218d41d996 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:58.975539992 +0000 UTC m=+41.381381423 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls") pod "dns-default-nvpkh" (UID: "42e99775-4de5-4bed-b01a-a3218d41d996") : secret "dns-default-metrics-tls" not found Apr 24 21:16:58.559804 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:58.559741 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-sbxgn"] Apr 24 21:16:58.570663 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:16:58.570627 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38a98561_29f3_47af_9151_b0d0095b287e.slice/crio-5320d58733721efb570f98bfabd8b8086249c4463f8496807df8f996b16675c5 WatchSource:0}: Error finding container 5320d58733721efb570f98bfabd8b8086249c4463f8496807df8f996b16675c5: Status 404 returned error can't find the container with id 5320d58733721efb570f98bfabd8b8086249c4463f8496807df8f996b16675c5 Apr 24 21:16:58.708902 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:58.708863 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs\") pod 
\"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:16:58.709051 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:58.708963 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:16:58.709111 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:58.709042 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:16:58.709166 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:58.709110 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle podName:5f412e6f-9e0c-44f5-b798-012969c57865 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:06.709092671 +0000 UTC m=+49.114934103 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle") pod "router-default-74fb58c7f4-9dgzg" (UID: "5f412e6f-9e0c-44f5-b798-012969c57865") : configmap references non-existent config key: service-ca.crt Apr 24 21:16:58.709166 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:58.709130 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs podName:5f412e6f-9e0c-44f5-b798-012969c57865 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:06.709120865 +0000 UTC m=+49.114962298 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs") pod "router-default-74fb58c7f4-9dgzg" (UID: "5f412e6f-9e0c-44f5-b798-012969c57865") : secret "router-metrics-certs-default" not found Apr 24 21:16:58.809742 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:58.809700 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:16:58.809840 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:58.809779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6jmdp\" (UID: \"fe0af93f-e6da-459a-b345-6cf8c4bcff2f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp" Apr 24 21:16:58.810026 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:58.810000 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:16:58.810105 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:58.810070 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert podName:fe0af93f-e6da-459a-b345-6cf8c4bcff2f nodeName:}" failed. No retries permitted until 2026-04-24 21:17:06.810049589 +0000 UTC m=+49.215891031 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6jmdp" (UID: "fe0af93f-e6da-459a-b345-6cf8c4bcff2f") : secret "networking-console-plugin-cert" not found Apr 24 21:16:58.810169 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:58.810143 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:58.810169 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:58.810157 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6dcc796c9-ngqbf: secret "image-registry-tls" not found Apr 24 21:16:58.810263 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:58.810193 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls podName:707d4a93-e9f1-4763-bb75-86589c7e8b18 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:06.810179703 +0000 UTC m=+49.216021135 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls") pod "image-registry-6dcc796c9-ngqbf" (UID: "707d4a93-e9f1-4763-bb75-86589c7e8b18") : secret "image-registry-tls" not found Apr 24 21:16:58.910769 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:58.910695 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bcvkx\" (UID: \"0dfa190e-0f34-44c5-a71e-ed3a9f7939de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" Apr 24 21:16:58.910924 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:58.910885 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:16:58.910976 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:58.910967 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls podName:0dfa190e-0f34-44c5-a71e-ed3a9f7939de nodeName:}" failed. No retries permitted until 2026-04-24 21:17:06.910947171 +0000 UTC m=+49.316788607 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bcvkx" (UID: "0dfa190e-0f34-44c5-a71e-ed3a9f7939de") : secret "samples-operator-tls" not found Apr 24 21:16:59.011750 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.011700 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert\") pod \"ingress-canary-m4spp\" (UID: \"78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77\") " pod="openshift-ingress-canary/ingress-canary-m4spp" Apr 24 21:16:59.012019 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.011800 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:16:59.012019 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.011876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47dvk\" (UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:16:59.012019 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:59.011905 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:59.012019 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:59.011982 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert podName:78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77 
nodeName:}" failed. No retries permitted until 2026-04-24 21:17:07.011958008 +0000 UTC m=+49.417799449 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert") pod "ingress-canary-m4spp" (UID: "78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77") : secret "canary-serving-cert" not found Apr 24 21:16:59.012019 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:59.012009 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:59.012284 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:59.012072 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls podName:32f8c25b-fb1f-4a40-b2ee-4f7db45184f1 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:07.012053805 +0000 UTC m=+49.417895249 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-47dvk" (UID: "32f8c25b-fb1f-4a40-b2ee-4f7db45184f1") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:59.012284 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:59.012128 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:59.012284 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:16:59.012166 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls podName:42e99775-4de5-4bed-b01a-a3218d41d996 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:07.012155875 +0000 UTC m=+49.417997311 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls") pod "dns-default-nvpkh" (UID: "42e99775-4de5-4bed-b01a-a3218d41d996") : secret "dns-default-metrics-tls" not found Apr 24 21:16:59.445700 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.444807 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cph25" event={"ID":"239caad5-0402-47f0-8e15-7f5d02343638","Type":"ContainerStarted","Data":"dd21ae9ff1a90d1693e91c48cbb9895d03d5939ee44ab7633b9daeb5acca6499"} Apr 24 21:16:59.446701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.446643 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-sbxgn" event={"ID":"38a98561-29f3-47af-9151-b0d0095b287e","Type":"ContainerStarted","Data":"5320d58733721efb570f98bfabd8b8086249c4463f8496807df8f996b16675c5"} Apr 24 21:16:59.447988 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.447917 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tzpnt" event={"ID":"89ab8923-5f3a-4535-9d3f-e72f739904d4","Type":"ContainerStarted","Data":"04f0c27e733dc712d74e34c5885bbbc6698414761a093e77b8fac6b71d66cd4c"} Apr 24 21:16:59.448439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.448421 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-tzpnt" Apr 24 21:16:59.453099 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.452755 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/0.log" Apr 24 21:16:59.453099 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.452793 2573 generic.go:358] "Generic (PLEG): container finished" podID="f9353274-ce1e-479b-a277-0a36a39b6fb2" 
containerID="309e55d49f0173364d63f9960cb1dd3aa2f108f950f1b0e565e414476b90d3f3" exitCode=255 Apr 24 21:16:59.453099 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.452877 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" event={"ID":"f9353274-ce1e-479b-a277-0a36a39b6fb2","Type":"ContainerDied","Data":"309e55d49f0173364d63f9960cb1dd3aa2f108f950f1b0e565e414476b90d3f3"} Apr 24 21:16:59.453315 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.453287 2573 scope.go:117] "RemoveContainer" containerID="309e55d49f0173364d63f9960cb1dd3aa2f108f950f1b0e565e414476b90d3f3" Apr 24 21:16:59.459389 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.458553 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dbr6c" event={"ID":"36a65b6d-1c50-425c-911a-eb5c1059cd95","Type":"ContainerStarted","Data":"ecf16be95fd51d6b80ae599fe75b5301e6048ec7c7731e5721885032b27f770a"} Apr 24 21:16:59.462911 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.462787 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lh2r9" event={"ID":"62a47369-6a4f-4ac0-ae3b-559fb4cadc0d","Type":"ContainerStarted","Data":"b97cc008f76bd85e67cd762b01b641085e726b01ee99c5cf651050dd69915768"} Apr 24 21:16:59.464878 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.464849 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mctfb" event={"ID":"e7d157a6-5982-4a38-b8d0-15d88309963a","Type":"ContainerStarted","Data":"f41248d4c34c178cb1074bb1e092868253efe6f822c05b2ee8494eb4a927ece2"} Apr 24 21:16:59.467454 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.467410 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" 
event={"ID":"d1b2baba-a138-4778-ad36-d2c72cf4b2d6","Type":"ContainerStarted","Data":"5d92c45bf473d40684897cc37c59832328d72449749ce7f74d88edac24716ad5"} Apr 24 21:16:59.469249 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.469225 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" event={"ID":"16c39428-4288-4a12-9c01-4c9d16b18faa","Type":"ContainerStarted","Data":"c6a4126f80746bd765cd6fd8c9a1112607bb01eb4106306f2fae1a53066ce1b1"} Apr 24 21:16:59.480635 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.479567 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cph25" podStartSLOduration=10.873104703 podStartE2EDuration="41.479549671s" podCreationTimestamp="2026-04-24 21:16:18 +0000 UTC" firstStartedPulling="2026-04-24 21:16:20.821881731 +0000 UTC m=+3.227723170" lastFinishedPulling="2026-04-24 21:16:51.428326687 +0000 UTC m=+33.834168138" observedRunningTime="2026-04-24 21:16:59.478001678 +0000 UTC m=+41.883843130" watchObservedRunningTime="2026-04-24 21:16:59.479549671 +0000 UTC m=+41.885391124" Apr 24 21:16:59.493273 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.493229 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-tzpnt" podStartSLOduration=35.232398016 podStartE2EDuration="41.493214674s" podCreationTimestamp="2026-04-24 21:16:18 +0000 UTC" firstStartedPulling="2026-04-24 21:16:52.413414898 +0000 UTC m=+34.819256348" lastFinishedPulling="2026-04-24 21:16:58.674231557 +0000 UTC m=+41.080073006" observedRunningTime="2026-04-24 21:16:59.4921703 +0000 UTC m=+41.898011754" watchObservedRunningTime="2026-04-24 21:16:59.493214674 +0000 UTC m=+41.899056124" Apr 24 21:16:59.508203 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.506927 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lh2r9" podStartSLOduration=24.799382749 podStartE2EDuration="31.506909731s" podCreationTimestamp="2026-04-24 21:16:28 +0000 UTC" firstStartedPulling="2026-04-24 21:16:51.679041294 +0000 UTC m=+34.084882733" lastFinishedPulling="2026-04-24 21:16:58.386568286 +0000 UTC m=+40.792409715" observedRunningTime="2026-04-24 21:16:59.506336038 +0000 UTC m=+41.912177489" watchObservedRunningTime="2026-04-24 21:16:59.506909731 +0000 UTC m=+41.912751182" Apr 24 21:16:59.527101 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.527053 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-dbr6c" podStartSLOduration=24.796215659 podStartE2EDuration="31.527037319s" podCreationTimestamp="2026-04-24 21:16:28 +0000 UTC" firstStartedPulling="2026-04-24 21:16:51.676306002 +0000 UTC m=+34.082147432" lastFinishedPulling="2026-04-24 21:16:58.407127659 +0000 UTC m=+40.812969092" observedRunningTime="2026-04-24 21:16:59.526064884 +0000 UTC m=+41.931906337" watchObservedRunningTime="2026-04-24 21:16:59.527037319 +0000 UTC m=+41.932879169" Apr 24 21:16:59.577195 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.575644 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" podStartSLOduration=24.824878673 podStartE2EDuration="31.575624804s" podCreationTimestamp="2026-04-24 21:16:28 +0000 UTC" firstStartedPulling="2026-04-24 21:16:51.655763664 +0000 UTC m=+34.061605115" lastFinishedPulling="2026-04-24 21:16:58.406509814 +0000 UTC m=+40.812351246" observedRunningTime="2026-04-24 21:16:59.544671681 +0000 UTC m=+41.950513134" watchObservedRunningTime="2026-04-24 21:16:59.575624804 +0000 UTC m=+41.981466256" Apr 24 21:16:59.577195 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.576775 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" podStartSLOduration=24.868798107 podStartE2EDuration="31.576760065s" podCreationTimestamp="2026-04-24 21:16:28 +0000 UTC" firstStartedPulling="2026-04-24 21:16:51.698618639 +0000 UTC m=+34.104460072" lastFinishedPulling="2026-04-24 21:16:58.406580597 +0000 UTC m=+40.812422030" observedRunningTime="2026-04-24 21:16:59.57491906 +0000 UTC m=+41.980760512" watchObservedRunningTime="2026-04-24 21:16:59.576760065 +0000 UTC m=+41.982601518" Apr 24 21:16:59.605329 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:16:59.605266 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mctfb" podStartSLOduration=24.646035314 podStartE2EDuration="31.605246947s" podCreationTimestamp="2026-04-24 21:16:28 +0000 UTC" firstStartedPulling="2026-04-24 21:16:51.726818801 +0000 UTC m=+34.132660234" lastFinishedPulling="2026-04-24 21:16:58.686030438 +0000 UTC m=+41.091871867" observedRunningTime="2026-04-24 21:16:59.603491062 +0000 UTC m=+42.009332514" watchObservedRunningTime="2026-04-24 21:16:59.605246947 +0000 UTC m=+42.011088398" Apr 24 21:17:00.475030 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:00.474998 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log" Apr 24 21:17:00.475646 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:00.475422 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/0.log" Apr 24 21:17:00.475646 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:00.475462 2573 generic.go:358] "Generic (PLEG): container finished" podID="f9353274-ce1e-479b-a277-0a36a39b6fb2" 
containerID="c0e5eb74b991ea66b4fa13085f9ae74eb26184262eee9c48113a835f3a472937" exitCode=255 Apr 24 21:17:00.476114 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:00.476080 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" event={"ID":"f9353274-ce1e-479b-a277-0a36a39b6fb2","Type":"ContainerDied","Data":"c0e5eb74b991ea66b4fa13085f9ae74eb26184262eee9c48113a835f3a472937"} Apr 24 21:17:00.476218 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:00.476136 2573 scope.go:117] "RemoveContainer" containerID="309e55d49f0173364d63f9960cb1dd3aa2f108f950f1b0e565e414476b90d3f3" Apr 24 21:17:00.477245 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:00.477227 2573 scope.go:117] "RemoveContainer" containerID="c0e5eb74b991ea66b4fa13085f9ae74eb26184262eee9c48113a835f3a472937" Apr 24 21:17:00.478109 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:00.477702 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-t99mx_openshift-console-operator(f9353274-ce1e-479b-a277-0a36a39b6fb2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" podUID="f9353274-ce1e-479b-a277-0a36a39b6fb2" Apr 24 21:17:01.449203 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:01.449168 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:17:01.449203 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:01.449211 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:17:01.481087 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:01.481060 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log" Apr 24 21:17:01.481505 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:01.481485 2573 scope.go:117] "RemoveContainer" containerID="c0e5eb74b991ea66b4fa13085f9ae74eb26184262eee9c48113a835f3a472937" Apr 24 21:17:01.481707 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:01.481686 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-t99mx_openshift-console-operator(f9353274-ce1e-479b-a277-0a36a39b6fb2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" podUID="f9353274-ce1e-479b-a277-0a36a39b6fb2" Apr 24 21:17:02.484422 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:02.484396 2573 scope.go:117] "RemoveContainer" containerID="c0e5eb74b991ea66b4fa13085f9ae74eb26184262eee9c48113a835f3a472937" Apr 24 21:17:02.484871 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:02.484636 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-t99mx_openshift-console-operator(f9353274-ce1e-479b-a277-0a36a39b6fb2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" podUID="f9353274-ce1e-479b-a277-0a36a39b6fb2" Apr 24 21:17:02.643644 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:02.643610 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cstvp_31fbbb71-5394-4f60-8de2-cc5dc970ab35/dns-node-resolver/0.log" Apr 24 21:17:03.488173 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:03.488139 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-sbxgn" 
event={"ID":"38a98561-29f3-47af-9151-b0d0095b287e","Type":"ContainerStarted","Data":"60a1b221c984a26d9e4c9968ea104839a21592ea505bdb977f1642ab8be945c2"} Apr 24 21:17:03.503602 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:03.503553 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-sbxgn" podStartSLOduration=37.060520004 podStartE2EDuration="41.503539962s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:58.596793983 +0000 UTC m=+41.002635412" lastFinishedPulling="2026-04-24 21:17:03.039813926 +0000 UTC m=+45.445655370" observedRunningTime="2026-04-24 21:17:03.503078603 +0000 UTC m=+45.908920055" watchObservedRunningTime="2026-04-24 21:17:03.503539962 +0000 UTC m=+45.909381411" Apr 24 21:17:03.643201 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:03.643173 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9crbq_88a73d58-a99e-49c1-9821-a06593a8b35e/node-ca/0.log" Apr 24 21:17:06.788481 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:06.788439 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:17:06.788931 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:06.788517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:17:06.788931 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:06.788588 2573 secret.go:189] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:17:06.788931 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:06.788647 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle podName:5f412e6f-9e0c-44f5-b798-012969c57865 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:22.78863495 +0000 UTC m=+65.194476380 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle") pod "router-default-74fb58c7f4-9dgzg" (UID: "5f412e6f-9e0c-44f5-b798-012969c57865") : configmap references non-existent config key: service-ca.crt Apr 24 21:17:06.788931 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:06.788663 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs podName:5f412e6f-9e0c-44f5-b798-012969c57865 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:22.788657483 +0000 UTC m=+65.194498913 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs") pod "router-default-74fb58c7f4-9dgzg" (UID: "5f412e6f-9e0c-44f5-b798-012969c57865") : secret "router-metrics-certs-default" not found Apr 24 21:17:06.889134 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:06.889092 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:17:06.889317 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:06.889161 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6jmdp\" (UID: \"fe0af93f-e6da-459a-b345-6cf8c4bcff2f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp" Apr 24 21:17:06.889317 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:06.889247 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:17:06.889317 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:06.889265 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6dcc796c9-ngqbf: secret "image-registry-tls" not found Apr 24 21:17:06.889317 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:06.889301 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:17:06.889317 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:06.889319 2573 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls podName:707d4a93-e9f1-4763-bb75-86589c7e8b18 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:22.889303983 +0000 UTC m=+65.295145412 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls") pod "image-registry-6dcc796c9-ngqbf" (UID: "707d4a93-e9f1-4763-bb75-86589c7e8b18") : secret "image-registry-tls" not found Apr 24 21:17:06.889512 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:06.889348 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert podName:fe0af93f-e6da-459a-b345-6cf8c4bcff2f nodeName:}" failed. No retries permitted until 2026-04-24 21:17:22.889336137 +0000 UTC m=+65.295177566 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6jmdp" (UID: "fe0af93f-e6da-459a-b345-6cf8c4bcff2f") : secret "networking-console-plugin-cert" not found Apr 24 21:17:06.989553 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:06.989520 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bcvkx\" (UID: \"0dfa190e-0f34-44c5-a71e-ed3a9f7939de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" Apr 24 21:17:06.989716 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:06.989661 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:17:06.989759 
ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:06.989725 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls podName:0dfa190e-0f34-44c5-a71e-ed3a9f7939de nodeName:}" failed. No retries permitted until 2026-04-24 21:17:22.989710759 +0000 UTC m=+65.395552189 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bcvkx" (UID: "0dfa190e-0f34-44c5-a71e-ed3a9f7939de") : secret "samples-operator-tls" not found Apr 24 21:17:07.090683 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:07.090590 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:17:07.090683 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:07.090657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47dvk\" (UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:17:07.090882 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:07.090725 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert\") pod \"ingress-canary-m4spp\" (UID: \"78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77\") " pod="openshift-ingress-canary/ingress-canary-m4spp" Apr 24 21:17:07.090882 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:07.090764 
2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:17:07.090882 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:07.090792 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:17:07.090882 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:07.090807 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:17:07.090882 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:07.090844 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls podName:42e99775-4de5-4bed-b01a-a3218d41d996 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:23.090824236 +0000 UTC m=+65.496665680 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls") pod "dns-default-nvpkh" (UID: "42e99775-4de5-4bed-b01a-a3218d41d996") : secret "dns-default-metrics-tls" not found Apr 24 21:17:07.090882 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:07.090859 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls podName:32f8c25b-fb1f-4a40-b2ee-4f7db45184f1 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:23.090853415 +0000 UTC m=+65.496694844 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-47dvk" (UID: "32f8c25b-fb1f-4a40-b2ee-4f7db45184f1") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:17:07.090882 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:07.090869 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert podName:78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:23.090864059 +0000 UTC m=+65.496705489 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert") pod "ingress-canary-m4spp" (UID: "78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77") : secret "canary-serving-cert" not found Apr 24 21:17:17.206329 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:17.206295 2573 scope.go:117] "RemoveContainer" containerID="c0e5eb74b991ea66b4fa13085f9ae74eb26184262eee9c48113a835f3a472937" Apr 24 21:17:17.530645 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:17.530520 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log" Apr 24 21:17:17.530645 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:17.530599 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" event={"ID":"f9353274-ce1e-479b-a277-0a36a39b6fb2","Type":"ContainerStarted","Data":"b32d7338283da5c67fa4eda19cdcb1a23ac879fa3319d95ad2d4f3ab38db6f7c"} Apr 24 21:17:17.530951 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:17.530934 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:17:17.558725 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:17.558671 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" podStartSLOduration=42.851521072 podStartE2EDuration="49.558656172s" podCreationTimestamp="2026-04-24 21:16:28 +0000 UTC" firstStartedPulling="2026-04-24 21:16:51.702589464 +0000 UTC m=+34.108430893" lastFinishedPulling="2026-04-24 21:16:58.40972456 +0000 UTC m=+40.815565993" observedRunningTime="2026-04-24 21:17:17.556779569 +0000 UTC m=+59.962621021" watchObservedRunningTime="2026-04-24 21:17:17.558656172 +0000 UTC m=+59.964497654" Apr 24 21:17:17.964223 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:17.964144 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-t99mx" Apr 24 21:17:18.390614 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:18.390586 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rz2jk" Apr 24 21:17:22.830783 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:22.830741 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:17:22.831155 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:22.830823 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 
21:17:22.831458 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:22.831436 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f412e6f-9e0c-44f5-b798-012969c57865-service-ca-bundle\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:17:22.833177 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:22.833154 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f412e6f-9e0c-44f5-b798-012969c57865-metrics-certs\") pod \"router-default-74fb58c7f4-9dgzg\" (UID: \"5f412e6f-9e0c-44f5-b798-012969c57865\") " pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:17:22.931684 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:22.931633 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:17:22.931684 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:22.931702 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6jmdp\" (UID: \"fe0af93f-e6da-459a-b345-6cf8c4bcff2f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp" Apr 24 21:17:22.934402 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:22.934376 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/fe0af93f-e6da-459a-b345-6cf8c4bcff2f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6jmdp\" (UID: \"fe0af93f-e6da-459a-b345-6cf8c4bcff2f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp" Apr 24 21:17:22.934528 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:22.934401 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls\") pod \"image-registry-6dcc796c9-ngqbf\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") " pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:17:23.032344 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.032298 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bcvkx\" (UID: \"0dfa190e-0f34-44c5-a71e-ed3a9f7939de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" Apr 24 21:17:23.034767 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.034743 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0dfa190e-0f34-44c5-a71e-ed3a9f7939de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bcvkx\" (UID: \"0dfa190e-0f34-44c5-a71e-ed3a9f7939de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" Apr 24 21:17:23.102102 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.102020 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-frq8p\"" Apr 24 21:17:23.107159 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.107135 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jgvbr\"" Apr 24 21:17:23.110031 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.110014 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:17:23.115856 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.115829 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:17:23.116809 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.116793 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-2p2ck\"" Apr 24 21:17:23.125655 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.125628 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp" Apr 24 21:17:23.133676 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.133643 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47dvk\" (UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:17:23.133806 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.133745 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert\") pod \"ingress-canary-m4spp\" (UID: \"78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77\") " pod="openshift-ingress-canary/ingress-canary-m4spp" Apr 24 21:17:23.133806 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.133785 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:17:23.136190 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.136147 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/42e99775-4de5-4bed-b01a-a3218d41d996-metrics-tls\") pod \"dns-default-nvpkh\" (UID: \"42e99775-4de5-4bed-b01a-a3218d41d996\") " pod="openshift-dns/dns-default-nvpkh" Apr 24 21:17:23.136309 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.136293 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77-cert\") pod \"ingress-canary-m4spp\" (UID: \"78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77\") " pod="openshift-ingress-canary/ingress-canary-m4spp" Apr 24 21:17:23.136725 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.136699 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/32f8c25b-fb1f-4a40-b2ee-4f7db45184f1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47dvk\" (UID: \"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:17:23.192054 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.191803 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-mkvnq\"" Apr 24 21:17:23.200841 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.200445 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" Apr 24 21:17:23.279273 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.279202 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-74fb58c7f4-9dgzg"] Apr 24 21:17:23.281130 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:17:23.281093 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f412e6f_9e0c_44f5_b798_012969c57865.slice/crio-a915fe629878f6e3c69e217e4457189c13b5c5f30008333388326686baaf1d57 WatchSource:0}: Error finding container a915fe629878f6e3c69e217e4457189c13b5c5f30008333388326686baaf1d57: Status 404 returned error can't find the container with id a915fe629878f6e3c69e217e4457189c13b5c5f30008333388326686baaf1d57 Apr 24 21:17:23.281278 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.281224 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-cvtf7\"" Apr 24 21:17:23.286470 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.286351 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6dcc796c9-ngqbf"] Apr 24 21:17:23.288899 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.288874 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" Apr 24 21:17:23.289258 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:17:23.289234 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod707d4a93_e9f1_4763_bb75_86589c7e8b18.slice/crio-55857f707f636f6a6c71cf2ffc6fba18a8f7fe38ead14881057a9af013a42709 WatchSource:0}: Error finding container 55857f707f636f6a6c71cf2ffc6fba18a8f7fe38ead14881057a9af013a42709: Status 404 returned error can't find the container with id 55857f707f636f6a6c71cf2ffc6fba18a8f7fe38ead14881057a9af013a42709 Apr 24 21:17:23.301644 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.301616 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp"] Apr 24 21:17:23.305859 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:17:23.305824 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe0af93f_e6da_459a_b345_6cf8c4bcff2f.slice/crio-af91093d90351f8ec2914ac7a73f1e2eb180698345534bc8c2dd1204951ff6af WatchSource:0}: Error finding container af91093d90351f8ec2914ac7a73f1e2eb180698345534bc8c2dd1204951ff6af: Status 404 returned error can't find the container with id af91093d90351f8ec2914ac7a73f1e2eb180698345534bc8c2dd1204951ff6af Apr 24 21:17:23.319433 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.319404 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-znhx4\"" Apr 24 21:17:23.327540 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.327232 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m4spp" Apr 24 21:17:23.333017 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.332994 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-86llq\"" Apr 24 21:17:23.340978 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.340943 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nvpkh" Apr 24 21:17:23.349266 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.349237 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx"] Apr 24 21:17:23.487095 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.487051 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk"] Apr 24 21:17:23.493389 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:17:23.493331 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32f8c25b_fb1f_4a40_b2ee_4f7db45184f1.slice/crio-248d10d5bb96d04153352ac3513fe4c2816054d45a185265030a1ab100b5d253 WatchSource:0}: Error finding container 248d10d5bb96d04153352ac3513fe4c2816054d45a185265030a1ab100b5d253: Status 404 returned error can't find the container with id 248d10d5bb96d04153352ac3513fe4c2816054d45a185265030a1ab100b5d253 Apr 24 21:17:23.510445 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.510412 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m4spp"] Apr 24 21:17:23.515439 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:17:23.515407 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78bcd5aa_9db8_46f4_ba2e_f6f3d929aa77.slice/crio-df872670092e12bdb1f8bb9703ca3d51e257c654ee41df0ba504caf40529cd31 WatchSource:0}: Error 
finding container df872670092e12bdb1f8bb9703ca3d51e257c654ee41df0ba504caf40529cd31: Status 404 returned error can't find the container with id df872670092e12bdb1f8bb9703ca3d51e257c654ee41df0ba504caf40529cd31 Apr 24 21:17:23.538759 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.538720 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nvpkh"] Apr 24 21:17:23.547137 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.547103 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" event={"ID":"0dfa190e-0f34-44c5-a71e-ed3a9f7939de","Type":"ContainerStarted","Data":"00c4f9dfd2235087d53a349a357846b1092596c8f3abc73f40c434ffedd164ee"} Apr 24 21:17:23.548645 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.548590 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" event={"ID":"5f412e6f-9e0c-44f5-b798-012969c57865","Type":"ContainerStarted","Data":"e8b4b4e838d2831d35e958233f042e2f5dc5a058750fc80dcc559f25aade97ec"} Apr 24 21:17:23.548645 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.548633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" event={"ID":"5f412e6f-9e0c-44f5-b798-012969c57865","Type":"ContainerStarted","Data":"a915fe629878f6e3c69e217e4457189c13b5c5f30008333388326686baaf1d57"} Apr 24 21:17:23.550240 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.550164 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nvpkh" event={"ID":"42e99775-4de5-4bed-b01a-a3218d41d996","Type":"ContainerStarted","Data":"430b7d1fc3ae3249a6b818045d94da97ed581322e02e7c1f3072308ea1f96a94"} Apr 24 21:17:23.551376 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.551339 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" 
event={"ID":"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1","Type":"ContainerStarted","Data":"248d10d5bb96d04153352ac3513fe4c2816054d45a185265030a1ab100b5d253"} Apr 24 21:17:23.552668 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.552636 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m4spp" event={"ID":"78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77","Type":"ContainerStarted","Data":"df872670092e12bdb1f8bb9703ca3d51e257c654ee41df0ba504caf40529cd31"} Apr 24 21:17:23.553743 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.553704 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp" event={"ID":"fe0af93f-e6da-459a-b345-6cf8c4bcff2f","Type":"ContainerStarted","Data":"af91093d90351f8ec2914ac7a73f1e2eb180698345534bc8c2dd1204951ff6af"} Apr 24 21:17:23.555109 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.555081 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" event={"ID":"707d4a93-e9f1-4763-bb75-86589c7e8b18","Type":"ContainerStarted","Data":"03f0fcf66c42edec2a48a3c54b188b697ed64aa8989aa5217df98dd697db28b5"} Apr 24 21:17:23.555109 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.555107 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" event={"ID":"707d4a93-e9f1-4763-bb75-86589c7e8b18","Type":"ContainerStarted","Data":"55857f707f636f6a6c71cf2ffc6fba18a8f7fe38ead14881057a9af013a42709"} Apr 24 21:17:23.555236 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.555221 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" Apr 24 21:17:23.568095 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.568037 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" 
podStartSLOduration=55.568018871 podStartE2EDuration="55.568018871s" podCreationTimestamp="2026-04-24 21:16:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:23.567405044 +0000 UTC m=+65.973246494" watchObservedRunningTime="2026-04-24 21:17:23.568018871 +0000 UTC m=+65.973860324" Apr 24 21:17:23.588277 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.588219 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" podStartSLOduration=55.5882019 podStartE2EDuration="55.5882019s" podCreationTimestamp="2026-04-24 21:16:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:23.587545357 +0000 UTC m=+65.993386826" watchObservedRunningTime="2026-04-24 21:17:23.5882019 +0000 UTC m=+65.994043352" Apr 24 21:17:23.841873 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.841833 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs\") pod \"network-metrics-daemon-h5m79\" (UID: \"a6ad0fc1-fbd1-4133-8616-3b950995f8e4\") " pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:17:23.844604 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.844544 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:17:23.855422 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:23.855392 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6ad0fc1-fbd1-4133-8616-3b950995f8e4-metrics-certs\") pod \"network-metrics-daemon-h5m79\" (UID: \"a6ad0fc1-fbd1-4133-8616-3b950995f8e4\") " 
pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:17:24.036186 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:24.036151 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jphgp\"" Apr 24 21:17:24.044183 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:24.043718 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5m79" Apr 24 21:17:24.111038 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:24.110939 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:17:24.114428 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:24.114396 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:17:24.226371 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:24.224662 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h5m79"] Apr 24 21:17:24.560151 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:24.560109 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:17:24.561751 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:24.561574 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-74fb58c7f4-9dgzg" Apr 24 21:17:24.644930 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:17:24.644885 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ad0fc1_fbd1_4133_8616_3b950995f8e4.slice/crio-410bb068db8afa0ab73c1a9a234412d180d328ac33fb5599698f7ef10fb1b590 WatchSource:0}: Error finding container 410bb068db8afa0ab73c1a9a234412d180d328ac33fb5599698f7ef10fb1b590: Status 404 returned error can't find the 
container with id 410bb068db8afa0ab73c1a9a234412d180d328ac33fb5599698f7ef10fb1b590 Apr 24 21:17:25.563436 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:25.563382 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h5m79" event={"ID":"a6ad0fc1-fbd1-4133-8616-3b950995f8e4","Type":"ContainerStarted","Data":"410bb068db8afa0ab73c1a9a234412d180d328ac33fb5599698f7ef10fb1b590"} Apr 24 21:17:26.733691 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.733414 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6dcc796c9-ngqbf"] Apr 24 21:17:26.826923 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.826887 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7bz7w"] Apr 24 21:17:26.878636 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.878604 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-689c867f4b-rpl44"] Apr 24 21:17:26.878802 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.878783 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:26.882797 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.882775 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-2bmll\"" Apr 24 21:17:26.882966 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.882853 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:17:26.883039 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.882999 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:17:26.901571 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.901535 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-689c867f4b-rpl44"] Apr 24 21:17:26.901571 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.901566 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7bz7w"] Apr 24 21:17:26.901762 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.901710 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:26.967707 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.967674 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7c6a0742-062f-4e70-97ca-7f2a1248b077-crio-socket\") pod \"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:26.967707 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.967708 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7c6a0742-062f-4e70-97ca-7f2a1248b077-data-volume\") pod \"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:26.967942 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.967739 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3fed8766-c4ad-4312-ba21-25369a24b276-registry-tls\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:26.967942 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.967791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7c6a0742-062f-4e70-97ca-7f2a1248b077-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:26.967942 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.967816 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3fed8766-c4ad-4312-ba21-25369a24b276-image-registry-private-configuration\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:26.967942 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.967836 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pznzc\" (UniqueName: \"kubernetes.io/projected/3fed8766-c4ad-4312-ba21-25369a24b276-kube-api-access-pznzc\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:26.967942 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.967853 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3fed8766-c4ad-4312-ba21-25369a24b276-installation-pull-secrets\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:26.967942 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.967896 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxllq\" (UniqueName: \"kubernetes.io/projected/7c6a0742-062f-4e70-97ca-7f2a1248b077-kube-api-access-bxllq\") pod \"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:26.967942 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.967939 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/3fed8766-c4ad-4312-ba21-25369a24b276-registry-certificates\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:26.968210 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.967978 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7c6a0742-062f-4e70-97ca-7f2a1248b077-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:26.968210 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.967997 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3fed8766-c4ad-4312-ba21-25369a24b276-ca-trust-extracted\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:26.968210 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.968012 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fed8766-c4ad-4312-ba21-25369a24b276-trusted-ca\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:26.968210 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:26.968036 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3fed8766-c4ad-4312-ba21-25369a24b276-bound-sa-token\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " 
pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.068676 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.068600 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7c6a0742-062f-4e70-97ca-7f2a1248b077-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:27.068676 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.068643 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3fed8766-c4ad-4312-ba21-25369a24b276-ca-trust-extracted\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.068676 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.068662 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fed8766-c4ad-4312-ba21-25369a24b276-trusted-ca\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.068927 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.068686 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3fed8766-c4ad-4312-ba21-25369a24b276-bound-sa-token\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.068927 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.068708 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/7c6a0742-062f-4e70-97ca-7f2a1248b077-crio-socket\") pod \"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:27.068927 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.068724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7c6a0742-062f-4e70-97ca-7f2a1248b077-data-volume\") pod \"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:27.069185 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.069160 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3fed8766-c4ad-4312-ba21-25369a24b276-registry-tls\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.069270 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.069225 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7c6a0742-062f-4e70-97ca-7f2a1248b077-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:27.069329 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.069272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3fed8766-c4ad-4312-ba21-25369a24b276-image-registry-private-configuration\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.069329 
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.069295 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7c6a0742-062f-4e70-97ca-7f2a1248b077-crio-socket\") pod \"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:27.069444 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.069380 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7c6a0742-062f-4e70-97ca-7f2a1248b077-data-volume\") pod \"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:27.070000 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.069484 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pznzc\" (UniqueName: \"kubernetes.io/projected/3fed8766-c4ad-4312-ba21-25369a24b276-kube-api-access-pznzc\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.070000 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.069550 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3fed8766-c4ad-4312-ba21-25369a24b276-installation-pull-secrets\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.070000 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.069594 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxllq\" (UniqueName: \"kubernetes.io/projected/7c6a0742-062f-4e70-97ca-7f2a1248b077-kube-api-access-bxllq\") pod 
\"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:27.070000 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.069646 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3fed8766-c4ad-4312-ba21-25369a24b276-registry-certificates\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.070000 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.069851 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3fed8766-c4ad-4312-ba21-25369a24b276-ca-trust-extracted\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.070000 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.069930 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fed8766-c4ad-4312-ba21-25369a24b276-trusted-ca\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.072209 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.070925 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7c6a0742-062f-4e70-97ca-7f2a1248b077-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:27.072209 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.071253 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3fed8766-c4ad-4312-ba21-25369a24b276-registry-certificates\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.072402 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.072217 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3fed8766-c4ad-4312-ba21-25369a24b276-image-registry-private-configuration\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.075107 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.072657 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3fed8766-c4ad-4312-ba21-25369a24b276-registry-tls\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.075107 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.072747 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7c6a0742-062f-4e70-97ca-7f2a1248b077-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:27.075107 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.072811 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3fed8766-c4ad-4312-ba21-25369a24b276-installation-pull-secrets\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " 
pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.084003 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.083977 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3fed8766-c4ad-4312-ba21-25369a24b276-bound-sa-token\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.094859 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.094829 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxllq\" (UniqueName: \"kubernetes.io/projected/7c6a0742-062f-4e70-97ca-7f2a1248b077-kube-api-access-bxllq\") pod \"insights-runtime-extractor-7bz7w\" (UID: \"7c6a0742-062f-4e70-97ca-7f2a1248b077\") " pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:27.104885 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.104850 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pznzc\" (UniqueName: \"kubernetes.io/projected/3fed8766-c4ad-4312-ba21-25369a24b276-kube-api-access-pznzc\") pod \"image-registry-689c867f4b-rpl44\" (UID: \"3fed8766-c4ad-4312-ba21-25369a24b276\") " pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:27.188371 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.188320 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7bz7w" Apr 24 21:17:27.211435 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:27.211404 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-689c867f4b-rpl44" Apr 24 21:17:28.403010 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:28.402325 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-689c867f4b-rpl44"] Apr 24 21:17:28.409110 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:17:28.409075 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fed8766_c4ad_4312_ba21_25369a24b276.slice/crio-aa40b345ce0d96f4f5b35ad1d0521f972dc25ac4b1d79b5aacc79cd7d579078d WatchSource:0}: Error finding container aa40b345ce0d96f4f5b35ad1d0521f972dc25ac4b1d79b5aacc79cd7d579078d: Status 404 returned error can't find the container with id aa40b345ce0d96f4f5b35ad1d0521f972dc25ac4b1d79b5aacc79cd7d579078d Apr 24 21:17:28.417001 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:28.416963 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7bz7w"] Apr 24 21:17:28.421786 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:17:28.420932 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c6a0742_062f_4e70_97ca_7f2a1248b077.slice/crio-ba722e81f2f1d9afef90bb09e295143756ee74ace07e516554e2bc1cc1dc6734 WatchSource:0}: Error finding container ba722e81f2f1d9afef90bb09e295143756ee74ace07e516554e2bc1cc1dc6734: Status 404 returned error can't find the container with id ba722e81f2f1d9afef90bb09e295143756ee74ace07e516554e2bc1cc1dc6734 Apr 24 21:17:28.579068 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:28.575125 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" event={"ID":"0dfa190e-0f34-44c5-a71e-ed3a9f7939de","Type":"ContainerStarted","Data":"ffde3946982de19d777901b68bf9bea20005d3b5773e37fc615dce47e1bb6129"} Apr 24 21:17:28.581192 
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:28.581145 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nvpkh" event={"ID":"42e99775-4de5-4bed-b01a-a3218d41d996","Type":"ContainerStarted","Data":"979052e98e7607301ed55d3800d5d6ece9b021d3038a7ebc52fae78e4a8de0b5"} Apr 24 21:17:28.582451 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:28.582416 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" event={"ID":"32f8c25b-fb1f-4a40-b2ee-4f7db45184f1","Type":"ContainerStarted","Data":"7774b20c1263c68f9dea8749df50af2429ef5a2fea325a75c550f440910e81f6"} Apr 24 21:17:28.583653 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:28.583607 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-689c867f4b-rpl44" event={"ID":"3fed8766-c4ad-4312-ba21-25369a24b276","Type":"ContainerStarted","Data":"aa40b345ce0d96f4f5b35ad1d0521f972dc25ac4b1d79b5aacc79cd7d579078d"} Apr 24 21:17:28.584932 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:28.584909 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m4spp" event={"ID":"78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77","Type":"ContainerStarted","Data":"77b7a84d91a1f555263582f535a97c55b045478310c5389fe3eaafa24dc12deb"} Apr 24 21:17:28.586853 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:28.586819 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp" event={"ID":"fe0af93f-e6da-459a-b345-6cf8c4bcff2f","Type":"ContainerStarted","Data":"0b144c58f5a1a00d9078f81776d9e8fd14248477b3d1d05ee074795eec121c15"} Apr 24 21:17:28.587886 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:28.587864 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7bz7w" 
event={"ID":"7c6a0742-062f-4e70-97ca-7f2a1248b077","Type":"ContainerStarted","Data":"ba722e81f2f1d9afef90bb09e295143756ee74ace07e516554e2bc1cc1dc6734"} Apr 24 21:17:28.601038 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:28.600963 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47dvk" podStartSLOduration=55.868594406 podStartE2EDuration="1m0.60094831s" podCreationTimestamp="2026-04-24 21:16:28 +0000 UTC" firstStartedPulling="2026-04-24 21:17:23.499063265 +0000 UTC m=+65.904904713" lastFinishedPulling="2026-04-24 21:17:28.231417185 +0000 UTC m=+70.637258617" observedRunningTime="2026-04-24 21:17:28.600029684 +0000 UTC m=+71.005871148" watchObservedRunningTime="2026-04-24 21:17:28.60094831 +0000 UTC m=+71.006789821" Apr 24 21:17:28.624023 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:28.623908 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-m4spp" podStartSLOduration=33.908268582 podStartE2EDuration="38.623887991s" podCreationTimestamp="2026-04-24 21:16:50 +0000 UTC" firstStartedPulling="2026-04-24 21:17:23.518021489 +0000 UTC m=+65.923862926" lastFinishedPulling="2026-04-24 21:17:28.233640896 +0000 UTC m=+70.639482335" observedRunningTime="2026-04-24 21:17:28.623454468 +0000 UTC m=+71.029295930" watchObservedRunningTime="2026-04-24 21:17:28.623887991 +0000 UTC m=+71.029729445" Apr 24 21:17:28.653784 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:28.653715 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6jmdp" podStartSLOduration=56.034516236 podStartE2EDuration="1m0.653696227s" podCreationTimestamp="2026-04-24 21:16:28 +0000 UTC" firstStartedPulling="2026-04-24 21:17:23.308165056 +0000 UTC m=+65.714006486" lastFinishedPulling="2026-04-24 21:17:27.927345044 +0000 UTC m=+70.333186477" observedRunningTime="2026-04-24 
21:17:28.652765333 +0000 UTC m=+71.058606782" watchObservedRunningTime="2026-04-24 21:17:28.653696227 +0000 UTC m=+71.059537679" Apr 24 21:17:29.592541 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:29.592497 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7bz7w" event={"ID":"7c6a0742-062f-4e70-97ca-7f2a1248b077","Type":"ContainerStarted","Data":"25b2c5b7caf0382678385813e68812ff692bdac36a48739ef9c1fa81d8a9350c"} Apr 24 21:17:29.594164 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:29.594139 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" event={"ID":"0dfa190e-0f34-44c5-a71e-ed3a9f7939de","Type":"ContainerStarted","Data":"69320717caa215c0fe3888bca8f527874934771343eb85930f2bf2ed30453e2c"} Apr 24 21:17:29.595795 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:29.595749 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h5m79" event={"ID":"a6ad0fc1-fbd1-4133-8616-3b950995f8e4","Type":"ContainerStarted","Data":"bc57eb2877710b3104b00dc4903f13c7d1f11dbfc04de46e5155773cc7ac4e35"} Apr 24 21:17:29.595907 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:29.595800 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h5m79" event={"ID":"a6ad0fc1-fbd1-4133-8616-3b950995f8e4","Type":"ContainerStarted","Data":"c0b4505639577e9a23b1f6a71e84b4c9d81eb960a1c758e5b52add1e145dc362"} Apr 24 21:17:29.597231 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:29.597208 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nvpkh" event={"ID":"42e99775-4de5-4bed-b01a-a3218d41d996","Type":"ContainerStarted","Data":"8af65f8f176bddccc38180b5a10b00a2f22c7e6ab894260778c6858750a2bbc5"} Apr 24 21:17:29.597375 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:29.597339 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-dns/dns-default-nvpkh"
Apr 24 21:17:29.598505 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:29.598479 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-689c867f4b-rpl44" event={"ID":"3fed8766-c4ad-4312-ba21-25369a24b276","Type":"ContainerStarted","Data":"555f6e044ff72bbf58fef1e8028536f00647fce8bb845a09f85b3f46cc2226f1"}
Apr 24 21:17:29.614635 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:29.614572 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bcvkx" podStartSLOduration=56.797812793 podStartE2EDuration="1m1.614551721s" podCreationTimestamp="2026-04-24 21:16:28 +0000 UTC" firstStartedPulling="2026-04-24 21:17:23.416895579 +0000 UTC m=+65.822737024" lastFinishedPulling="2026-04-24 21:17:28.233634518 +0000 UTC m=+70.639475952" observedRunningTime="2026-04-24 21:17:29.614259862 +0000 UTC m=+72.020101312" watchObservedRunningTime="2026-04-24 21:17:29.614551721 +0000 UTC m=+72.020393173"
Apr 24 21:17:29.635099 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:29.635033 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nvpkh" podStartSLOduration=33.94626785 podStartE2EDuration="38.63501175s" podCreationTimestamp="2026-04-24 21:16:51 +0000 UTC" firstStartedPulling="2026-04-24 21:17:23.544781669 +0000 UTC m=+65.950623108" lastFinishedPulling="2026-04-24 21:17:28.23352557 +0000 UTC m=+70.639367008" observedRunningTime="2026-04-24 21:17:29.633480156 +0000 UTC m=+72.039321609" watchObservedRunningTime="2026-04-24 21:17:29.63501175 +0000 UTC m=+72.040853202"
Apr 24 21:17:29.650403 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:29.650334 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-h5m79" podStartSLOduration=67.673585491 podStartE2EDuration="1m11.650320199s" podCreationTimestamp="2026-04-24 21:16:18 +0000 UTC" firstStartedPulling="2026-04-24 21:17:24.648016695 +0000 UTC m=+67.053858138" lastFinishedPulling="2026-04-24 21:17:28.624751413 +0000 UTC m=+71.030592846" observedRunningTime="2026-04-24 21:17:29.649120118 +0000 UTC m=+72.054961603" watchObservedRunningTime="2026-04-24 21:17:29.650320199 +0000 UTC m=+72.056161650"
Apr 24 21:17:29.674234 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:29.674180 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-689c867f4b-rpl44" podStartSLOduration=3.674162757 podStartE2EDuration="3.674162757s" podCreationTimestamp="2026-04-24 21:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:29.673631332 +0000 UTC m=+72.079472782" watchObservedRunningTime="2026-04-24 21:17:29.674162757 +0000 UTC m=+72.080004208"
Apr 24 21:17:30.607856 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:30.607802 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7bz7w" event={"ID":"7c6a0742-062f-4e70-97ca-7f2a1248b077","Type":"ContainerStarted","Data":"2b114578a02a10ab50b81ba5d7d73ce69f7f5d75a67401c32d7e4acfddbf4496"}
Apr 24 21:17:30.608535 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:30.608502 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-689c867f4b-rpl44"
Apr 24 21:17:31.483804 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:31.483754 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-tzpnt"
Apr 24 21:17:32.614810 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:32.614772 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7bz7w" event={"ID":"7c6a0742-062f-4e70-97ca-7f2a1248b077","Type":"ContainerStarted","Data":"3a7a6839a1a7ccb9566f54ed6933cb9817ab49afe04bf177cc88cd516cb8105c"}
Apr 24 21:17:32.637873 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:32.637820 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7bz7w" podStartSLOduration=3.708788894 podStartE2EDuration="6.637804262s" podCreationTimestamp="2026-04-24 21:17:26 +0000 UTC" firstStartedPulling="2026-04-24 21:17:28.761193959 +0000 UTC m=+71.167035407" lastFinishedPulling="2026-04-24 21:17:31.690209339 +0000 UTC m=+74.096050775" observedRunningTime="2026-04-24 21:17:32.635612597 +0000 UTC m=+75.041454059" watchObservedRunningTime="2026-04-24 21:17:32.637804262 +0000 UTC m=+75.043645712"
Apr 24 21:17:39.610096 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:39.609965 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nvpkh"
Apr 24 21:17:40.466842 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.466804 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lmlrf"]
Apr 24 21:17:40.470256 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.470231 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.485183 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.485161 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 21:17:40.485321 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.485160 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 21:17:40.485321 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.485160 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 21:17:40.487044 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.487028 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 21:17:40.508509 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.508475 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gdtmb\""
Apr 24 21:17:40.597696 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.597658 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-accelerators-collector-config\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.597696 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.597698 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.597917 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.597739 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-sys\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.597917 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.597763 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-metrics-client-ca\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.597917 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.597785 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-tls\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.597917 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.597809 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-root\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.597917 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.597873 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-textfile\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.597917 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.597897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-wtmp\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.598120 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.597921 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-728g9\" (UniqueName: \"kubernetes.io/projected/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-kube-api-access-728g9\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.699082 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699038 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-accelerators-collector-config\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.699564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699090 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.699564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699135 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-sys\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.699564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699160 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-metrics-client-ca\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.699564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699193 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-tls\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.699564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699225 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-root\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.699564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699273 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-sys\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.699564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699282 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-textfile\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.699564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699334 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-wtmp\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.699564 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:40.699340 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 21:17:40.699564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699389 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-root\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.699564 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:40.699433 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-tls podName:ab6c5ffc-45c6-4018-bff6-cb0476ccbcda nodeName:}" failed. No retries permitted until 2026-04-24 21:17:41.199412092 +0000 UTC m=+83.605253535 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-tls") pod "node-exporter-lmlrf" (UID: "ab6c5ffc-45c6-4018-bff6-cb0476ccbcda") : secret "node-exporter-tls" not found
Apr 24 21:17:40.699564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699393 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-728g9\" (UniqueName: \"kubernetes.io/projected/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-kube-api-access-728g9\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.699564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699516 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-wtmp\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.700073 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699579 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-textfile\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.700073 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699749 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-accelerators-collector-config\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.700073 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.699938 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-metrics-client-ca\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.701901 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.701880 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:40.726761 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:40.726726 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-728g9\" (UniqueName: \"kubernetes.io/projected/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-kube-api-access-728g9\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:41.203673 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:41.203586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-tls\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:41.205991 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:41.205962 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ab6c5ffc-45c6-4018-bff6-cb0476ccbcda-node-exporter-tls\") pod \"node-exporter-lmlrf\" (UID: \"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda\") " pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:41.379125 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:41.379089 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lmlrf"
Apr 24 21:17:41.388203 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:17:41.388172 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab6c5ffc_45c6_4018_bff6_cb0476ccbcda.slice/crio-5d3556dc24a6eef01654f6afec79b43b30861ba2fc31aec5e10ef9713d3000f6 WatchSource:0}: Error finding container 5d3556dc24a6eef01654f6afec79b43b30861ba2fc31aec5e10ef9713d3000f6: Status 404 returned error can't find the container with id 5d3556dc24a6eef01654f6afec79b43b30861ba2fc31aec5e10ef9713d3000f6
Apr 24 21:17:41.641119 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:41.641080 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lmlrf" event={"ID":"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda","Type":"ContainerStarted","Data":"5d3556dc24a6eef01654f6afec79b43b30861ba2fc31aec5e10ef9713d3000f6"}
Apr 24 21:17:42.645419 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:42.645377 2573 generic.go:358] "Generic (PLEG): container finished" podID="ab6c5ffc-45c6-4018-bff6-cb0476ccbcda" containerID="3238543f913950263262c0e67aaf1122aa9d971a8b038e610f6bd23bc7569a93" exitCode=0
Apr 24 21:17:42.645796 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:42.645446 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lmlrf" event={"ID":"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda","Type":"ContainerDied","Data":"3238543f913950263262c0e67aaf1122aa9d971a8b038e610f6bd23bc7569a93"}
Apr 24 21:17:43.650782 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:43.650743 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lmlrf" event={"ID":"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda","Type":"ContainerStarted","Data":"5c83bb6e423a18aa1a38fd4f4c7af0366f3f174971680a2edd66e816d8a4d315"}
Apr 24 21:17:43.650782 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:43.650787 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lmlrf" event={"ID":"ab6c5ffc-45c6-4018-bff6-cb0476ccbcda","Type":"ContainerStarted","Data":"0417057b9454756a7a88a0a72252e9ff428bfd5a2bb1cab17569b46f2fdf6191"}
Apr 24 21:17:43.808162 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:43.808095 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lmlrf" podStartSLOduration=3.105750942 podStartE2EDuration="3.808078765s" podCreationTimestamp="2026-04-24 21:17:40 +0000 UTC" firstStartedPulling="2026-04-24 21:17:41.389925218 +0000 UTC m=+83.795766647" lastFinishedPulling="2026-04-24 21:17:42.09225304 +0000 UTC m=+84.498094470" observedRunningTime="2026-04-24 21:17:43.799979415 +0000 UTC m=+86.205820892" watchObservedRunningTime="2026-04-24 21:17:43.808078765 +0000 UTC m=+86.213920216"
Apr 24 21:17:46.739327 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:46.739295 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf"
Apr 24 21:17:51.615101 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:51.615067 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-689c867f4b-rpl44"
Apr 24 21:17:51.758061 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:51.758013 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" podUID="707d4a93-e9f1-4763-bb75-86589c7e8b18" containerName="registry" containerID="cri-o://03f0fcf66c42edec2a48a3c54b188b697ed64aa8989aa5217df98dd697db28b5" gracePeriod=30
Apr 24 21:17:52.005545 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.005515 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf"
Apr 24 21:17:52.091673 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.091627 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-certificates\") pod \"707d4a93-e9f1-4763-bb75-86589c7e8b18\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") "
Apr 24 21:17:52.091876 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.091703 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/707d4a93-e9f1-4763-bb75-86589c7e8b18-installation-pull-secrets\") pod \"707d4a93-e9f1-4763-bb75-86589c7e8b18\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") "
Apr 24 21:17:52.091876 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.091734 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/707d4a93-e9f1-4763-bb75-86589c7e8b18-ca-trust-extracted\") pod \"707d4a93-e9f1-4763-bb75-86589c7e8b18\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") "
Apr 24 21:17:52.091876 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.091757 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls\") pod \"707d4a93-e9f1-4763-bb75-86589c7e8b18\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") "
Apr 24 21:17:52.091876 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.091788 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2bn5\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-kube-api-access-m2bn5\") pod \"707d4a93-e9f1-4763-bb75-86589c7e8b18\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") "
Apr 24 21:17:52.091876 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.091829 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-bound-sa-token\") pod \"707d4a93-e9f1-4763-bb75-86589c7e8b18\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") "
Apr 24 21:17:52.091876 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.091863 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/707d4a93-e9f1-4763-bb75-86589c7e8b18-trusted-ca\") pod \"707d4a93-e9f1-4763-bb75-86589c7e8b18\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") "
Apr 24 21:17:52.092170 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.092097 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/707d4a93-e9f1-4763-bb75-86589c7e8b18-image-registry-private-configuration\") pod \"707d4a93-e9f1-4763-bb75-86589c7e8b18\" (UID: \"707d4a93-e9f1-4763-bb75-86589c7e8b18\") "
Apr 24 21:17:52.092170 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.092116 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "707d4a93-e9f1-4763-bb75-86589c7e8b18" (UID: "707d4a93-e9f1-4763-bb75-86589c7e8b18"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:17:52.092408 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.092333 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/707d4a93-e9f1-4763-bb75-86589c7e8b18-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "707d4a93-e9f1-4763-bb75-86589c7e8b18" (UID: "707d4a93-e9f1-4763-bb75-86589c7e8b18"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:17:52.092408 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.092343 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-certificates\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:17:52.094703 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.094667 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "707d4a93-e9f1-4763-bb75-86589c7e8b18" (UID: "707d4a93-e9f1-4763-bb75-86589c7e8b18"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:17:52.094811 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.094783 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "707d4a93-e9f1-4763-bb75-86589c7e8b18" (UID: "707d4a93-e9f1-4763-bb75-86589c7e8b18"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:17:52.094811 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.094794 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-kube-api-access-m2bn5" (OuterVolumeSpecName: "kube-api-access-m2bn5") pod "707d4a93-e9f1-4763-bb75-86589c7e8b18" (UID: "707d4a93-e9f1-4763-bb75-86589c7e8b18"). InnerVolumeSpecName "kube-api-access-m2bn5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:17:52.094893 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.094829 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707d4a93-e9f1-4763-bb75-86589c7e8b18-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "707d4a93-e9f1-4763-bb75-86589c7e8b18" (UID: "707d4a93-e9f1-4763-bb75-86589c7e8b18"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:17:52.094929 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.094899 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707d4a93-e9f1-4763-bb75-86589c7e8b18-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "707d4a93-e9f1-4763-bb75-86589c7e8b18" (UID: "707d4a93-e9f1-4763-bb75-86589c7e8b18"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:17:52.100150 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.100125 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/707d4a93-e9f1-4763-bb75-86589c7e8b18-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "707d4a93-e9f1-4763-bb75-86589c7e8b18" (UID: "707d4a93-e9f1-4763-bb75-86589c7e8b18"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:17:52.193318 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.193235 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/707d4a93-e9f1-4763-bb75-86589c7e8b18-installation-pull-secrets\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:17:52.193318 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.193265 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/707d4a93-e9f1-4763-bb75-86589c7e8b18-ca-trust-extracted\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:17:52.193318 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.193276 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-registry-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:17:52.193318 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.193288 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m2bn5\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-kube-api-access-m2bn5\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:17:52.193318 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.193296 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/707d4a93-e9f1-4763-bb75-86589c7e8b18-bound-sa-token\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:17:52.193318 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.193305 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/707d4a93-e9f1-4763-bb75-86589c7e8b18-trusted-ca\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:17:52.193318 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.193313 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/707d4a93-e9f1-4763-bb75-86589c7e8b18-image-registry-private-configuration\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:17:52.675713 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.675679 2573 generic.go:358] "Generic (PLEG): container finished" podID="707d4a93-e9f1-4763-bb75-86589c7e8b18" containerID="03f0fcf66c42edec2a48a3c54b188b697ed64aa8989aa5217df98dd697db28b5" exitCode=0
Apr 24 21:17:52.676192 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.675742 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf"
Apr 24 21:17:52.676192 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.675749 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" event={"ID":"707d4a93-e9f1-4763-bb75-86589c7e8b18","Type":"ContainerDied","Data":"03f0fcf66c42edec2a48a3c54b188b697ed64aa8989aa5217df98dd697db28b5"}
Apr 24 21:17:52.676192 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.675789 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6dcc796c9-ngqbf" event={"ID":"707d4a93-e9f1-4763-bb75-86589c7e8b18","Type":"ContainerDied","Data":"55857f707f636f6a6c71cf2ffc6fba18a8f7fe38ead14881057a9af013a42709"}
Apr 24 21:17:52.676192 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.675805 2573 scope.go:117] "RemoveContainer" containerID="03f0fcf66c42edec2a48a3c54b188b697ed64aa8989aa5217df98dd697db28b5"
Apr 24 21:17:52.683391 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.683372 2573 scope.go:117] "RemoveContainer" containerID="03f0fcf66c42edec2a48a3c54b188b697ed64aa8989aa5217df98dd697db28b5"
Apr 24 21:17:52.683662 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:17:52.683644 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03f0fcf66c42edec2a48a3c54b188b697ed64aa8989aa5217df98dd697db28b5\": container with ID starting with 03f0fcf66c42edec2a48a3c54b188b697ed64aa8989aa5217df98dd697db28b5 not found: ID does not exist" containerID="03f0fcf66c42edec2a48a3c54b188b697ed64aa8989aa5217df98dd697db28b5"
Apr 24 21:17:52.683705 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.683670 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03f0fcf66c42edec2a48a3c54b188b697ed64aa8989aa5217df98dd697db28b5"} err="failed to get container status \"03f0fcf66c42edec2a48a3c54b188b697ed64aa8989aa5217df98dd697db28b5\": rpc error: code = NotFound desc = could not find container \"03f0fcf66c42edec2a48a3c54b188b697ed64aa8989aa5217df98dd697db28b5\": container with ID starting with 03f0fcf66c42edec2a48a3c54b188b697ed64aa8989aa5217df98dd697db28b5 not found: ID does not exist"
Apr 24 21:17:52.706630 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.703141 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6dcc796c9-ngqbf"]
Apr 24 21:17:52.713796 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:52.713763 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6dcc796c9-ngqbf"]
Apr 24 21:17:54.210786 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:17:54.210746 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707d4a93-e9f1-4763-bb75-86589c7e8b18" path="/var/lib/kubelet/pods/707d4a93-e9f1-4763-bb75-86589c7e8b18/volumes"
Apr 24 21:18:04.711607 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:18:04.711568 2573 generic.go:358] "Generic (PLEG): container finished" podID="16c39428-4288-4a12-9c01-4c9d16b18faa" containerID="c6a4126f80746bd765cd6fd8c9a1112607bb01eb4106306f2fae1a53066ce1b1" exitCode=0
Apr 24 21:18:04.712046 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:18:04.711638 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" event={"ID":"16c39428-4288-4a12-9c01-4c9d16b18faa","Type":"ContainerDied","Data":"c6a4126f80746bd765cd6fd8c9a1112607bb01eb4106306f2fae1a53066ce1b1"}
Apr 24 21:18:04.712046 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:18:04.711989 2573 scope.go:117] "RemoveContainer" containerID="c6a4126f80746bd765cd6fd8c9a1112607bb01eb4106306f2fae1a53066ce1b1"
Apr 24 21:18:04.713077 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:18:04.713054 2573 generic.go:358] "Generic (PLEG): container finished" podID="36a65b6d-1c50-425c-911a-eb5c1059cd95" containerID="ecf16be95fd51d6b80ae599fe75b5301e6048ec7c7731e5721885032b27f770a" exitCode=0
Apr 24 21:18:04.713180 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:18:04.713085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dbr6c" event={"ID":"36a65b6d-1c50-425c-911a-eb5c1059cd95","Type":"ContainerDied","Data":"ecf16be95fd51d6b80ae599fe75b5301e6048ec7c7731e5721885032b27f770a"}
Apr 24 21:18:04.713426 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:18:04.713404 2573 scope.go:117] "RemoveContainer" containerID="ecf16be95fd51d6b80ae599fe75b5301e6048ec7c7731e5721885032b27f770a"
Apr 24 21:18:05.716830 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:18:05.716793 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w27kv" event={"ID":"16c39428-4288-4a12-9c01-4c9d16b18faa","Type":"ContainerStarted","Data":"56f4c17b9038c6e32840eea108a668000331563a1e23c5db5ec727fa08b4424a"}
Apr 24 21:18:05.718454 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:18:05.718427 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dbr6c" event={"ID":"36a65b6d-1c50-425c-911a-eb5c1059cd95","Type":"ContainerStarted","Data":"1a8cb20abfd01520ceb711fba02bfc81f9a00eb5a6cefc8e83a12f7b1bb43a58"}
Apr 24 21:18:29.786169 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:18:29.786133 2573 generic.go:358] "Generic (PLEG): container finished" podID="d1b2baba-a138-4778-ad36-d2c72cf4b2d6" containerID="5d92c45bf473d40684897cc37c59832328d72449749ce7f74d88edac24716ad5" exitCode=0
Apr 24 21:18:29.786598 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:18:29.786211 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" event={"ID":"d1b2baba-a138-4778-ad36-d2c72cf4b2d6","Type":"ContainerDied","Data":"5d92c45bf473d40684897cc37c59832328d72449749ce7f74d88edac24716ad5"}
Apr 24 21:18:29.786640 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:18:29.786597 2573 scope.go:117] "RemoveContainer" containerID="5d92c45bf473d40684897cc37c59832328d72449749ce7f74d88edac24716ad5"
Apr 24 21:18:30.790719 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:18:30.790683 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2fhhh" event={"ID":"d1b2baba-a138-4778-ad36-d2c72cf4b2d6","Type":"ContainerStarted","Data":"91c347c3853b6bea3c4d3632f0482ad01877d46fbae947d639c75f6607a210de"}
Apr 24 21:21:18.077760 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:21:18.077732 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 21:21:18.078258 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:21:18.077732 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 21:21:18.086455 ip-10-0-128-21
kubenswrapper[2573]: I0424 21:21:18.086426 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:25:35.627734 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.627698 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8"] Apr 24 21:25:35.628249 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.627999 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="707d4a93-e9f1-4763-bb75-86589c7e8b18" containerName="registry" Apr 24 21:25:35.628249 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.628010 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="707d4a93-e9f1-4763-bb75-86589c7e8b18" containerName="registry" Apr 24 21:25:35.628249 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.628056 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="707d4a93-e9f1-4763-bb75-86589c7e8b18" containerName="registry" Apr 24 21:25:35.629656 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.629641 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:35.639484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.639465 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 21:25:35.640038 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.640023 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 21:25:35.640513 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.640497 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 21:25:35.640564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.640519 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 21:25:35.640715 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.640702 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-gl4bn\"" Apr 24 21:25:35.640830 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.640725 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 21:25:35.657914 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.657876 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8"] Apr 24 21:25:35.776545 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.776503 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/2508323d-2f83-44db-9bde-1caf102eba29-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-9bcc8\" (UID: \"2508323d-2f83-44db-9bde-1caf102eba29\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:35.776720 
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.776609 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fknr\" (UniqueName: \"kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-kube-api-access-2fknr\") pod \"keda-metrics-apiserver-7c9f485588-9bcc8\" (UID: \"2508323d-2f83-44db-9bde-1caf102eba29\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:35.776720 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.776655 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9bcc8\" (UID: \"2508323d-2f83-44db-9bde-1caf102eba29\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:35.877303 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.877270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fknr\" (UniqueName: \"kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-kube-api-access-2fknr\") pod \"keda-metrics-apiserver-7c9f485588-9bcc8\" (UID: \"2508323d-2f83-44db-9bde-1caf102eba29\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:35.877509 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.877313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9bcc8\" (UID: \"2508323d-2f83-44db-9bde-1caf102eba29\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:35.877509 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.877344 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: 
\"kubernetes.io/empty-dir/2508323d-2f83-44db-9bde-1caf102eba29-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-9bcc8\" (UID: \"2508323d-2f83-44db-9bde-1caf102eba29\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:35.877509 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:25:35.877469 2573 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:25:35.877509 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:25:35.877484 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:25:35.877509 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:25:35.877502 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8: references non-existent secret key: tls.crt Apr 24 21:25:35.877780 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:25:35.877555 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-certificates podName:2508323d-2f83-44db-9bde-1caf102eba29 nodeName:}" failed. No retries permitted until 2026-04-24 21:25:36.377538177 +0000 UTC m=+558.783379606 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-certificates") pod "keda-metrics-apiserver-7c9f485588-9bcc8" (UID: "2508323d-2f83-44db-9bde-1caf102eba29") : references non-existent secret key: tls.crt Apr 24 21:25:35.877780 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.877750 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/2508323d-2f83-44db-9bde-1caf102eba29-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-9bcc8\" (UID: \"2508323d-2f83-44db-9bde-1caf102eba29\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:35.889759 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.889734 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fknr\" (UniqueName: \"kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-kube-api-access-2fknr\") pod \"keda-metrics-apiserver-7c9f485588-9bcc8\" (UID: \"2508323d-2f83-44db-9bde-1caf102eba29\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:35.949266 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.949229 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-6zmp5"] Apr 24 21:25:35.951441 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.951423 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6zmp5" Apr 24 21:25:35.957943 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.957911 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 21:25:35.964895 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:35.964874 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6zmp5"] Apr 24 21:25:36.079133 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:36.079099 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hmwq\" (UniqueName: \"kubernetes.io/projected/7f12a9e8-fed6-4fe1-a4f0-fb9793707763-kube-api-access-6hmwq\") pod \"keda-admission-cf49989db-6zmp5\" (UID: \"7f12a9e8-fed6-4fe1-a4f0-fb9793707763\") " pod="openshift-keda/keda-admission-cf49989db-6zmp5" Apr 24 21:25:36.079298 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:36.079202 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7f12a9e8-fed6-4fe1-a4f0-fb9793707763-certificates\") pod \"keda-admission-cf49989db-6zmp5\" (UID: \"7f12a9e8-fed6-4fe1-a4f0-fb9793707763\") " pod="openshift-keda/keda-admission-cf49989db-6zmp5" Apr 24 21:25:36.180398 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:36.180286 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hmwq\" (UniqueName: \"kubernetes.io/projected/7f12a9e8-fed6-4fe1-a4f0-fb9793707763-kube-api-access-6hmwq\") pod \"keda-admission-cf49989db-6zmp5\" (UID: \"7f12a9e8-fed6-4fe1-a4f0-fb9793707763\") " pod="openshift-keda/keda-admission-cf49989db-6zmp5" Apr 24 21:25:36.180398 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:36.180390 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/7f12a9e8-fed6-4fe1-a4f0-fb9793707763-certificates\") pod \"keda-admission-cf49989db-6zmp5\" (UID: \"7f12a9e8-fed6-4fe1-a4f0-fb9793707763\") " pod="openshift-keda/keda-admission-cf49989db-6zmp5" Apr 24 21:25:36.182876 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:36.182857 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7f12a9e8-fed6-4fe1-a4f0-fb9793707763-certificates\") pod \"keda-admission-cf49989db-6zmp5\" (UID: \"7f12a9e8-fed6-4fe1-a4f0-fb9793707763\") " pod="openshift-keda/keda-admission-cf49989db-6zmp5" Apr 24 21:25:36.192390 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:36.192347 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hmwq\" (UniqueName: \"kubernetes.io/projected/7f12a9e8-fed6-4fe1-a4f0-fb9793707763-kube-api-access-6hmwq\") pod \"keda-admission-cf49989db-6zmp5\" (UID: \"7f12a9e8-fed6-4fe1-a4f0-fb9793707763\") " pod="openshift-keda/keda-admission-cf49989db-6zmp5" Apr 24 21:25:36.261694 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:36.261645 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6zmp5" Apr 24 21:25:36.382859 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:36.382829 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9bcc8\" (UID: \"2508323d-2f83-44db-9bde-1caf102eba29\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:36.383022 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:25:36.382947 2573 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:25:36.383022 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:25:36.382959 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:25:36.383022 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:25:36.382976 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8: references non-existent secret key: tls.crt Apr 24 21:25:36.383114 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:25:36.383038 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-certificates podName:2508323d-2f83-44db-9bde-1caf102eba29 nodeName:}" failed. No retries permitted until 2026-04-24 21:25:37.383023029 +0000 UTC m=+559.788864463 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-certificates") pod "keda-metrics-apiserver-7c9f485588-9bcc8" (UID: "2508323d-2f83-44db-9bde-1caf102eba29") : references non-existent secret key: tls.crt Apr 24 21:25:36.389688 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:36.389653 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6zmp5"] Apr 24 21:25:36.393920 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:25:36.393894 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f12a9e8_fed6_4fe1_a4f0_fb9793707763.slice/crio-fe7189fa8bca5e96ce3b9caad5937cad8eefb10393e3a4e2299b8365d838fd4a WatchSource:0}: Error finding container fe7189fa8bca5e96ce3b9caad5937cad8eefb10393e3a4e2299b8365d838fd4a: Status 404 returned error can't find the container with id fe7189fa8bca5e96ce3b9caad5937cad8eefb10393e3a4e2299b8365d838fd4a Apr 24 21:25:36.394972 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:36.394956 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:25:36.929239 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:36.929205 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6zmp5" event={"ID":"7f12a9e8-fed6-4fe1-a4f0-fb9793707763","Type":"ContainerStarted","Data":"fe7189fa8bca5e96ce3b9caad5937cad8eefb10393e3a4e2299b8365d838fd4a"} Apr 24 21:25:37.393461 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:37.393419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9bcc8\" (UID: \"2508323d-2f83-44db-9bde-1caf102eba29\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 
21:25:37.393664 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:25:37.393565 2573 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:25:37.393664 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:25:37.393588 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:25:37.393664 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:25:37.393612 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8: references non-existent secret key: tls.crt Apr 24 21:25:37.393828 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:25:37.393677 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-certificates podName:2508323d-2f83-44db-9bde-1caf102eba29 nodeName:}" failed. No retries permitted until 2026-04-24 21:25:39.393657211 +0000 UTC m=+561.799498642 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-certificates") pod "keda-metrics-apiserver-7c9f485588-9bcc8" (UID: "2508323d-2f83-44db-9bde-1caf102eba29") : references non-existent secret key: tls.crt Apr 24 21:25:38.936484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:38.936393 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6zmp5" event={"ID":"7f12a9e8-fed6-4fe1-a4f0-fb9793707763","Type":"ContainerStarted","Data":"314bfa834a526e89a71e60eb876e4bb765c744de1fe07684373006443255d59f"} Apr 24 21:25:38.936904 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:38.936522 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-6zmp5" Apr 24 21:25:38.954625 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:38.954577 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-6zmp5" podStartSLOduration=1.706553391 podStartE2EDuration="3.954563952s" podCreationTimestamp="2026-04-24 21:25:35 +0000 UTC" firstStartedPulling="2026-04-24 21:25:36.395075546 +0000 UTC m=+558.800916975" lastFinishedPulling="2026-04-24 21:25:38.643086103 +0000 UTC m=+561.048927536" observedRunningTime="2026-04-24 21:25:38.953074475 +0000 UTC m=+561.358915947" watchObservedRunningTime="2026-04-24 21:25:38.954563952 +0000 UTC m=+561.360405402" Apr 24 21:25:39.411749 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:39.411714 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9bcc8\" (UID: \"2508323d-2f83-44db-9bde-1caf102eba29\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:39.414254 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:39.414233 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2508323d-2f83-44db-9bde-1caf102eba29-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9bcc8\" (UID: \"2508323d-2f83-44db-9bde-1caf102eba29\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:39.538718 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:39.538684 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:39.656365 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:39.656209 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8"] Apr 24 21:25:39.658304 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:25:39.658280 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2508323d_2f83_44db_9bde_1caf102eba29.slice/crio-0a6744c67ab36eede46511b2b94c9e93371c1afe00b832f2a6df10d63fc87243 WatchSource:0}: Error finding container 0a6744c67ab36eede46511b2b94c9e93371c1afe00b832f2a6df10d63fc87243: Status 404 returned error can't find the container with id 0a6744c67ab36eede46511b2b94c9e93371c1afe00b832f2a6df10d63fc87243 Apr 24 21:25:39.940654 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:39.940619 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" event={"ID":"2508323d-2f83-44db-9bde-1caf102eba29","Type":"ContainerStarted","Data":"0a6744c67ab36eede46511b2b94c9e93371c1afe00b832f2a6df10d63fc87243"} Apr 24 21:25:42.950481 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:42.950448 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" 
event={"ID":"2508323d-2f83-44db-9bde-1caf102eba29","Type":"ContainerStarted","Data":"1e04243e61953420cd18b1760ab3d77b93e6b17f792c824ced1c2aec78434cc8"} Apr 24 21:25:42.950834 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:42.950496 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:42.970921 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:42.970873 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" podStartSLOduration=5.659789129 podStartE2EDuration="7.970859753s" podCreationTimestamp="2026-04-24 21:25:35 +0000 UTC" firstStartedPulling="2026-04-24 21:25:39.659656547 +0000 UTC m=+562.065497976" lastFinishedPulling="2026-04-24 21:25:41.970727165 +0000 UTC m=+564.376568600" observedRunningTime="2026-04-24 21:25:42.96941699 +0000 UTC m=+565.375258443" watchObservedRunningTime="2026-04-24 21:25:42.970859753 +0000 UTC m=+565.376701204" Apr 24 21:25:53.958569 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:53.958540 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9bcc8" Apr 24 21:25:59.943664 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:25:59.943632 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-6zmp5" Apr 24 21:26:18.096478 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:18.096438 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log" Apr 24 21:26:18.098090 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:18.098070 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log" Apr 24 
21:26:43.357585 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.357513 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-bht2g"] Apr 24 21:26:43.360739 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.360722 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g" Apr 24 21:26:43.363551 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.363533 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-kmd57\"" Apr 24 21:26:43.364422 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.364404 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:26:43.364422 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.364415 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:26:43.364566 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.364415 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 21:26:43.377934 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.377913 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-bht2g"] Apr 24 21:26:43.502513 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.502475 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71a05be0-2a89-4f0c-8361-cb8a2a4325c0-cert\") pod \"kserve-controller-manager-74fc8f6f96-bht2g\" (UID: \"71a05be0-2a89-4f0c-8361-cb8a2a4325c0\") " pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g" Apr 24 21:26:43.502683 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.502528 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-r748r\" (UniqueName: \"kubernetes.io/projected/71a05be0-2a89-4f0c-8361-cb8a2a4325c0-kube-api-access-r748r\") pod \"kserve-controller-manager-74fc8f6f96-bht2g\" (UID: \"71a05be0-2a89-4f0c-8361-cb8a2a4325c0\") " pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g" Apr 24 21:26:43.603444 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.603397 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71a05be0-2a89-4f0c-8361-cb8a2a4325c0-cert\") pod \"kserve-controller-manager-74fc8f6f96-bht2g\" (UID: \"71a05be0-2a89-4f0c-8361-cb8a2a4325c0\") " pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g" Apr 24 21:26:43.603444 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.603459 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r748r\" (UniqueName: \"kubernetes.io/projected/71a05be0-2a89-4f0c-8361-cb8a2a4325c0-kube-api-access-r748r\") pod \"kserve-controller-manager-74fc8f6f96-bht2g\" (UID: \"71a05be0-2a89-4f0c-8361-cb8a2a4325c0\") " pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g" Apr 24 21:26:43.605868 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.605848 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71a05be0-2a89-4f0c-8361-cb8a2a4325c0-cert\") pod \"kserve-controller-manager-74fc8f6f96-bht2g\" (UID: \"71a05be0-2a89-4f0c-8361-cb8a2a4325c0\") " pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g" Apr 24 21:26:43.616635 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.616582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r748r\" (UniqueName: \"kubernetes.io/projected/71a05be0-2a89-4f0c-8361-cb8a2a4325c0-kube-api-access-r748r\") pod \"kserve-controller-manager-74fc8f6f96-bht2g\" (UID: \"71a05be0-2a89-4f0c-8361-cb8a2a4325c0\") " pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g" 
Apr 24 21:26:43.670295 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.670253 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g"
Apr 24 21:26:43.797269 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:43.797239 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-bht2g"]
Apr 24 21:26:43.801119 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:26:43.801091 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71a05be0_2a89_4f0c_8361_cb8a2a4325c0.slice/crio-8f0dcde926f155d18f3fd855286508b60e1bb9d0bbcf613a68e57f858b5663fa WatchSource:0}: Error finding container 8f0dcde926f155d18f3fd855286508b60e1bb9d0bbcf613a68e57f858b5663fa: Status 404 returned error can't find the container with id 8f0dcde926f155d18f3fd855286508b60e1bb9d0bbcf613a68e57f858b5663fa
Apr 24 21:26:44.117914 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:44.117878 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g" event={"ID":"71a05be0-2a89-4f0c-8361-cb8a2a4325c0","Type":"ContainerStarted","Data":"8f0dcde926f155d18f3fd855286508b60e1bb9d0bbcf613a68e57f858b5663fa"}
Apr 24 21:26:47.129711 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:47.129675 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g" event={"ID":"71a05be0-2a89-4f0c-8361-cb8a2a4325c0","Type":"ContainerStarted","Data":"96356e5e025563388cecfd256935845a41d58fd8e682e3a4ae9f246b1e9fba3c"}
Apr 24 21:26:47.130143 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:47.129736 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g"
Apr 24 21:26:47.148486 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:26:47.148443 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g" podStartSLOduration=1.714116517 podStartE2EDuration="4.148431976s" podCreationTimestamp="2026-04-24 21:26:43 +0000 UTC" firstStartedPulling="2026-04-24 21:26:43.802411036 +0000 UTC m=+626.208252466" lastFinishedPulling="2026-04-24 21:26:46.236726482 +0000 UTC m=+628.642567925" observedRunningTime="2026-04-24 21:26:47.147378025 +0000 UTC m=+629.553219473" watchObservedRunningTime="2026-04-24 21:26:47.148431976 +0000 UTC m=+629.554273427"
Apr 24 21:27:18.138510 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:18.138479 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g"
Apr 24 21:27:19.955138 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:19.955103 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-bht2g"]
Apr 24 21:27:19.955596 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:19.955286 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g" podUID="71a05be0-2a89-4f0c-8361-cb8a2a4325c0" containerName="manager" containerID="cri-o://96356e5e025563388cecfd256935845a41d58fd8e682e3a4ae9f246b1e9fba3c" gracePeriod=10
Apr 24 21:27:19.979718 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:19.979693 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-sb79s"]
Apr 24 21:27:19.983026 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:19.983009 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-sb79s"
Apr 24 21:27:19.990233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:19.990210 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-sb79s"]
Apr 24 21:27:20.081026 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.080997 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtml7\" (UniqueName: \"kubernetes.io/projected/639e7e62-db4e-4982-927c-bce6d3c3cee3-kube-api-access-wtml7\") pod \"kserve-controller-manager-74fc8f6f96-sb79s\" (UID: \"639e7e62-db4e-4982-927c-bce6d3c3cee3\") " pod="kserve/kserve-controller-manager-74fc8f6f96-sb79s"
Apr 24 21:27:20.081161 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.081057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/639e7e62-db4e-4982-927c-bce6d3c3cee3-cert\") pod \"kserve-controller-manager-74fc8f6f96-sb79s\" (UID: \"639e7e62-db4e-4982-927c-bce6d3c3cee3\") " pod="kserve/kserve-controller-manager-74fc8f6f96-sb79s"
Apr 24 21:27:20.181383 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.181350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/639e7e62-db4e-4982-927c-bce6d3c3cee3-cert\") pod \"kserve-controller-manager-74fc8f6f96-sb79s\" (UID: \"639e7e62-db4e-4982-927c-bce6d3c3cee3\") " pod="kserve/kserve-controller-manager-74fc8f6f96-sb79s"
Apr 24 21:27:20.181497 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.181442 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtml7\" (UniqueName: \"kubernetes.io/projected/639e7e62-db4e-4982-927c-bce6d3c3cee3-kube-api-access-wtml7\") pod \"kserve-controller-manager-74fc8f6f96-sb79s\" (UID: \"639e7e62-db4e-4982-927c-bce6d3c3cee3\") " pod="kserve/kserve-controller-manager-74fc8f6f96-sb79s"
Apr 24 21:27:20.183803 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.183775 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/639e7e62-db4e-4982-927c-bce6d3c3cee3-cert\") pod \"kserve-controller-manager-74fc8f6f96-sb79s\" (UID: \"639e7e62-db4e-4982-927c-bce6d3c3cee3\") " pod="kserve/kserve-controller-manager-74fc8f6f96-sb79s"
Apr 24 21:27:20.184601 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.184583 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g"
Apr 24 21:27:20.189894 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.189872 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtml7\" (UniqueName: \"kubernetes.io/projected/639e7e62-db4e-4982-927c-bce6d3c3cee3-kube-api-access-wtml7\") pod \"kserve-controller-manager-74fc8f6f96-sb79s\" (UID: \"639e7e62-db4e-4982-927c-bce6d3c3cee3\") " pod="kserve/kserve-controller-manager-74fc8f6f96-sb79s"
Apr 24 21:27:20.223810 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.223770 2573 generic.go:358] "Generic (PLEG): container finished" podID="71a05be0-2a89-4f0c-8361-cb8a2a4325c0" containerID="96356e5e025563388cecfd256935845a41d58fd8e682e3a4ae9f246b1e9fba3c" exitCode=0
Apr 24 21:27:20.223977 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.223845 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g"
Apr 24 21:27:20.223977 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.223853 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g" event={"ID":"71a05be0-2a89-4f0c-8361-cb8a2a4325c0","Type":"ContainerDied","Data":"96356e5e025563388cecfd256935845a41d58fd8e682e3a4ae9f246b1e9fba3c"}
Apr 24 21:27:20.223977 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.223892 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-bht2g" event={"ID":"71a05be0-2a89-4f0c-8361-cb8a2a4325c0","Type":"ContainerDied","Data":"8f0dcde926f155d18f3fd855286508b60e1bb9d0bbcf613a68e57f858b5663fa"}
Apr 24 21:27:20.223977 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.223913 2573 scope.go:117] "RemoveContainer" containerID="96356e5e025563388cecfd256935845a41d58fd8e682e3a4ae9f246b1e9fba3c"
Apr 24 21:27:20.231209 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.231190 2573 scope.go:117] "RemoveContainer" containerID="96356e5e025563388cecfd256935845a41d58fd8e682e3a4ae9f246b1e9fba3c"
Apr 24 21:27:20.231501 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:27:20.231483 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96356e5e025563388cecfd256935845a41d58fd8e682e3a4ae9f246b1e9fba3c\": container with ID starting with 96356e5e025563388cecfd256935845a41d58fd8e682e3a4ae9f246b1e9fba3c not found: ID does not exist" containerID="96356e5e025563388cecfd256935845a41d58fd8e682e3a4ae9f246b1e9fba3c"
Apr 24 21:27:20.231576 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.231507 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96356e5e025563388cecfd256935845a41d58fd8e682e3a4ae9f246b1e9fba3c"} err="failed to get container status \"96356e5e025563388cecfd256935845a41d58fd8e682e3a4ae9f246b1e9fba3c\": rpc error: code = NotFound desc = could not find container \"96356e5e025563388cecfd256935845a41d58fd8e682e3a4ae9f246b1e9fba3c\": container with ID starting with 96356e5e025563388cecfd256935845a41d58fd8e682e3a4ae9f246b1e9fba3c not found: ID does not exist"
Apr 24 21:27:20.281999 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.281961 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71a05be0-2a89-4f0c-8361-cb8a2a4325c0-cert\") pod \"71a05be0-2a89-4f0c-8361-cb8a2a4325c0\" (UID: \"71a05be0-2a89-4f0c-8361-cb8a2a4325c0\") "
Apr 24 21:27:20.281999 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.282007 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r748r\" (UniqueName: \"kubernetes.io/projected/71a05be0-2a89-4f0c-8361-cb8a2a4325c0-kube-api-access-r748r\") pod \"71a05be0-2a89-4f0c-8361-cb8a2a4325c0\" (UID: \"71a05be0-2a89-4f0c-8361-cb8a2a4325c0\") "
Apr 24 21:27:20.284206 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.284179 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a05be0-2a89-4f0c-8361-cb8a2a4325c0-cert" (OuterVolumeSpecName: "cert") pod "71a05be0-2a89-4f0c-8361-cb8a2a4325c0" (UID: "71a05be0-2a89-4f0c-8361-cb8a2a4325c0"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:27:20.284206 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.284194 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a05be0-2a89-4f0c-8361-cb8a2a4325c0-kube-api-access-r748r" (OuterVolumeSpecName: "kube-api-access-r748r") pod "71a05be0-2a89-4f0c-8361-cb8a2a4325c0" (UID: "71a05be0-2a89-4f0c-8361-cb8a2a4325c0"). InnerVolumeSpecName "kube-api-access-r748r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:27:20.343092 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.343057 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-sb79s"
Apr 24 21:27:20.383160 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.383129 2573 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71a05be0-2a89-4f0c-8361-cb8a2a4325c0-cert\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:27:20.383160 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.383159 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r748r\" (UniqueName: \"kubernetes.io/projected/71a05be0-2a89-4f0c-8361-cb8a2a4325c0-kube-api-access-r748r\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:27:20.464716 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.464672 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-sb79s"]
Apr 24 21:27:20.467327 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:27:20.467300 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod639e7e62_db4e_4982_927c_bce6d3c3cee3.slice/crio-cbaf0be9e3a036b0a22c80d57d9fc73fa789524c10acd83f8d81c9adb746769e WatchSource:0}: Error finding container cbaf0be9e3a036b0a22c80d57d9fc73fa789524c10acd83f8d81c9adb746769e: Status 404 returned error can't find the container with id cbaf0be9e3a036b0a22c80d57d9fc73fa789524c10acd83f8d81c9adb746769e
Apr 24 21:27:20.549920 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.549887 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-bht2g"]
Apr 24 21:27:20.555431 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:20.555405 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-bht2g"]
Apr 24 21:27:21.231563 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:21.231525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-sb79s" event={"ID":"639e7e62-db4e-4982-927c-bce6d3c3cee3","Type":"ContainerStarted","Data":"e853c8df7f9010140e9f58b4e461712263858fd79c8cb55e18da76ccdc319780"}
Apr 24 21:27:21.231563 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:21.231562 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-sb79s" event={"ID":"639e7e62-db4e-4982-927c-bce6d3c3cee3","Type":"ContainerStarted","Data":"cbaf0be9e3a036b0a22c80d57d9fc73fa789524c10acd83f8d81c9adb746769e"}
Apr 24 21:27:21.231968 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:21.231593 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-74fc8f6f96-sb79s"
Apr 24 21:27:21.248092 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:21.248047 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-74fc8f6f96-sb79s" podStartSLOduration=1.9050527800000001 podStartE2EDuration="2.248036213s" podCreationTimestamp="2026-04-24 21:27:19 +0000 UTC" firstStartedPulling="2026-04-24 21:27:20.468574203 +0000 UTC m=+662.874415636" lastFinishedPulling="2026-04-24 21:27:20.811557622 +0000 UTC m=+663.217399069" observedRunningTime="2026-04-24 21:27:21.247420937 +0000 UTC m=+663.653262422" watchObservedRunningTime="2026-04-24 21:27:21.248036213 +0000 UTC m=+663.653877664"
Apr 24 21:27:22.210883 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:22.210849 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a05be0-2a89-4f0c-8361-cb8a2a4325c0" path="/var/lib/kubelet/pods/71a05be0-2a89-4f0c-8361-cb8a2a4325c0/volumes"
Apr 24 21:27:52.238267 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:52.238237 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-74fc8f6f96-sb79s"
Apr 24 21:27:53.126864 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.126834 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-nplhq"]
Apr 24 21:27:53.127129 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.127117 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71a05be0-2a89-4f0c-8361-cb8a2a4325c0" containerName="manager"
Apr 24 21:27:53.127184 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.127130 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a05be0-2a89-4f0c-8361-cb8a2a4325c0" containerName="manager"
Apr 24 21:27:53.127184 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.127182 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="71a05be0-2a89-4f0c-8361-cb8a2a4325c0" containerName="manager"
Apr 24 21:27:53.129979 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.129960 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-nplhq"
Apr 24 21:27:53.132476 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.132457 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 24 21:27:53.132593 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.132516 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-x5v6d\""
Apr 24 21:27:53.139846 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.139815 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-nplhq"]
Apr 24 21:27:53.148015 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.147984 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-6svcd"]
Apr 24 21:27:53.152569 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.152541 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-6svcd"
Apr 24 21:27:53.154975 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.154955 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-g64dr\""
Apr 24 21:27:53.155133 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.155116 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 24 21:27:53.158936 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.158913 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-6svcd"]
Apr 24 21:27:53.237223 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.237168 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrgb5\" (UniqueName: \"kubernetes.io/projected/d7ce6ad8-5c35-44df-b56f-728edbc122a6-kube-api-access-mrgb5\") pod \"model-serving-api-86f7b4b499-nplhq\" (UID: \"d7ce6ad8-5c35-44df-b56f-728edbc122a6\") " pod="kserve/model-serving-api-86f7b4b499-nplhq"
Apr 24 21:27:53.237223 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.237218 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ea0554-ebe2-401f-bfcd-ef36fd9ee74d-cert\") pod \"odh-model-controller-696fc77849-6svcd\" (UID: \"66ea0554-ebe2-401f-bfcd-ef36fd9ee74d\") " pod="kserve/odh-model-controller-696fc77849-6svcd"
Apr 24 21:27:53.237459 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.237338 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7ce6ad8-5c35-44df-b56f-728edbc122a6-tls-certs\") pod \"model-serving-api-86f7b4b499-nplhq\" (UID: \"d7ce6ad8-5c35-44df-b56f-728edbc122a6\") " pod="kserve/model-serving-api-86f7b4b499-nplhq"
Apr 24 21:27:53.237459 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.237421 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwc6k\" (UniqueName: \"kubernetes.io/projected/66ea0554-ebe2-401f-bfcd-ef36fd9ee74d-kube-api-access-dwc6k\") pod \"odh-model-controller-696fc77849-6svcd\" (UID: \"66ea0554-ebe2-401f-bfcd-ef36fd9ee74d\") " pod="kserve/odh-model-controller-696fc77849-6svcd"
Apr 24 21:27:53.338591 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.338551 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7ce6ad8-5c35-44df-b56f-728edbc122a6-tls-certs\") pod \"model-serving-api-86f7b4b499-nplhq\" (UID: \"d7ce6ad8-5c35-44df-b56f-728edbc122a6\") " pod="kserve/model-serving-api-86f7b4b499-nplhq"
Apr 24 21:27:53.339068 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.338616 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwc6k\" (UniqueName: \"kubernetes.io/projected/66ea0554-ebe2-401f-bfcd-ef36fd9ee74d-kube-api-access-dwc6k\") pod \"odh-model-controller-696fc77849-6svcd\" (UID: \"66ea0554-ebe2-401f-bfcd-ef36fd9ee74d\") " pod="kserve/odh-model-controller-696fc77849-6svcd"
Apr 24 21:27:53.339068 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.338648 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrgb5\" (UniqueName: \"kubernetes.io/projected/d7ce6ad8-5c35-44df-b56f-728edbc122a6-kube-api-access-mrgb5\") pod \"model-serving-api-86f7b4b499-nplhq\" (UID: \"d7ce6ad8-5c35-44df-b56f-728edbc122a6\") " pod="kserve/model-serving-api-86f7b4b499-nplhq"
Apr 24 21:27:53.339068 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.338670 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ea0554-ebe2-401f-bfcd-ef36fd9ee74d-cert\") pod \"odh-model-controller-696fc77849-6svcd\" (UID: \"66ea0554-ebe2-401f-bfcd-ef36fd9ee74d\") " pod="kserve/odh-model-controller-696fc77849-6svcd"
Apr 24 21:27:53.341019 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.340995 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7ce6ad8-5c35-44df-b56f-728edbc122a6-tls-certs\") pod \"model-serving-api-86f7b4b499-nplhq\" (UID: \"d7ce6ad8-5c35-44df-b56f-728edbc122a6\") " pod="kserve/model-serving-api-86f7b4b499-nplhq"
Apr 24 21:27:53.341116 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.341022 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ea0554-ebe2-401f-bfcd-ef36fd9ee74d-cert\") pod \"odh-model-controller-696fc77849-6svcd\" (UID: \"66ea0554-ebe2-401f-bfcd-ef36fd9ee74d\") " pod="kserve/odh-model-controller-696fc77849-6svcd"
Apr 24 21:27:53.347262 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.347241 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwc6k\" (UniqueName: \"kubernetes.io/projected/66ea0554-ebe2-401f-bfcd-ef36fd9ee74d-kube-api-access-dwc6k\") pod \"odh-model-controller-696fc77849-6svcd\" (UID: \"66ea0554-ebe2-401f-bfcd-ef36fd9ee74d\") " pod="kserve/odh-model-controller-696fc77849-6svcd"
Apr 24 21:27:53.347262 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.347251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrgb5\" (UniqueName: \"kubernetes.io/projected/d7ce6ad8-5c35-44df-b56f-728edbc122a6-kube-api-access-mrgb5\") pod \"model-serving-api-86f7b4b499-nplhq\" (UID: \"d7ce6ad8-5c35-44df-b56f-728edbc122a6\") " pod="kserve/model-serving-api-86f7b4b499-nplhq"
Apr 24 21:27:53.440073 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.439984 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-nplhq"
Apr 24 21:27:53.462775 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.462740 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-6svcd"
Apr 24 21:27:53.567482 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.567433 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-nplhq"]
Apr 24 21:27:53.569825 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:27:53.569795 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7ce6ad8_5c35_44df_b56f_728edbc122a6.slice/crio-6e84a02c833f8bf3832177c3148ed05330536e1047aa5d7a431263c7bca4f4d0 WatchSource:0}: Error finding container 6e84a02c833f8bf3832177c3148ed05330536e1047aa5d7a431263c7bca4f4d0: Status 404 returned error can't find the container with id 6e84a02c833f8bf3832177c3148ed05330536e1047aa5d7a431263c7bca4f4d0
Apr 24 21:27:53.593274 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:53.593251 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-6svcd"]
Apr 24 21:27:53.595733 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:27:53.595704 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66ea0554_ebe2_401f_bfcd_ef36fd9ee74d.slice/crio-86b688254b9605941f9054851a2362e88207984d574df69b15d115bb83950650 WatchSource:0}: Error finding container 86b688254b9605941f9054851a2362e88207984d574df69b15d115bb83950650: Status 404 returned error can't find the container with id 86b688254b9605941f9054851a2362e88207984d574df69b15d115bb83950650
Apr 24 21:27:54.325640 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:54.325565 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-6svcd" event={"ID":"66ea0554-ebe2-401f-bfcd-ef36fd9ee74d","Type":"ContainerStarted","Data":"86b688254b9605941f9054851a2362e88207984d574df69b15d115bb83950650"}
Apr 24 21:27:54.326948 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:54.326899 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-nplhq" event={"ID":"d7ce6ad8-5c35-44df-b56f-728edbc122a6","Type":"ContainerStarted","Data":"6e84a02c833f8bf3832177c3148ed05330536e1047aa5d7a431263c7bca4f4d0"}
Apr 24 21:27:56.334917 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:56.334852 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-nplhq" event={"ID":"d7ce6ad8-5c35-44df-b56f-728edbc122a6","Type":"ContainerStarted","Data":"604e6e835af7b1c67b9b3a7001ba310b2187d762f13187c0c130744524b13d03"}
Apr 24 21:27:56.334917 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:56.334917 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-nplhq"
Apr 24 21:27:56.336236 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:56.336214 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-6svcd" event={"ID":"66ea0554-ebe2-401f-bfcd-ef36fd9ee74d","Type":"ContainerStarted","Data":"8c0da89bc46e60ebb836cd9ed6d7a8d736c904fbb96ed0e0826c7c16040584d9"}
Apr 24 21:27:56.336332 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:56.336319 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-6svcd"
Apr 24 21:27:56.354367 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:56.354310 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-nplhq" podStartSLOduration=0.725448081 podStartE2EDuration="3.354295244s" podCreationTimestamp="2026-04-24 21:27:53 +0000 UTC" firstStartedPulling="2026-04-24 21:27:53.571754365 +0000 UTC m=+695.977595800" lastFinishedPulling="2026-04-24 21:27:56.20060153 +0000 UTC m=+698.606442963" observedRunningTime="2026-04-24 21:27:56.352022277 +0000 UTC m=+698.757863727" watchObservedRunningTime="2026-04-24 21:27:56.354295244 +0000 UTC m=+698.760136695"
Apr 24 21:27:56.370443 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:27:56.370396 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-6svcd" podStartSLOduration=0.716265412 podStartE2EDuration="3.370382331s" podCreationTimestamp="2026-04-24 21:27:53 +0000 UTC" firstStartedPulling="2026-04-24 21:27:53.596993958 +0000 UTC m=+696.002835388" lastFinishedPulling="2026-04-24 21:27:56.251110875 +0000 UTC m=+698.656952307" observedRunningTime="2026-04-24 21:27:56.368069406 +0000 UTC m=+698.773910878" watchObservedRunningTime="2026-04-24 21:27:56.370382331 +0000 UTC m=+698.776223781"
Apr 24 21:28:07.341376 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:07.341279 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-6svcd"
Apr 24 21:28:07.343221 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:07.343202 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-nplhq"
Apr 24 21:28:28.740052 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.740018 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"]
Apr 24 21:28:28.746901 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.746877 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"
Apr 24 21:28:28.749434 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.749410 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d41b1-predictor-serving-cert\""
Apr 24 21:28:28.749434 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.749429 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d41b1-kube-rbac-proxy-sar-config\""
Apr 24 21:28:28.750273 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.750257 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 21:28:28.750337 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.750257 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xkfdn\""
Apr 24 21:28:28.750337 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.750303 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 21:28:28.756130 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.756107 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"]
Apr 24 21:28:28.828823 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.828794 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54mhw\" (UniqueName: \"kubernetes.io/projected/7b76ea09-17c8-405a-affa-0054342b9b15-kube-api-access-54mhw\") pod \"success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2\" (UID: \"7b76ea09-17c8-405a-affa-0054342b9b15\") " pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"
Apr 24 21:28:28.828960 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.828833 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b76ea09-17c8-405a-affa-0054342b9b15-proxy-tls\") pod \"success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2\" (UID: \"7b76ea09-17c8-405a-affa-0054342b9b15\") " pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"
Apr 24 21:28:28.828960 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.828924 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-d41b1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b76ea09-17c8-405a-affa-0054342b9b15-success-200-isvc-d41b1-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2\" (UID: \"7b76ea09-17c8-405a-affa-0054342b9b15\") " pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"
Apr 24 21:28:28.930076 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.930048 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-d41b1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b76ea09-17c8-405a-affa-0054342b9b15-success-200-isvc-d41b1-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2\" (UID: \"7b76ea09-17c8-405a-affa-0054342b9b15\") " pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"
Apr 24 21:28:28.930239 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.930098 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54mhw\" (UniqueName: \"kubernetes.io/projected/7b76ea09-17c8-405a-affa-0054342b9b15-kube-api-access-54mhw\") pod \"success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2\" (UID: \"7b76ea09-17c8-405a-affa-0054342b9b15\") " pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"
Apr 24 21:28:28.930239 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.930124 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b76ea09-17c8-405a-affa-0054342b9b15-proxy-tls\") pod \"success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2\" (UID: \"7b76ea09-17c8-405a-affa-0054342b9b15\") " pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"
Apr 24 21:28:28.930704 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.930679 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-d41b1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b76ea09-17c8-405a-affa-0054342b9b15-success-200-isvc-d41b1-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2\" (UID: \"7b76ea09-17c8-405a-affa-0054342b9b15\") " pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"
Apr 24 21:28:28.932794 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.932770 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b76ea09-17c8-405a-affa-0054342b9b15-proxy-tls\") pod \"success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2\" (UID: \"7b76ea09-17c8-405a-affa-0054342b9b15\") " pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"
Apr 24 21:28:28.938818 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:28.938795 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54mhw\" (UniqueName: \"kubernetes.io/projected/7b76ea09-17c8-405a-affa-0054342b9b15-kube-api-access-54mhw\") pod \"success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2\" (UID: \"7b76ea09-17c8-405a-affa-0054342b9b15\") " pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"
Apr 24 21:28:29.057943 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.057873 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"
Apr 24 21:28:29.186028 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.186005 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"]
Apr 24 21:28:29.188868 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:28:29.188835 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b76ea09_17c8_405a_affa_0054342b9b15.slice/crio-e5181fe3033d051f918313cd0c99a40d398e9721db8fdf4b8da67ceebdf87dfb WatchSource:0}: Error finding container e5181fe3033d051f918313cd0c99a40d398e9721db8fdf4b8da67ceebdf87dfb: Status 404 returned error can't find the container with id e5181fe3033d051f918313cd0c99a40d398e9721db8fdf4b8da67ceebdf87dfb
Apr 24 21:28:29.391723 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.391633 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"]
Apr 24 21:28:29.396646 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.396624 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"
Apr 24 21:28:29.398901 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.398874 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-kube-rbac-proxy-sar-config\""
Apr 24 21:28:29.399577 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.399319 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-predictor-serving-cert\""
Apr 24 21:28:29.410745 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.410718 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"]
Apr 24 21:28:29.431093 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.431067 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" event={"ID":"7b76ea09-17c8-405a-affa-0054342b9b15","Type":"ContainerStarted","Data":"e5181fe3033d051f918313cd0c99a40d398e9721db8fdf4b8da67ceebdf87dfb"}
Apr 24 21:28:29.434537 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.434513 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45ff4ebd-30bb-4cf0-984b-e93e5a155663-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-vw4f4\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"
Apr 24 21:28:29.434597 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.434558 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bkcf\" (UniqueName: \"kubernetes.io/projected/45ff4ebd-30bb-4cf0-984b-e93e5a155663-kube-api-access-9bkcf\") pod \"isvc-xgboost-graph-predictor-669d8d6456-vw4f4\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"
Apr 24 21:28:29.434643 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.434629 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/45ff4ebd-30bb-4cf0-984b-e93e5a155663-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-vw4f4\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"
Apr 24 21:28:29.434698 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.434683 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/45ff4ebd-30bb-4cf0-984b-e93e5a155663-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-vw4f4\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"
Apr 24 21:28:29.535551 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.535519 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/45ff4ebd-30bb-4cf0-984b-e93e5a155663-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-vw4f4\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"
Apr 24 21:28:29.535551 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.535555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/45ff4ebd-30bb-4cf0-984b-e93e5a155663-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-vw4f4\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"
Apr 24 21:28:29.535728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.535611 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45ff4ebd-30bb-4cf0-984b-e93e5a155663-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-vw4f4\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"
Apr 24 21:28:29.535728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.535632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bkcf\" (UniqueName: \"kubernetes.io/projected/45ff4ebd-30bb-4cf0-984b-e93e5a155663-kube-api-access-9bkcf\") pod \"isvc-xgboost-graph-predictor-669d8d6456-vw4f4\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"
Apr 24 21:28:29.536041 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.536015 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45ff4ebd-30bb-4cf0-984b-e93e5a155663-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-vw4f4\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"
Apr 24 21:28:29.536257 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.536239 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/45ff4ebd-30bb-4cf0-984b-e93e5a155663-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-vw4f4\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"
Apr 24 21:28:29.537971
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.537951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/45ff4ebd-30bb-4cf0-984b-e93e5a155663-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-vw4f4\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" Apr 24 21:28:29.543843 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.543821 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bkcf\" (UniqueName: \"kubernetes.io/projected/45ff4ebd-30bb-4cf0-984b-e93e5a155663-kube-api-access-9bkcf\") pod \"isvc-xgboost-graph-predictor-669d8d6456-vw4f4\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" Apr 24 21:28:29.711971 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.711467 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" Apr 24 21:28:29.898728 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:29.898673 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"] Apr 24 21:28:29.916177 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:28:29.916125 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45ff4ebd_30bb_4cf0_984b_e93e5a155663.slice/crio-8561b46152be5e1eeb560bfeb72c67767fec01ae644e67978a8701d22e43acf4 WatchSource:0}: Error finding container 8561b46152be5e1eeb560bfeb72c67767fec01ae644e67978a8701d22e43acf4: Status 404 returned error can't find the container with id 8561b46152be5e1eeb560bfeb72c67767fec01ae644e67978a8701d22e43acf4 Apr 24 21:28:30.440702 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:30.440612 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" event={"ID":"45ff4ebd-30bb-4cf0-984b-e93e5a155663","Type":"ContainerStarted","Data":"8561b46152be5e1eeb560bfeb72c67767fec01ae644e67978a8701d22e43acf4"} Apr 24 21:28:42.484557 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:42.484510 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" event={"ID":"45ff4ebd-30bb-4cf0-984b-e93e5a155663","Type":"ContainerStarted","Data":"8798bd90bd27f938c51de09a073ff4464b0570f119e8acbc4efcd1f611d573f0"} Apr 24 21:28:42.485833 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:42.485809 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" event={"ID":"7b76ea09-17c8-405a-affa-0054342b9b15","Type":"ContainerStarted","Data":"3cef126522ccc41231db2955c8c28d998857c04b175c20dfdab59d19ac56586a"} Apr 24 21:28:45.497754 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:45.497656 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" event={"ID":"7b76ea09-17c8-405a-affa-0054342b9b15","Type":"ContainerStarted","Data":"c402f10dee015cb7d1eb8b9ef2748211bd2a7fe9de6f0f143f9bc57007b8edc4"} Apr 24 21:28:45.498131 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:45.497902 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" Apr 24 21:28:45.498131 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:45.498023 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" Apr 24 21:28:45.499500 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:45.499459 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" 
podUID="7b76ea09-17c8-405a-affa-0054342b9b15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 24 21:28:45.516245 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:45.516197 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" podStartSLOduration=2.042688505 podStartE2EDuration="17.516184081s" podCreationTimestamp="2026-04-24 21:28:28 +0000 UTC" firstStartedPulling="2026-04-24 21:28:29.192922964 +0000 UTC m=+731.598764393" lastFinishedPulling="2026-04-24 21:28:44.666418537 +0000 UTC m=+747.072259969" observedRunningTime="2026-04-24 21:28:45.515047348 +0000 UTC m=+747.920888810" watchObservedRunningTime="2026-04-24 21:28:45.516184081 +0000 UTC m=+747.922025534" Apr 24 21:28:46.502038 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:46.502005 2573 generic.go:358] "Generic (PLEG): container finished" podID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerID="8798bd90bd27f938c51de09a073ff4464b0570f119e8acbc4efcd1f611d573f0" exitCode=0 Apr 24 21:28:46.502492 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:46.502076 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" event={"ID":"45ff4ebd-30bb-4cf0-984b-e93e5a155663","Type":"ContainerDied","Data":"8798bd90bd27f938c51de09a073ff4464b0570f119e8acbc4efcd1f611d573f0"} Apr 24 21:28:46.502492 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:46.502466 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 24 21:28:51.507222 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:51.507194 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" Apr 24 21:28:51.507805 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:28:51.507777 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 24 21:29:01.507852 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:01.507798 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 24 21:29:06.567204 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:06.567171 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" event={"ID":"45ff4ebd-30bb-4cf0-984b-e93e5a155663","Type":"ContainerStarted","Data":"065010ead656a0ef72e2da87fb8376d13270ffceb0cc81155a5a3eff047af7b9"} Apr 24 21:29:06.567204 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:06.567212 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" event={"ID":"45ff4ebd-30bb-4cf0-984b-e93e5a155663","Type":"ContainerStarted","Data":"e41b7bdd59b11b62b000b363c5c39faf04004ee8065511e882da5fe4d061c604"} Apr 24 21:29:06.589283 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:06.589238 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" podStartSLOduration=1.676435363 podStartE2EDuration="37.589223947s" podCreationTimestamp="2026-04-24 21:28:29 +0000 UTC" firstStartedPulling="2026-04-24 21:28:29.919810474 +0000 UTC m=+732.325651909" 
lastFinishedPulling="2026-04-24 21:29:05.832599061 +0000 UTC m=+768.238440493" observedRunningTime="2026-04-24 21:29:06.588565577 +0000 UTC m=+768.994407024" watchObservedRunningTime="2026-04-24 21:29:06.589223947 +0000 UTC m=+768.995065397" Apr 24 21:29:11.508599 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:11.508558 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 24 21:29:11.568501 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:11.568460 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" Apr 24 21:29:11.568957 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:11.568931 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" Apr 24 21:29:11.570185 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:11.570150 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 24 21:29:11.574048 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:11.574030 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" Apr 24 21:29:11.581844 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:11.581817 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 24 21:29:12.584514 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:12.584472 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 24 21:29:21.508532 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:21.508493 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 24 21:29:22.584437 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:22.584392 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 24 21:29:31.509150 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:31.509116 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" Apr 24 21:29:32.585397 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:32.585329 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 24 21:29:42.585484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:42.585395 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 24 21:29:48.743218 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:48.743179 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v"] Apr 24 21:29:48.746665 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:48.746643 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" Apr 24 21:29:48.749038 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:48.749010 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-d41b1-serving-cert\"" Apr 24 21:29:48.749038 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:48.749020 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-d41b1-kube-rbac-proxy-sar-config\"" Apr 24 21:29:48.753539 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:48.753500 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v"] Apr 24 21:29:48.814042 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:48.814005 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/869d7610-9d15-4439-b9fb-dfe142a340d6-proxy-tls\") pod \"switch-graph-d41b1-655d9c9d4c-sgp2v\" (UID: \"869d7610-9d15-4439-b9fb-dfe142a340d6\") " pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" Apr 24 21:29:48.814203 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:48.814121 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/869d7610-9d15-4439-b9fb-dfe142a340d6-openshift-service-ca-bundle\") pod \"switch-graph-d41b1-655d9c9d4c-sgp2v\" (UID: \"869d7610-9d15-4439-b9fb-dfe142a340d6\") " pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" Apr 24 21:29:48.915150 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:48.915102 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/869d7610-9d15-4439-b9fb-dfe142a340d6-proxy-tls\") pod \"switch-graph-d41b1-655d9c9d4c-sgp2v\" (UID: \"869d7610-9d15-4439-b9fb-dfe142a340d6\") " pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" Apr 24 21:29:48.915350 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:48.915190 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/869d7610-9d15-4439-b9fb-dfe142a340d6-openshift-service-ca-bundle\") pod \"switch-graph-d41b1-655d9c9d4c-sgp2v\" (UID: \"869d7610-9d15-4439-b9fb-dfe142a340d6\") " pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" Apr 24 21:29:48.915350 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:29:48.915261 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-d41b1-serving-cert: secret "switch-graph-d41b1-serving-cert" not found Apr 24 21:29:48.915350 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:29:48.915346 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/869d7610-9d15-4439-b9fb-dfe142a340d6-proxy-tls podName:869d7610-9d15-4439-b9fb-dfe142a340d6 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:49.415326695 +0000 UTC m=+811.821168136 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/869d7610-9d15-4439-b9fb-dfe142a340d6-proxy-tls") pod "switch-graph-d41b1-655d9c9d4c-sgp2v" (UID: "869d7610-9d15-4439-b9fb-dfe142a340d6") : secret "switch-graph-d41b1-serving-cert" not found Apr 24 21:29:48.915922 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:48.915898 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/869d7610-9d15-4439-b9fb-dfe142a340d6-openshift-service-ca-bundle\") pod \"switch-graph-d41b1-655d9c9d4c-sgp2v\" (UID: \"869d7610-9d15-4439-b9fb-dfe142a340d6\") " pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" Apr 24 21:29:49.419059 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:49.419023 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/869d7610-9d15-4439-b9fb-dfe142a340d6-proxy-tls\") pod \"switch-graph-d41b1-655d9c9d4c-sgp2v\" (UID: \"869d7610-9d15-4439-b9fb-dfe142a340d6\") " pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" Apr 24 21:29:49.421553 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:49.421520 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/869d7610-9d15-4439-b9fb-dfe142a340d6-proxy-tls\") pod \"switch-graph-d41b1-655d9c9d4c-sgp2v\" (UID: \"869d7610-9d15-4439-b9fb-dfe142a340d6\") " pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" Apr 24 21:29:49.657677 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:49.657630 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" Apr 24 21:29:49.777888 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:49.777862 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v"] Apr 24 21:29:49.780635 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:29:49.780602 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod869d7610_9d15_4439_b9fb_dfe142a340d6.slice/crio-aae12265789a49e485328c921ed5b316efd45bf53fefe8dcbc8cefc0ca3efc90 WatchSource:0}: Error finding container aae12265789a49e485328c921ed5b316efd45bf53fefe8dcbc8cefc0ca3efc90: Status 404 returned error can't find the container with id aae12265789a49e485328c921ed5b316efd45bf53fefe8dcbc8cefc0ca3efc90 Apr 24 21:29:50.698649 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:50.698613 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" event={"ID":"869d7610-9d15-4439-b9fb-dfe142a340d6","Type":"ContainerStarted","Data":"aae12265789a49e485328c921ed5b316efd45bf53fefe8dcbc8cefc0ca3efc90"} Apr 24 21:29:52.585018 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:52.584977 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 24 21:29:52.706285 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:52.706253 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" event={"ID":"869d7610-9d15-4439-b9fb-dfe142a340d6","Type":"ContainerStarted","Data":"b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c"} Apr 24 21:29:52.706459 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:52.706385 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" Apr 24 21:29:52.729578 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:52.729533 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" podStartSLOduration=2.734072824 podStartE2EDuration="4.729518486s" podCreationTimestamp="2026-04-24 21:29:48 +0000 UTC" firstStartedPulling="2026-04-24 21:29:49.782180607 +0000 UTC m=+812.188022037" lastFinishedPulling="2026-04-24 21:29:51.777626267 +0000 UTC m=+814.183467699" observedRunningTime="2026-04-24 21:29:52.72874703 +0000 UTC m=+815.134588481" watchObservedRunningTime="2026-04-24 21:29:52.729518486 +0000 UTC m=+815.135359937" Apr 24 21:29:58.715841 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:29:58.715812 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" Apr 24 21:30:02.585164 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:02.585121 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 24 21:30:02.853165 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:02.853081 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v"] Apr 24 21:30:02.853420 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:02.853390 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" podUID="869d7610-9d15-4439-b9fb-dfe142a340d6" containerName="switch-graph-d41b1" containerID="cri-o://b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c" gracePeriod=30 Apr 24 21:30:02.999187 
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:02.999153 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"] Apr 24 21:30:02.999534 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:02.999488 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" containerName="kserve-container" containerID="cri-o://3cef126522ccc41231db2955c8c28d998857c04b175c20dfdab59d19ac56586a" gracePeriod=30 Apr 24 21:30:02.999644 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:02.999533 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" containerName="kube-rbac-proxy" containerID="cri-o://c402f10dee015cb7d1eb8b9ef2748211bd2a7fe9de6f0f143f9bc57007b8edc4" gracePeriod=30 Apr 24 21:30:03.196635 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.196601 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8"] Apr 24 21:30:03.200083 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.200065 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:03.202330 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.202302 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-b868d-predictor-serving-cert\"" Apr 24 21:30:03.202529 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.202392 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-b868d-kube-rbac-proxy-sar-config\"" Apr 24 21:30:03.207848 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.207825 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8"] Apr 24 21:30:03.334684 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.334647 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p52vb\" (UniqueName: \"kubernetes.io/projected/1b444c97-fd1e-4685-8d79-0790643b0ed3-kube-api-access-p52vb\") pod \"success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8\" (UID: \"1b444c97-fd1e-4685-8d79-0790643b0ed3\") " pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:03.334684 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.334689 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-b868d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b444c97-fd1e-4685-8d79-0790643b0ed3-success-200-isvc-b868d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8\" (UID: \"1b444c97-fd1e-4685-8d79-0790643b0ed3\") " pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:03.334894 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.334753 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b444c97-fd1e-4685-8d79-0790643b0ed3-proxy-tls\") pod \"success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8\" (UID: \"1b444c97-fd1e-4685-8d79-0790643b0ed3\") " pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:03.435678 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.435572 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-b868d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b444c97-fd1e-4685-8d79-0790643b0ed3-success-200-isvc-b868d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8\" (UID: \"1b444c97-fd1e-4685-8d79-0790643b0ed3\") " pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:03.435678 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.435632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b444c97-fd1e-4685-8d79-0790643b0ed3-proxy-tls\") pod \"success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8\" (UID: \"1b444c97-fd1e-4685-8d79-0790643b0ed3\") " pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:03.435922 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.435731 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p52vb\" (UniqueName: \"kubernetes.io/projected/1b444c97-fd1e-4685-8d79-0790643b0ed3-kube-api-access-p52vb\") pod \"success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8\" (UID: \"1b444c97-fd1e-4685-8d79-0790643b0ed3\") " pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:03.436329 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.436306 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-b868d-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/1b444c97-fd1e-4685-8d79-0790643b0ed3-success-200-isvc-b868d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8\" (UID: \"1b444c97-fd1e-4685-8d79-0790643b0ed3\") " pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:03.438000 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.437979 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b444c97-fd1e-4685-8d79-0790643b0ed3-proxy-tls\") pod \"success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8\" (UID: \"1b444c97-fd1e-4685-8d79-0790643b0ed3\") " pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:03.445072 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.445051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p52vb\" (UniqueName: \"kubernetes.io/projected/1b444c97-fd1e-4685-8d79-0790643b0ed3-kube-api-access-p52vb\") pod \"success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8\" (UID: \"1b444c97-fd1e-4685-8d79-0790643b0ed3\") " pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:03.512719 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.512679 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:03.641101 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.641079 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8"] Apr 24 21:30:03.643172 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:30:03.643140 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b444c97_fd1e_4685_8d79_0790643b0ed3.slice/crio-f25caf9a817685abd8e50867f6bac08b318df7c91c25ccd07ac9f7c9d7aacc45 WatchSource:0}: Error finding container f25caf9a817685abd8e50867f6bac08b318df7c91c25ccd07ac9f7c9d7aacc45: Status 404 returned error can't find the container with id f25caf9a817685abd8e50867f6bac08b318df7c91c25ccd07ac9f7c9d7aacc45 Apr 24 21:30:03.713600 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.713563 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" podUID="869d7610-9d15-4439-b9fb-dfe142a340d6" containerName="switch-graph-d41b1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:03.742433 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.742391 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" event={"ID":"1b444c97-fd1e-4685-8d79-0790643b0ed3","Type":"ContainerStarted","Data":"913c38bbf50ed902587c65875dfa8f070275721e1a3e83d0fa47a1e45ec82780"} Apr 24 21:30:03.742550 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.742438 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" event={"ID":"1b444c97-fd1e-4685-8d79-0790643b0ed3","Type":"ContainerStarted","Data":"f25caf9a817685abd8e50867f6bac08b318df7c91c25ccd07ac9f7c9d7aacc45"} Apr 24 21:30:03.743971 ip-10-0-128-21 kubenswrapper[2573]: 
I0424 21:30:03.743943 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b76ea09-17c8-405a-affa-0054342b9b15" containerID="c402f10dee015cb7d1eb8b9ef2748211bd2a7fe9de6f0f143f9bc57007b8edc4" exitCode=2 Apr 24 21:30:03.744068 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:03.743999 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" event={"ID":"7b76ea09-17c8-405a-affa-0054342b9b15","Type":"ContainerDied","Data":"c402f10dee015cb7d1eb8b9ef2748211bd2a7fe9de6f0f143f9bc57007b8edc4"} Apr 24 21:30:04.749120 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:04.749081 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" event={"ID":"1b444c97-fd1e-4685-8d79-0790643b0ed3","Type":"ContainerStarted","Data":"14030f48c1dea826e2512a4f5480912e178a38e685d417b36f5c1d6b3dce2052"} Apr 24 21:30:04.749608 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:04.749379 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:04.749608 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:04.749505 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:04.750911 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:04.750885 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 24 21:30:04.767479 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:04.767434 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" podStartSLOduration=1.76742124 podStartE2EDuration="1.76742124s" podCreationTimestamp="2026-04-24 21:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:04.765959026 +0000 UTC m=+827.171800492" watchObservedRunningTime="2026-04-24 21:30:04.76742124 +0000 UTC m=+827.173262690" Apr 24 21:30:05.752603 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:05.752568 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 24 21:30:06.548222 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.548200 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" Apr 24 21:30:06.663741 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.663654 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b76ea09-17c8-405a-affa-0054342b9b15-proxy-tls\") pod \"7b76ea09-17c8-405a-affa-0054342b9b15\" (UID: \"7b76ea09-17c8-405a-affa-0054342b9b15\") " Apr 24 21:30:06.663741 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.663728 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54mhw\" (UniqueName: \"kubernetes.io/projected/7b76ea09-17c8-405a-affa-0054342b9b15-kube-api-access-54mhw\") pod \"7b76ea09-17c8-405a-affa-0054342b9b15\" (UID: \"7b76ea09-17c8-405a-affa-0054342b9b15\") " Apr 24 21:30:06.663977 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.663792 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"success-200-isvc-d41b1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b76ea09-17c8-405a-affa-0054342b9b15-success-200-isvc-d41b1-kube-rbac-proxy-sar-config\") pod \"7b76ea09-17c8-405a-affa-0054342b9b15\" (UID: \"7b76ea09-17c8-405a-affa-0054342b9b15\") " Apr 24 21:30:06.664178 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.664148 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b76ea09-17c8-405a-affa-0054342b9b15-success-200-isvc-d41b1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-d41b1-kube-rbac-proxy-sar-config") pod "7b76ea09-17c8-405a-affa-0054342b9b15" (UID: "7b76ea09-17c8-405a-affa-0054342b9b15"). InnerVolumeSpecName "success-200-isvc-d41b1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:06.665909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.665886 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b76ea09-17c8-405a-affa-0054342b9b15-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7b76ea09-17c8-405a-affa-0054342b9b15" (UID: "7b76ea09-17c8-405a-affa-0054342b9b15"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:06.665991 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.665925 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b76ea09-17c8-405a-affa-0054342b9b15-kube-api-access-54mhw" (OuterVolumeSpecName: "kube-api-access-54mhw") pod "7b76ea09-17c8-405a-affa-0054342b9b15" (UID: "7b76ea09-17c8-405a-affa-0054342b9b15"). InnerVolumeSpecName "kube-api-access-54mhw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:06.757088 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.757055 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b76ea09-17c8-405a-affa-0054342b9b15" containerID="3cef126522ccc41231db2955c8c28d998857c04b175c20dfdab59d19ac56586a" exitCode=0 Apr 24 21:30:06.757466 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.757133 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" Apr 24 21:30:06.757466 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.757133 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" event={"ID":"7b76ea09-17c8-405a-affa-0054342b9b15","Type":"ContainerDied","Data":"3cef126522ccc41231db2955c8c28d998857c04b175c20dfdab59d19ac56586a"} Apr 24 21:30:06.757466 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.757234 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" event={"ID":"7b76ea09-17c8-405a-affa-0054342b9b15","Type":"ContainerDied","Data":"e5181fe3033d051f918313cd0c99a40d398e9721db8fdf4b8da67ceebdf87dfb"} Apr 24 21:30:06.757466 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.757249 2573 scope.go:117] "RemoveContainer" containerID="c402f10dee015cb7d1eb8b9ef2748211bd2a7fe9de6f0f143f9bc57007b8edc4" Apr 24 21:30:06.764664 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.764642 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b76ea09-17c8-405a-affa-0054342b9b15-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:30:06.764736 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.764673 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-54mhw\" (UniqueName: 
\"kubernetes.io/projected/7b76ea09-17c8-405a-affa-0054342b9b15-kube-api-access-54mhw\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:30:06.764736 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.764688 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-d41b1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b76ea09-17c8-405a-affa-0054342b9b15-success-200-isvc-d41b1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:30:06.765318 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.765298 2573 scope.go:117] "RemoveContainer" containerID="3cef126522ccc41231db2955c8c28d998857c04b175c20dfdab59d19ac56586a" Apr 24 21:30:06.771990 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.771973 2573 scope.go:117] "RemoveContainer" containerID="c402f10dee015cb7d1eb8b9ef2748211bd2a7fe9de6f0f143f9bc57007b8edc4" Apr 24 21:30:06.772230 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:30:06.772211 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c402f10dee015cb7d1eb8b9ef2748211bd2a7fe9de6f0f143f9bc57007b8edc4\": container with ID starting with c402f10dee015cb7d1eb8b9ef2748211bd2a7fe9de6f0f143f9bc57007b8edc4 not found: ID does not exist" containerID="c402f10dee015cb7d1eb8b9ef2748211bd2a7fe9de6f0f143f9bc57007b8edc4" Apr 24 21:30:06.772298 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.772242 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c402f10dee015cb7d1eb8b9ef2748211bd2a7fe9de6f0f143f9bc57007b8edc4"} err="failed to get container status \"c402f10dee015cb7d1eb8b9ef2748211bd2a7fe9de6f0f143f9bc57007b8edc4\": rpc error: code = NotFound desc = could not find container \"c402f10dee015cb7d1eb8b9ef2748211bd2a7fe9de6f0f143f9bc57007b8edc4\": container with ID starting with c402f10dee015cb7d1eb8b9ef2748211bd2a7fe9de6f0f143f9bc57007b8edc4 not found: ID 
does not exist" Apr 24 21:30:06.772298 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.772265 2573 scope.go:117] "RemoveContainer" containerID="3cef126522ccc41231db2955c8c28d998857c04b175c20dfdab59d19ac56586a" Apr 24 21:30:06.772518 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:30:06.772498 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cef126522ccc41231db2955c8c28d998857c04b175c20dfdab59d19ac56586a\": container with ID starting with 3cef126522ccc41231db2955c8c28d998857c04b175c20dfdab59d19ac56586a not found: ID does not exist" containerID="3cef126522ccc41231db2955c8c28d998857c04b175c20dfdab59d19ac56586a" Apr 24 21:30:06.772561 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.772525 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cef126522ccc41231db2955c8c28d998857c04b175c20dfdab59d19ac56586a"} err="failed to get container status \"3cef126522ccc41231db2955c8c28d998857c04b175c20dfdab59d19ac56586a\": rpc error: code = NotFound desc = could not find container \"3cef126522ccc41231db2955c8c28d998857c04b175c20dfdab59d19ac56586a\": container with ID starting with 3cef126522ccc41231db2955c8c28d998857c04b175c20dfdab59d19ac56586a not found: ID does not exist" Apr 24 21:30:06.777749 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.777729 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"] Apr 24 21:30:06.781932 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:06.781914 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2"] Apr 24 21:30:07.502997 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:07.502948 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.27:8643/healthz\": context deadline exceeded" Apr 24 21:30:08.210462 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:08.210429 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" path="/var/lib/kubelet/pods/7b76ea09-17c8-405a-affa-0054342b9b15/volumes" Apr 24 21:30:08.713857 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:08.713809 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" podUID="869d7610-9d15-4439-b9fb-dfe142a340d6" containerName="switch-graph-d41b1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:10.757090 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:10.757061 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:10.757651 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:10.757624 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 24 21:30:12.585450 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:12.585419 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" Apr 24 21:30:13.713945 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:13.713904 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" podUID="869d7610-9d15-4439-b9fb-dfe142a340d6" containerName="switch-graph-d41b1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:13.714385 ip-10-0-128-21 kubenswrapper[2573]: I0424 
21:30:13.714028 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" Apr 24 21:30:18.713880 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:18.713841 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" podUID="869d7610-9d15-4439-b9fb-dfe142a340d6" containerName="switch-graph-d41b1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:20.758197 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:20.758154 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 24 21:30:23.713479 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:23.713433 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" podUID="869d7610-9d15-4439-b9fb-dfe142a340d6" containerName="switch-graph-d41b1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:28.714085 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.714045 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" podUID="869d7610-9d15-4439-b9fb-dfe142a340d6" containerName="switch-graph-d41b1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:28.767803 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.767762 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x"] Apr 24 21:30:28.768123 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.768111 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" 
containerName="kube-rbac-proxy" Apr 24 21:30:28.768164 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.768125 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" containerName="kube-rbac-proxy" Apr 24 21:30:28.768164 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.768138 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" containerName="kserve-container" Apr 24 21:30:28.768164 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.768144 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" containerName="kserve-container" Apr 24 21:30:28.768256 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.768197 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" containerName="kserve-container" Apr 24 21:30:28.768256 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.768207 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b76ea09-17c8-405a-affa-0054342b9b15" containerName="kube-rbac-proxy" Apr 24 21:30:28.773810 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.773790 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" Apr 24 21:30:28.776285 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.776254 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 24 21:30:28.776437 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.776294 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 24 21:30:28.779243 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.779218 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x"] Apr 24 21:30:28.834922 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.834884 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-proxy-tls\") pod \"model-chainer-bfd66c7dd-dqj6x\" (UID: \"9ddb90b6-48de-4a26-87be-bafd5c0a59aa\") " pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" Apr 24 21:30:28.835076 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.834975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-openshift-service-ca-bundle\") pod \"model-chainer-bfd66c7dd-dqj6x\" (UID: \"9ddb90b6-48de-4a26-87be-bafd5c0a59aa\") " pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" Apr 24 21:30:28.936119 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.936079 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-proxy-tls\") pod \"model-chainer-bfd66c7dd-dqj6x\" (UID: \"9ddb90b6-48de-4a26-87be-bafd5c0a59aa\") " 
pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" Apr 24 21:30:28.936310 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.936132 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-openshift-service-ca-bundle\") pod \"model-chainer-bfd66c7dd-dqj6x\" (UID: \"9ddb90b6-48de-4a26-87be-bafd5c0a59aa\") " pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" Apr 24 21:30:28.936310 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:30:28.936218 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-serving-cert: secret "model-chainer-serving-cert" not found Apr 24 21:30:28.936310 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:30:28.936279 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-proxy-tls podName:9ddb90b6-48de-4a26-87be-bafd5c0a59aa nodeName:}" failed. No retries permitted until 2026-04-24 21:30:29.436264404 +0000 UTC m=+851.842105834 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-proxy-tls") pod "model-chainer-bfd66c7dd-dqj6x" (UID: "9ddb90b6-48de-4a26-87be-bafd5c0a59aa") : secret "model-chainer-serving-cert" not found Apr 24 21:30:28.936757 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:28.936740 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-openshift-service-ca-bundle\") pod \"model-chainer-bfd66c7dd-dqj6x\" (UID: \"9ddb90b6-48de-4a26-87be-bafd5c0a59aa\") " pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" Apr 24 21:30:29.440331 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:29.440280 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-proxy-tls\") pod \"model-chainer-bfd66c7dd-dqj6x\" (UID: \"9ddb90b6-48de-4a26-87be-bafd5c0a59aa\") " pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" Apr 24 21:30:29.442656 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:29.442629 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-proxy-tls\") pod \"model-chainer-bfd66c7dd-dqj6x\" (UID: \"9ddb90b6-48de-4a26-87be-bafd5c0a59aa\") " pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" Apr 24 21:30:29.685057 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:29.685020 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" Apr 24 21:30:29.801537 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:29.801504 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x"] Apr 24 21:30:29.831815 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:29.831780 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" event={"ID":"9ddb90b6-48de-4a26-87be-bafd5c0a59aa","Type":"ContainerStarted","Data":"37a754b5eb08516350f14fa50855d0446403cb33e9588e4b1df6c495a8821796"} Apr 24 21:30:30.757964 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:30.757923 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 24 21:30:30.836244 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:30.836204 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" event={"ID":"9ddb90b6-48de-4a26-87be-bafd5c0a59aa","Type":"ContainerStarted","Data":"f9c676c41a73d8dd0795291ca7b4d7b0e52234a190c3d3e357c168d8c7cf731a"} Apr 24 21:30:30.836648 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:30.836347 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" Apr 24 21:30:30.852229 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:30.852183 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" podStartSLOduration=2.852169153 podStartE2EDuration="2.852169153s" podCreationTimestamp="2026-04-24 21:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-24 21:30:30.850950155 +0000 UTC m=+853.256791604" watchObservedRunningTime="2026-04-24 21:30:30.852169153 +0000 UTC m=+853.258010603" Apr 24 21:30:32.931065 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:30:32.931018 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod869d7610_9d15_4439_b9fb_dfe142a340d6.slice/crio-conmon-b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod869d7610_9d15_4439_b9fb_dfe142a340d6.slice/crio-b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c.scope\": RecentStats: unable to find data in memory cache]" Apr 24 21:30:33.041602 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.041580 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" Apr 24 21:30:33.067518 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.067478 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/869d7610-9d15-4439-b9fb-dfe142a340d6-proxy-tls\") pod \"869d7610-9d15-4439-b9fb-dfe142a340d6\" (UID: \"869d7610-9d15-4439-b9fb-dfe142a340d6\") " Apr 24 21:30:33.067679 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.067543 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/869d7610-9d15-4439-b9fb-dfe142a340d6-openshift-service-ca-bundle\") pod \"869d7610-9d15-4439-b9fb-dfe142a340d6\" (UID: \"869d7610-9d15-4439-b9fb-dfe142a340d6\") " Apr 24 21:30:33.067928 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.067898 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/869d7610-9d15-4439-b9fb-dfe142a340d6-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "869d7610-9d15-4439-b9fb-dfe142a340d6" (UID: "869d7610-9d15-4439-b9fb-dfe142a340d6"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:33.069606 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.069580 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/869d7610-9d15-4439-b9fb-dfe142a340d6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "869d7610-9d15-4439-b9fb-dfe142a340d6" (UID: "869d7610-9d15-4439-b9fb-dfe142a340d6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:33.168627 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.168536 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/869d7610-9d15-4439-b9fb-dfe142a340d6-openshift-service-ca-bundle\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:30:33.168627 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.168573 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/869d7610-9d15-4439-b9fb-dfe142a340d6-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:30:33.847293 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.847254 2573 generic.go:358] "Generic (PLEG): container finished" podID="869d7610-9d15-4439-b9fb-dfe142a340d6" containerID="b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c" exitCode=137 Apr 24 21:30:33.847591 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.847315 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" Apr 24 21:30:33.847591 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.847340 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" event={"ID":"869d7610-9d15-4439-b9fb-dfe142a340d6","Type":"ContainerDied","Data":"b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c"} Apr 24 21:30:33.847591 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.847387 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v" event={"ID":"869d7610-9d15-4439-b9fb-dfe142a340d6","Type":"ContainerDied","Data":"aae12265789a49e485328c921ed5b316efd45bf53fefe8dcbc8cefc0ca3efc90"} Apr 24 21:30:33.847591 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.847402 2573 scope.go:117] "RemoveContainer" containerID="b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c" Apr 24 21:30:33.855409 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.855387 2573 scope.go:117] "RemoveContainer" containerID="b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c" Apr 24 21:30:33.855673 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:30:33.855646 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c\": container with ID starting with b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c not found: ID does not exist" containerID="b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c" Apr 24 21:30:33.855730 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.855685 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c"} err="failed to get container status 
\"b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c\": rpc error: code = NotFound desc = could not find container \"b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c\": container with ID starting with b42972bf67168d55620fd5e35686264150ae49ceb3da7291f73145823f74742c not found: ID does not exist" Apr 24 21:30:33.868143 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.868116 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v"] Apr 24 21:30:33.871160 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:33.871139 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v"] Apr 24 21:30:34.213326 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:34.213249 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869d7610-9d15-4439-b9fb-dfe142a340d6" path="/var/lib/kubelet/pods/869d7610-9d15-4439-b9fb-dfe142a340d6/volumes" Apr 24 21:30:36.845637 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:36.845608 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" Apr 24 21:30:38.848210 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:38.848177 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x"] Apr 24 21:30:38.848626 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:38.848480 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" podUID="9ddb90b6-48de-4a26-87be-bafd5c0a59aa" containerName="model-chainer" containerID="cri-o://f9c676c41a73d8dd0795291ca7b4d7b0e52234a190c3d3e357c168d8c7cf731a" gracePeriod=30 Apr 24 21:30:38.923065 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:38.923020 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"] Apr 24 
21:30:38.923380 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:38.923338 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kserve-container" containerID="cri-o://e41b7bdd59b11b62b000b363c5c39faf04004ee8065511e882da5fe4d061c604" gracePeriod=30 Apr 24 21:30:38.923519 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:38.923405 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kube-rbac-proxy" containerID="cri-o://065010ead656a0ef72e2da87fb8376d13270ffceb0cc81155a5a3eff047af7b9" gracePeriod=30 Apr 24 21:30:39.009410 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.009348 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z"] Apr 24 21:30:39.009901 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.009875 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="869d7610-9d15-4439-b9fb-dfe142a340d6" containerName="switch-graph-d41b1" Apr 24 21:30:39.009901 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.009901 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="869d7610-9d15-4439-b9fb-dfe142a340d6" containerName="switch-graph-d41b1" Apr 24 21:30:39.010052 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.009987 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="869d7610-9d15-4439-b9fb-dfe142a340d6" containerName="switch-graph-d41b1" Apr 24 21:30:39.013335 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.013314 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:39.015646 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.015623 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-3a238-predictor-serving-cert\"" Apr 24 21:30:39.015749 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.015657 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-3a238-kube-rbac-proxy-sar-config\"" Apr 24 21:30:39.022644 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.022623 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z"] Apr 24 21:30:39.112600 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.112506 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-3a238-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ef252636-56e1-4ba4-8b22-135c16a6121b-success-200-isvc-3a238-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-3a238-predictor-644f78f4dc-brf2z\" (UID: \"ef252636-56e1-4ba4-8b22-135c16a6121b\") " pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:39.112600 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.112563 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nwtm\" (UniqueName: \"kubernetes.io/projected/ef252636-56e1-4ba4-8b22-135c16a6121b-kube-api-access-2nwtm\") pod \"success-200-isvc-3a238-predictor-644f78f4dc-brf2z\" (UID: \"ef252636-56e1-4ba4-8b22-135c16a6121b\") " pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:39.112792 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.112637 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef252636-56e1-4ba4-8b22-135c16a6121b-proxy-tls\") pod \"success-200-isvc-3a238-predictor-644f78f4dc-brf2z\" (UID: \"ef252636-56e1-4ba4-8b22-135c16a6121b\") " pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:39.213690 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.213657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nwtm\" (UniqueName: \"kubernetes.io/projected/ef252636-56e1-4ba4-8b22-135c16a6121b-kube-api-access-2nwtm\") pod \"success-200-isvc-3a238-predictor-644f78f4dc-brf2z\" (UID: \"ef252636-56e1-4ba4-8b22-135c16a6121b\") " pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:39.213861 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.213700 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef252636-56e1-4ba4-8b22-135c16a6121b-proxy-tls\") pod \"success-200-isvc-3a238-predictor-644f78f4dc-brf2z\" (UID: \"ef252636-56e1-4ba4-8b22-135c16a6121b\") " pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:39.213861 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.213752 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-3a238-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ef252636-56e1-4ba4-8b22-135c16a6121b-success-200-isvc-3a238-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-3a238-predictor-644f78f4dc-brf2z\" (UID: \"ef252636-56e1-4ba4-8b22-135c16a6121b\") " pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:39.213940 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:30:39.213878 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-3a238-predictor-serving-cert: secret "success-200-isvc-3a238-predictor-serving-cert" not found Apr 24 
21:30:39.213983 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:30:39.213942 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef252636-56e1-4ba4-8b22-135c16a6121b-proxy-tls podName:ef252636-56e1-4ba4-8b22-135c16a6121b nodeName:}" failed. No retries permitted until 2026-04-24 21:30:39.713922848 +0000 UTC m=+862.119764281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ef252636-56e1-4ba4-8b22-135c16a6121b-proxy-tls") pod "success-200-isvc-3a238-predictor-644f78f4dc-brf2z" (UID: "ef252636-56e1-4ba4-8b22-135c16a6121b") : secret "success-200-isvc-3a238-predictor-serving-cert" not found Apr 24 21:30:39.214443 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.214421 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-3a238-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ef252636-56e1-4ba4-8b22-135c16a6121b-success-200-isvc-3a238-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-3a238-predictor-644f78f4dc-brf2z\" (UID: \"ef252636-56e1-4ba4-8b22-135c16a6121b\") " pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:39.234941 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.234914 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nwtm\" (UniqueName: \"kubernetes.io/projected/ef252636-56e1-4ba4-8b22-135c16a6121b-kube-api-access-2nwtm\") pod \"success-200-isvc-3a238-predictor-644f78f4dc-brf2z\" (UID: \"ef252636-56e1-4ba4-8b22-135c16a6121b\") " pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:39.717747 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.717709 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef252636-56e1-4ba4-8b22-135c16a6121b-proxy-tls\") pod \"success-200-isvc-3a238-predictor-644f78f4dc-brf2z\" 
(UID: \"ef252636-56e1-4ba4-8b22-135c16a6121b\") " pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:39.720057 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.720037 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef252636-56e1-4ba4-8b22-135c16a6121b-proxy-tls\") pod \"success-200-isvc-3a238-predictor-644f78f4dc-brf2z\" (UID: \"ef252636-56e1-4ba4-8b22-135c16a6121b\") " pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:39.870717 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.870680 2573 generic.go:358] "Generic (PLEG): container finished" podID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerID="065010ead656a0ef72e2da87fb8376d13270ffceb0cc81155a5a3eff047af7b9" exitCode=2 Apr 24 21:30:39.870717 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.870721 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" event={"ID":"45ff4ebd-30bb-4cf0-984b-e93e5a155663","Type":"ContainerDied","Data":"065010ead656a0ef72e2da87fb8376d13270ffceb0cc81155a5a3eff047af7b9"} Apr 24 21:30:39.925241 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:39.925190 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:40.043994 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:40.043960 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z"] Apr 24 21:30:40.047015 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:30:40.046977 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef252636_56e1_4ba4_8b22_135c16a6121b.slice/crio-a87eb842c1f3e75e7f06f819bba9685e46f9228f5482147d080a14c410bc0879 WatchSource:0}: Error finding container a87eb842c1f3e75e7f06f819bba9685e46f9228f5482147d080a14c410bc0879: Status 404 returned error can't find the container with id a87eb842c1f3e75e7f06f819bba9685e46f9228f5482147d080a14c410bc0879 Apr 24 21:30:40.048856 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:40.048838 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:30:40.757996 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:40.757959 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 24 21:30:40.875701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:40.875657 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" event={"ID":"ef252636-56e1-4ba4-8b22-135c16a6121b","Type":"ContainerStarted","Data":"c79a62804e069e7da724c64913dd11187db25a282e5875d3db64d52b4feeb8a8"} Apr 24 21:30:40.875701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:40.875706 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" event={"ID":"ef252636-56e1-4ba4-8b22-135c16a6121b","Type":"ContainerStarted","Data":"7fcdf122f05c10aed6b9d548c891b2701d1993cb6a0d6ff8cf2539bae6c20a62"} Apr 24 21:30:40.876116 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:40.875720 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" event={"ID":"ef252636-56e1-4ba4-8b22-135c16a6121b","Type":"ContainerStarted","Data":"a87eb842c1f3e75e7f06f819bba9685e46f9228f5482147d080a14c410bc0879"} Apr 24 21:30:40.876116 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:40.875881 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:40.876116 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:40.876008 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:40.877270 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:40.877248 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 24 21:30:40.895942 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:40.895897 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" podStartSLOduration=2.8958828 podStartE2EDuration="2.8958828s" podCreationTimestamp="2026-04-24 21:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:40.893968787 +0000 UTC m=+863.299810239" 
watchObservedRunningTime="2026-04-24 21:30:40.8958828 +0000 UTC m=+863.301724250" Apr 24 21:30:41.574782 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:41.574735 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.28:8643/healthz\": dial tcp 10.132.0.28:8643: connect: connection refused" Apr 24 21:30:41.843644 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:41.843543 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" podUID="9ddb90b6-48de-4a26-87be-bafd5c0a59aa" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:41.879850 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:41.879813 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 24 21:30:42.584927 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.584886 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 24 21:30:42.669346 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.669324 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" Apr 24 21:30:42.743367 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.743322 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bkcf\" (UniqueName: \"kubernetes.io/projected/45ff4ebd-30bb-4cf0-984b-e93e5a155663-kube-api-access-9bkcf\") pod \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " Apr 24 21:30:42.743548 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.743394 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/45ff4ebd-30bb-4cf0-984b-e93e5a155663-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " Apr 24 21:30:42.743548 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.743437 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45ff4ebd-30bb-4cf0-984b-e93e5a155663-kserve-provision-location\") pod \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " Apr 24 21:30:42.743548 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.743458 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/45ff4ebd-30bb-4cf0-984b-e93e5a155663-proxy-tls\") pod \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\" (UID: \"45ff4ebd-30bb-4cf0-984b-e93e5a155663\") " Apr 24 21:30:42.743705 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.743682 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45ff4ebd-30bb-4cf0-984b-e93e5a155663-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"45ff4ebd-30bb-4cf0-984b-e93e5a155663" (UID: "45ff4ebd-30bb-4cf0-984b-e93e5a155663"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:30:42.743787 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.743759 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ff4ebd-30bb-4cf0-984b-e93e5a155663-isvc-xgboost-graph-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-kube-rbac-proxy-sar-config") pod "45ff4ebd-30bb-4cf0-984b-e93e5a155663" (UID: "45ff4ebd-30bb-4cf0-984b-e93e5a155663"). InnerVolumeSpecName "isvc-xgboost-graph-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:42.745456 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.745421 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ff4ebd-30bb-4cf0-984b-e93e5a155663-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "45ff4ebd-30bb-4cf0-984b-e93e5a155663" (UID: "45ff4ebd-30bb-4cf0-984b-e93e5a155663"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:42.745568 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.745485 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ff4ebd-30bb-4cf0-984b-e93e5a155663-kube-api-access-9bkcf" (OuterVolumeSpecName: "kube-api-access-9bkcf") pod "45ff4ebd-30bb-4cf0-984b-e93e5a155663" (UID: "45ff4ebd-30bb-4cf0-984b-e93e5a155663"). InnerVolumeSpecName "kube-api-access-9bkcf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:42.844172 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.844132 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/45ff4ebd-30bb-4cf0-984b-e93e5a155663-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:30:42.844172 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.844167 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45ff4ebd-30bb-4cf0-984b-e93e5a155663-kserve-provision-location\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:30:42.844405 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.844192 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/45ff4ebd-30bb-4cf0-984b-e93e5a155663-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:30:42.844405 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.844203 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9bkcf\" (UniqueName: \"kubernetes.io/projected/45ff4ebd-30bb-4cf0-984b-e93e5a155663-kube-api-access-9bkcf\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:30:42.884260 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.884227 2573 generic.go:358] "Generic (PLEG): container finished" podID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerID="e41b7bdd59b11b62b000b363c5c39faf04004ee8065511e882da5fe4d061c604" exitCode=0 Apr 24 21:30:42.884699 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.884298 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" 
event={"ID":"45ff4ebd-30bb-4cf0-984b-e93e5a155663","Type":"ContainerDied","Data":"e41b7bdd59b11b62b000b363c5c39faf04004ee8065511e882da5fe4d061c604"} Apr 24 21:30:42.884699 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.884308 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" Apr 24 21:30:42.884699 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.884324 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4" event={"ID":"45ff4ebd-30bb-4cf0-984b-e93e5a155663","Type":"ContainerDied","Data":"8561b46152be5e1eeb560bfeb72c67767fec01ae644e67978a8701d22e43acf4"} Apr 24 21:30:42.884699 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.884339 2573 scope.go:117] "RemoveContainer" containerID="065010ead656a0ef72e2da87fb8376d13270ffceb0cc81155a5a3eff047af7b9" Apr 24 21:30:42.892930 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.892909 2573 scope.go:117] "RemoveContainer" containerID="e41b7bdd59b11b62b000b363c5c39faf04004ee8065511e882da5fe4d061c604" Apr 24 21:30:42.902536 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.902512 2573 scope.go:117] "RemoveContainer" containerID="8798bd90bd27f938c51de09a073ff4464b0570f119e8acbc4efcd1f611d573f0" Apr 24 21:30:42.908253 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.908228 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"] Apr 24 21:30:42.910508 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.910488 2573 scope.go:117] "RemoveContainer" containerID="065010ead656a0ef72e2da87fb8376d13270ffceb0cc81155a5a3eff047af7b9" Apr 24 21:30:42.910782 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:30:42.910761 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"065010ead656a0ef72e2da87fb8376d13270ffceb0cc81155a5a3eff047af7b9\": container with ID starting with 065010ead656a0ef72e2da87fb8376d13270ffceb0cc81155a5a3eff047af7b9 not found: ID does not exist" containerID="065010ead656a0ef72e2da87fb8376d13270ffceb0cc81155a5a3eff047af7b9" Apr 24 21:30:42.910870 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.910795 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065010ead656a0ef72e2da87fb8376d13270ffceb0cc81155a5a3eff047af7b9"} err="failed to get container status \"065010ead656a0ef72e2da87fb8376d13270ffceb0cc81155a5a3eff047af7b9\": rpc error: code = NotFound desc = could not find container \"065010ead656a0ef72e2da87fb8376d13270ffceb0cc81155a5a3eff047af7b9\": container with ID starting with 065010ead656a0ef72e2da87fb8376d13270ffceb0cc81155a5a3eff047af7b9 not found: ID does not exist" Apr 24 21:30:42.910870 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.910820 2573 scope.go:117] "RemoveContainer" containerID="e41b7bdd59b11b62b000b363c5c39faf04004ee8065511e882da5fe4d061c604" Apr 24 21:30:42.911101 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:30:42.911085 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e41b7bdd59b11b62b000b363c5c39faf04004ee8065511e882da5fe4d061c604\": container with ID starting with e41b7bdd59b11b62b000b363c5c39faf04004ee8065511e882da5fe4d061c604 not found: ID does not exist" containerID="e41b7bdd59b11b62b000b363c5c39faf04004ee8065511e882da5fe4d061c604" Apr 24 21:30:42.911163 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.911109 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e41b7bdd59b11b62b000b363c5c39faf04004ee8065511e882da5fe4d061c604"} err="failed to get container status \"e41b7bdd59b11b62b000b363c5c39faf04004ee8065511e882da5fe4d061c604\": rpc error: code = NotFound desc = could not find container 
\"e41b7bdd59b11b62b000b363c5c39faf04004ee8065511e882da5fe4d061c604\": container with ID starting with e41b7bdd59b11b62b000b363c5c39faf04004ee8065511e882da5fe4d061c604 not found: ID does not exist" Apr 24 21:30:42.911163 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.911131 2573 scope.go:117] "RemoveContainer" containerID="8798bd90bd27f938c51de09a073ff4464b0570f119e8acbc4efcd1f611d573f0" Apr 24 21:30:42.911390 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:30:42.911348 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8798bd90bd27f938c51de09a073ff4464b0570f119e8acbc4efcd1f611d573f0\": container with ID starting with 8798bd90bd27f938c51de09a073ff4464b0570f119e8acbc4efcd1f611d573f0 not found: ID does not exist" containerID="8798bd90bd27f938c51de09a073ff4464b0570f119e8acbc4efcd1f611d573f0" Apr 24 21:30:42.911456 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.911397 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8798bd90bd27f938c51de09a073ff4464b0570f119e8acbc4efcd1f611d573f0"} err="failed to get container status \"8798bd90bd27f938c51de09a073ff4464b0570f119e8acbc4efcd1f611d573f0\": rpc error: code = NotFound desc = could not find container \"8798bd90bd27f938c51de09a073ff4464b0570f119e8acbc4efcd1f611d573f0\": container with ID starting with 8798bd90bd27f938c51de09a073ff4464b0570f119e8acbc4efcd1f611d573f0 not found: ID does not exist" Apr 24 21:30:42.911684 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:42.911667 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4"] Apr 24 21:30:44.210955 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:44.210920 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" path="/var/lib/kubelet/pods/45ff4ebd-30bb-4cf0-984b-e93e5a155663/volumes" Apr 24 21:30:46.843864 
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:46.843821 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" podUID="9ddb90b6-48de-4a26-87be-bafd5c0a59aa" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:46.883845 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:46.883807 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" Apr 24 21:30:46.884374 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:46.884328 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 24 21:30:50.758078 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:50.758052 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:30:51.843762 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:51.843726 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" podUID="9ddb90b6-48de-4a26-87be-bafd5c0a59aa" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:51.844131 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:51.843841 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" Apr 24 21:30:56.843855 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:56.843814 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" podUID="9ddb90b6-48de-4a26-87be-bafd5c0a59aa" containerName="model-chainer" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 24 21:30:56.885229 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:30:56.885191 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 24 21:31:01.843051 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:01.843013 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" podUID="9ddb90b6-48de-4a26-87be-bafd5c0a59aa" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:31:03.098398 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.098344 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"] Apr 24 21:31:03.098817 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.098696 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="storage-initializer" Apr 24 21:31:03.098817 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.098707 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="storage-initializer" Apr 24 21:31:03.098817 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.098722 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kube-rbac-proxy" Apr 24 21:31:03.098817 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.098727 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kube-rbac-proxy" Apr 24 21:31:03.098817 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.098738 2573 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kserve-container"
Apr 24 21:31:03.098817 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.098743 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kserve-container"
Apr 24 21:31:03.098817 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.098795 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kserve-container"
Apr 24 21:31:03.098817 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.098802 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="45ff4ebd-30bb-4cf0-984b-e93e5a155663" containerName="kube-rbac-proxy"
Apr 24 21:31:03.103299 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.103279 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"
Apr 24 21:31:03.105586 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.105564 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-b868d-serving-cert\""
Apr 24 21:31:03.105679 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.105565 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-b868d-kube-rbac-proxy-sar-config\""
Apr 24 21:31:03.112959 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.112931 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"]
Apr 24 21:31:03.217134 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.217098 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ea2ad55-ea51-4865-888c-7206e04f3c32-proxy-tls\") pod \"switch-graph-b868d-787648c54b-r6sn7\" (UID: \"7ea2ad55-ea51-4865-888c-7206e04f3c32\") " pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"
Apr 24 21:31:03.217298 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.217168 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ea2ad55-ea51-4865-888c-7206e04f3c32-openshift-service-ca-bundle\") pod \"switch-graph-b868d-787648c54b-r6sn7\" (UID: \"7ea2ad55-ea51-4865-888c-7206e04f3c32\") " pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"
Apr 24 21:31:03.318464 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.318436 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ea2ad55-ea51-4865-888c-7206e04f3c32-proxy-tls\") pod \"switch-graph-b868d-787648c54b-r6sn7\" (UID: \"7ea2ad55-ea51-4865-888c-7206e04f3c32\") " pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"
Apr 24 21:31:03.318644 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.318499 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ea2ad55-ea51-4865-888c-7206e04f3c32-openshift-service-ca-bundle\") pod \"switch-graph-b868d-787648c54b-r6sn7\" (UID: \"7ea2ad55-ea51-4865-888c-7206e04f3c32\") " pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"
Apr 24 21:31:03.319148 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.319125 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ea2ad55-ea51-4865-888c-7206e04f3c32-openshift-service-ca-bundle\") pod \"switch-graph-b868d-787648c54b-r6sn7\" (UID: \"7ea2ad55-ea51-4865-888c-7206e04f3c32\") " pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"
Apr 24 21:31:03.321037 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.321012 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ea2ad55-ea51-4865-888c-7206e04f3c32-proxy-tls\") pod \"switch-graph-b868d-787648c54b-r6sn7\" (UID: \"7ea2ad55-ea51-4865-888c-7206e04f3c32\") " pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"
Apr 24 21:31:03.435183 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.435094 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"
Apr 24 21:31:03.565132 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.565107 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"]
Apr 24 21:31:03.567729 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:31:03.567701 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ea2ad55_ea51_4865_888c_7206e04f3c32.slice/crio-7ff9766f40b2077c6b2a2d40fa4d6c8480c1118acce970930ef25b5a2bb06afe WatchSource:0}: Error finding container 7ff9766f40b2077c6b2a2d40fa4d6c8480c1118acce970930ef25b5a2bb06afe: Status 404 returned error can't find the container with id 7ff9766f40b2077c6b2a2d40fa4d6c8480c1118acce970930ef25b5a2bb06afe
Apr 24 21:31:03.952409 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.952344 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" event={"ID":"7ea2ad55-ea51-4865-888c-7206e04f3c32","Type":"ContainerStarted","Data":"9dd4fb3edc0466dd87c779888e2057903a420f0a388e54e0dc79920a2100a706"}
Apr 24 21:31:03.952409 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.952411 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" event={"ID":"7ea2ad55-ea51-4865-888c-7206e04f3c32","Type":"ContainerStarted","Data":"7ff9766f40b2077c6b2a2d40fa4d6c8480c1118acce970930ef25b5a2bb06afe"}
Apr 24 21:31:03.952647 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.952492 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"
Apr 24 21:31:03.968561 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:03.968493 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" podStartSLOduration=0.968477878 podStartE2EDuration="968.477878ms" podCreationTimestamp="2026-04-24 21:31:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:03.967512235 +0000 UTC m=+886.373353707" watchObservedRunningTime="2026-04-24 21:31:03.968477878 +0000 UTC m=+886.374319328"
Apr 24 21:31:06.843134 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:06.843047 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" podUID="9ddb90b6-48de-4a26-87be-bafd5c0a59aa" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:31:06.884602 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:06.884564 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 24 21:31:08.969648 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:08.969611 2573 generic.go:358] "Generic (PLEG): container finished" podID="9ddb90b6-48de-4a26-87be-bafd5c0a59aa" containerID="f9c676c41a73d8dd0795291ca7b4d7b0e52234a190c3d3e357c168d8c7cf731a" exitCode=0
Apr 24 21:31:08.970011 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:08.969682 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" event={"ID":"9ddb90b6-48de-4a26-87be-bafd5c0a59aa","Type":"ContainerDied","Data":"f9c676c41a73d8dd0795291ca7b4d7b0e52234a190c3d3e357c168d8c7cf731a"}
Apr 24 21:31:08.991775 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:08.991753 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x"
Apr 24 21:31:09.069874 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:09.069813 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-openshift-service-ca-bundle\") pod \"9ddb90b6-48de-4a26-87be-bafd5c0a59aa\" (UID: \"9ddb90b6-48de-4a26-87be-bafd5c0a59aa\") "
Apr 24 21:31:09.069874 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:09.069891 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-proxy-tls\") pod \"9ddb90b6-48de-4a26-87be-bafd5c0a59aa\" (UID: \"9ddb90b6-48de-4a26-87be-bafd5c0a59aa\") "
Apr 24 21:31:09.070218 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:09.070192 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "9ddb90b6-48de-4a26-87be-bafd5c0a59aa" (UID: "9ddb90b6-48de-4a26-87be-bafd5c0a59aa"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:09.071943 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:09.071919 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9ddb90b6-48de-4a26-87be-bafd5c0a59aa" (UID: "9ddb90b6-48de-4a26-87be-bafd5c0a59aa"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:09.171086 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:09.170998 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:31:09.171086 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:09.171033 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ddb90b6-48de-4a26-87be-bafd5c0a59aa-openshift-service-ca-bundle\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:31:09.961960 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:09.961932 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"
Apr 24 21:31:09.973735 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:09.973704 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x" event={"ID":"9ddb90b6-48de-4a26-87be-bafd5c0a59aa","Type":"ContainerDied","Data":"37a754b5eb08516350f14fa50855d0446403cb33e9588e4b1df6c495a8821796"}
Apr 24 21:31:09.974107 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:09.973746 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x"
Apr 24 21:31:09.974107 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:09.973752 2573 scope.go:117] "RemoveContainer" containerID="f9c676c41a73d8dd0795291ca7b4d7b0e52234a190c3d3e357c168d8c7cf731a"
Apr 24 21:31:10.018292 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:10.018257 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x"]
Apr 24 21:31:10.022162 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:10.022131 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x"]
Apr 24 21:31:10.211321 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:10.211288 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ddb90b6-48de-4a26-87be-bafd5c0a59aa" path="/var/lib/kubelet/pods/9ddb90b6-48de-4a26-87be-bafd5c0a59aa/volumes"
Apr 24 21:31:16.885006 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:16.884966 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 24 21:31:18.121163 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:18.121135 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 21:31:18.122205 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:18.122173 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 21:31:26.885173 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:26.885141 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z"
Apr 24 21:31:39.016474 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.016438 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"]
Apr 24 21:31:39.017238 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.016797 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ddb90b6-48de-4a26-87be-bafd5c0a59aa" containerName="model-chainer"
Apr 24 21:31:39.017238 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.016807 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ddb90b6-48de-4a26-87be-bafd5c0a59aa" containerName="model-chainer"
Apr 24 21:31:39.017238 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.016871 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ddb90b6-48de-4a26-87be-bafd5c0a59aa" containerName="model-chainer"
Apr 24 21:31:39.019532 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.019513 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"
Apr 24 21:31:39.022049 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.022025 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-3a238-serving-cert\""
Apr 24 21:31:39.022154 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.022028 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-3a238-kube-rbac-proxy-sar-config\""
Apr 24 21:31:39.026387 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.026349 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"]
Apr 24 21:31:39.113688 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.113652 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-proxy-tls\") pod \"sequence-graph-3a238-f58988f4b-8hmrl\" (UID: \"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2\") " pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"
Apr 24 21:31:39.113853 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.113768 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-openshift-service-ca-bundle\") pod \"sequence-graph-3a238-f58988f4b-8hmrl\" (UID: \"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2\") " pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"
Apr 24 21:31:39.214314 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.214271 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-openshift-service-ca-bundle\") pod \"sequence-graph-3a238-f58988f4b-8hmrl\" (UID: \"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2\") " pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"
Apr 24 21:31:39.214314 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.214321 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-proxy-tls\") pod \"sequence-graph-3a238-f58988f4b-8hmrl\" (UID: \"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2\") " pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"
Apr 24 21:31:39.214549 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:31:39.214458 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-3a238-serving-cert: secret "sequence-graph-3a238-serving-cert" not found
Apr 24 21:31:39.214549 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:31:39.214514 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-proxy-tls podName:e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:39.714497899 +0000 UTC m=+922.120339329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-proxy-tls") pod "sequence-graph-3a238-f58988f4b-8hmrl" (UID: "e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2") : secret "sequence-graph-3a238-serving-cert" not found
Apr 24 21:31:39.214941 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.214918 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-openshift-service-ca-bundle\") pod \"sequence-graph-3a238-f58988f4b-8hmrl\" (UID: \"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2\") " pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"
Apr 24 21:31:39.719414 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.719377 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-proxy-tls\") pod \"sequence-graph-3a238-f58988f4b-8hmrl\" (UID: \"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2\") " pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"
Apr 24 21:31:39.721833 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.721802 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-proxy-tls\") pod \"sequence-graph-3a238-f58988f4b-8hmrl\" (UID: \"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2\") " pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"
Apr 24 21:31:39.930101 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:39.930061 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"
Apr 24 21:31:40.054026 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:40.054002 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"]
Apr 24 21:31:40.056716 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:31:40.056682 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8bfc6b6_00e2_4dd7_9df2_44f98dbb24d2.slice/crio-f729af3cc5f34ab5dbf7babe591f247fc5687a263f2ac0b7dc21e754c69623b2 WatchSource:0}: Error finding container f729af3cc5f34ab5dbf7babe591f247fc5687a263f2ac0b7dc21e754c69623b2: Status 404 returned error can't find the container with id f729af3cc5f34ab5dbf7babe591f247fc5687a263f2ac0b7dc21e754c69623b2
Apr 24 21:31:40.072318 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:40.072288 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl" event={"ID":"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2","Type":"ContainerStarted","Data":"f729af3cc5f34ab5dbf7babe591f247fc5687a263f2ac0b7dc21e754c69623b2"}
Apr 24 21:31:41.076233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:41.076198 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl" event={"ID":"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2","Type":"ContainerStarted","Data":"d24868ea1eef50138883172b969eb602dbf7e1f6017e560cd6ec93f51b413ee3"}
Apr 24 21:31:41.076652 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:41.076297 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"
Apr 24 21:31:47.084943 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:47.084913 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"
Apr 24 21:31:47.102916 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:31:47.102866 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl" podStartSLOduration=8.102852362 podStartE2EDuration="8.102852362s" podCreationTimestamp="2026-04-24 21:31:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:41.090595821 +0000 UTC m=+923.496437275" watchObservedRunningTime="2026-04-24 21:31:47.102852362 +0000 UTC m=+929.508693813"
Apr 24 21:36:18.144176 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:36:18.144140 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 21:36:18.146246 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:36:18.146225 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 21:39:17.759960 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:17.759914 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"]
Apr 24 21:39:17.762493 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:17.760179 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" podUID="7ea2ad55-ea51-4865-888c-7206e04f3c32" containerName="switch-graph-b868d" containerID="cri-o://9dd4fb3edc0466dd87c779888e2057903a420f0a388e54e0dc79920a2100a706" gracePeriod=30
Apr 24 21:39:17.879012 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:17.878973 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8"]
Apr 24 21:39:17.879440 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:17.879380 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kserve-container" containerID="cri-o://913c38bbf50ed902587c65875dfa8f070275721e1a3e83d0fa47a1e45ec82780" gracePeriod=30
Apr 24 21:39:17.879591 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:17.879409 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kube-rbac-proxy" containerID="cri-o://14030f48c1dea826e2512a4f5480912e178a38e685d417b36f5c1d6b3dce2052" gracePeriod=30
Apr 24 21:39:17.970437 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:17.970401 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"]
Apr 24 21:39:17.973986 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:17.973965 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"
Apr 24 21:39:17.976448 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:17.976424 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f3231-kube-rbac-proxy-sar-config\""
Apr 24 21:39:17.976565 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:17.976424 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f3231-predictor-serving-cert\""
Apr 24 21:39:17.997197 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:17.997159 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"]
Apr 24 21:39:18.012774 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.012690 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9zlv\" (UniqueName: \"kubernetes.io/projected/13999f24-2195-46ee-a8de-d2687ebef412-kube-api-access-k9zlv\") pod \"success-200-isvc-f3231-predictor-788854545f-lsqvw\" (UID: \"13999f24-2195-46ee-a8de-d2687ebef412\") " pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"
Apr 24 21:39:18.012774 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.012737 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-f3231-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13999f24-2195-46ee-a8de-d2687ebef412-success-200-isvc-f3231-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f3231-predictor-788854545f-lsqvw\" (UID: \"13999f24-2195-46ee-a8de-d2687ebef412\") " pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"
Apr 24 21:39:18.013098 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.012803 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13999f24-2195-46ee-a8de-d2687ebef412-proxy-tls\") pod \"success-200-isvc-f3231-predictor-788854545f-lsqvw\" (UID: \"13999f24-2195-46ee-a8de-d2687ebef412\") " pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"
Apr 24 21:39:18.113563 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.113527 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9zlv\" (UniqueName: \"kubernetes.io/projected/13999f24-2195-46ee-a8de-d2687ebef412-kube-api-access-k9zlv\") pod \"success-200-isvc-f3231-predictor-788854545f-lsqvw\" (UID: \"13999f24-2195-46ee-a8de-d2687ebef412\") " pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"
Apr 24 21:39:18.113758 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.113582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-f3231-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13999f24-2195-46ee-a8de-d2687ebef412-success-200-isvc-f3231-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f3231-predictor-788854545f-lsqvw\" (UID: \"13999f24-2195-46ee-a8de-d2687ebef412\") " pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"
Apr 24 21:39:18.113758 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.113620 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13999f24-2195-46ee-a8de-d2687ebef412-proxy-tls\") pod \"success-200-isvc-f3231-predictor-788854545f-lsqvw\" (UID: \"13999f24-2195-46ee-a8de-d2687ebef412\") " pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"
Apr 24 21:39:18.116044 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.116019 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f3231-kube-rbac-proxy-sar-config\""
Apr 24 21:39:18.116548 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.116533 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f3231-predictor-serving-cert\""
Apr 24 21:39:18.122076 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.122048 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9zlv\" (UniqueName: \"kubernetes.io/projected/13999f24-2195-46ee-a8de-d2687ebef412-kube-api-access-k9zlv\") pod \"success-200-isvc-f3231-predictor-788854545f-lsqvw\" (UID: \"13999f24-2195-46ee-a8de-d2687ebef412\") " pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"
Apr 24 21:39:18.125283 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.125252 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-f3231-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13999f24-2195-46ee-a8de-d2687ebef412-success-200-isvc-f3231-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f3231-predictor-788854545f-lsqvw\" (UID: \"13999f24-2195-46ee-a8de-d2687ebef412\") " pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"
Apr 24 21:39:18.130413 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.130296 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13999f24-2195-46ee-a8de-d2687ebef412-proxy-tls\") pod \"success-200-isvc-f3231-predictor-788854545f-lsqvw\" (UID: \"13999f24-2195-46ee-a8de-d2687ebef412\") " pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"
Apr 24 21:39:18.285793 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.285703 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"
Apr 24 21:39:18.422161 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.422126 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"]
Apr 24 21:39:18.425059 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:39:18.425024 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13999f24_2195_46ee_a8de_d2687ebef412.slice/crio-c7f1ce9675db0311288b3bb62c9f962cf35b5a6da5d438411841b11bddc99dc9 WatchSource:0}: Error finding container c7f1ce9675db0311288b3bb62c9f962cf35b5a6da5d438411841b11bddc99dc9: Status 404 returned error can't find the container with id c7f1ce9675db0311288b3bb62c9f962cf35b5a6da5d438411841b11bddc99dc9
Apr 24 21:39:18.426925 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.426910 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:39:18.557649 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.557613 2573 generic.go:358] "Generic (PLEG): container finished" podID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerID="14030f48c1dea826e2512a4f5480912e178a38e685d417b36f5c1d6b3dce2052" exitCode=2
Apr 24 21:39:18.557933 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.557689 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" event={"ID":"1b444c97-fd1e-4685-8d79-0790643b0ed3","Type":"ContainerDied","Data":"14030f48c1dea826e2512a4f5480912e178a38e685d417b36f5c1d6b3dce2052"}
Apr 24 21:39:18.559214 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.559189 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" event={"ID":"13999f24-2195-46ee-a8de-d2687ebef412","Type":"ContainerStarted","Data":"f9f0dd289a065cb30f370471118946370842152e0348d92eefbbabe78be638c4"}
Apr 24 21:39:18.559316 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:18.559222 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" event={"ID":"13999f24-2195-46ee-a8de-d2687ebef412","Type":"ContainerStarted","Data":"c7f1ce9675db0311288b3bb62c9f962cf35b5a6da5d438411841b11bddc99dc9"}
Apr 24 21:39:19.563863 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:19.563817 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" event={"ID":"13999f24-2195-46ee-a8de-d2687ebef412","Type":"ContainerStarted","Data":"2683894bd8e01a2c077438fcc1cf66f0f6de270736eb9cc888fced74fe322b1c"}
Apr 24 21:39:19.564286 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:19.564071 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"
Apr 24 21:39:19.564286 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:19.564173 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"
Apr 24 21:39:19.565584 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:19.565559 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused"
Apr 24 21:39:19.590300 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:19.590242 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" podStartSLOduration=2.5902271040000002 podStartE2EDuration="2.590227104s" podCreationTimestamp="2026-04-24 21:39:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:39:19.58924313 +0000 UTC m=+1381.995084590" watchObservedRunningTime="2026-04-24 21:39:19.590227104 +0000 UTC m=+1381.996068553"
Apr 24 21:39:19.960245 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:19.960152 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" podUID="7ea2ad55-ea51-4865-888c-7206e04f3c32" containerName="switch-graph-b868d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:39:20.567008 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:20.566963 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused"
Apr 24 21:39:20.753485 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:20.753436 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused"
Apr 24 21:39:20.757784 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:20.757754 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 24 21:39:21.334295 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.334270 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8"
Apr 24 21:39:21.443305 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.443223 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p52vb\" (UniqueName: \"kubernetes.io/projected/1b444c97-fd1e-4685-8d79-0790643b0ed3-kube-api-access-p52vb\") pod \"1b444c97-fd1e-4685-8d79-0790643b0ed3\" (UID: \"1b444c97-fd1e-4685-8d79-0790643b0ed3\") "
Apr 24 21:39:21.443305 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.443259 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-b868d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b444c97-fd1e-4685-8d79-0790643b0ed3-success-200-isvc-b868d-kube-rbac-proxy-sar-config\") pod \"1b444c97-fd1e-4685-8d79-0790643b0ed3\" (UID: \"1b444c97-fd1e-4685-8d79-0790643b0ed3\") "
Apr 24 21:39:21.443534 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.443309 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b444c97-fd1e-4685-8d79-0790643b0ed3-proxy-tls\") pod \"1b444c97-fd1e-4685-8d79-0790643b0ed3\" (UID: \"1b444c97-fd1e-4685-8d79-0790643b0ed3\") "
Apr 24 21:39:21.443705 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.443676 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b444c97-fd1e-4685-8d79-0790643b0ed3-success-200-isvc-b868d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-b868d-kube-rbac-proxy-sar-config") pod "1b444c97-fd1e-4685-8d79-0790643b0ed3" (UID: "1b444c97-fd1e-4685-8d79-0790643b0ed3"). InnerVolumeSpecName "success-200-isvc-b868d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:39:21.445465 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.445435 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b444c97-fd1e-4685-8d79-0790643b0ed3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1b444c97-fd1e-4685-8d79-0790643b0ed3" (UID: "1b444c97-fd1e-4685-8d79-0790643b0ed3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:39:21.445585 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.445481 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b444c97-fd1e-4685-8d79-0790643b0ed3-kube-api-access-p52vb" (OuterVolumeSpecName: "kube-api-access-p52vb") pod "1b444c97-fd1e-4685-8d79-0790643b0ed3" (UID: "1b444c97-fd1e-4685-8d79-0790643b0ed3"). InnerVolumeSpecName "kube-api-access-p52vb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:39:21.544547 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.544502 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b444c97-fd1e-4685-8d79-0790643b0ed3-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:39:21.544547 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.544539 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p52vb\" (UniqueName: \"kubernetes.io/projected/1b444c97-fd1e-4685-8d79-0790643b0ed3-kube-api-access-p52vb\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:39:21.544547 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.544549 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-b868d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b444c97-fd1e-4685-8d79-0790643b0ed3-success-200-isvc-b868d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:39:21.571262
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.571222 2573 generic.go:358] "Generic (PLEG): container finished" podID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerID="913c38bbf50ed902587c65875dfa8f070275721e1a3e83d0fa47a1e45ec82780" exitCode=0 Apr 24 21:39:21.571674 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.571285 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" event={"ID":"1b444c97-fd1e-4685-8d79-0790643b0ed3","Type":"ContainerDied","Data":"913c38bbf50ed902587c65875dfa8f070275721e1a3e83d0fa47a1e45ec82780"} Apr 24 21:39:21.571674 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.571322 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" event={"ID":"1b444c97-fd1e-4685-8d79-0790643b0ed3","Type":"ContainerDied","Data":"f25caf9a817685abd8e50867f6bac08b318df7c91c25ccd07ac9f7c9d7aacc45"} Apr 24 21:39:21.571674 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.571293 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8" Apr 24 21:39:21.571674 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.571338 2573 scope.go:117] "RemoveContainer" containerID="14030f48c1dea826e2512a4f5480912e178a38e685d417b36f5c1d6b3dce2052" Apr 24 21:39:21.580073 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.580055 2573 scope.go:117] "RemoveContainer" containerID="913c38bbf50ed902587c65875dfa8f070275721e1a3e83d0fa47a1e45ec82780" Apr 24 21:39:21.587419 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.587399 2573 scope.go:117] "RemoveContainer" containerID="14030f48c1dea826e2512a4f5480912e178a38e685d417b36f5c1d6b3dce2052" Apr 24 21:39:21.587708 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:39:21.587688 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14030f48c1dea826e2512a4f5480912e178a38e685d417b36f5c1d6b3dce2052\": container with ID starting with 14030f48c1dea826e2512a4f5480912e178a38e685d417b36f5c1d6b3dce2052 not found: ID does not exist" containerID="14030f48c1dea826e2512a4f5480912e178a38e685d417b36f5c1d6b3dce2052" Apr 24 21:39:21.587771 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.587718 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14030f48c1dea826e2512a4f5480912e178a38e685d417b36f5c1d6b3dce2052"} err="failed to get container status \"14030f48c1dea826e2512a4f5480912e178a38e685d417b36f5c1d6b3dce2052\": rpc error: code = NotFound desc = could not find container \"14030f48c1dea826e2512a4f5480912e178a38e685d417b36f5c1d6b3dce2052\": container with ID starting with 14030f48c1dea826e2512a4f5480912e178a38e685d417b36f5c1d6b3dce2052 not found: ID does not exist" Apr 24 21:39:21.587771 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.587737 2573 scope.go:117] "RemoveContainer" containerID="913c38bbf50ed902587c65875dfa8f070275721e1a3e83d0fa47a1e45ec82780" Apr 24 
21:39:21.587995 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:39:21.587978 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"913c38bbf50ed902587c65875dfa8f070275721e1a3e83d0fa47a1e45ec82780\": container with ID starting with 913c38bbf50ed902587c65875dfa8f070275721e1a3e83d0fa47a1e45ec82780 not found: ID does not exist" containerID="913c38bbf50ed902587c65875dfa8f070275721e1a3e83d0fa47a1e45ec82780" Apr 24 21:39:21.588034 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.588002 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913c38bbf50ed902587c65875dfa8f070275721e1a3e83d0fa47a1e45ec82780"} err="failed to get container status \"913c38bbf50ed902587c65875dfa8f070275721e1a3e83d0fa47a1e45ec82780\": rpc error: code = NotFound desc = could not find container \"913c38bbf50ed902587c65875dfa8f070275721e1a3e83d0fa47a1e45ec82780\": container with ID starting with 913c38bbf50ed902587c65875dfa8f070275721e1a3e83d0fa47a1e45ec82780 not found: ID does not exist" Apr 24 21:39:21.596392 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.596344 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8"] Apr 24 21:39:21.600966 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:21.600939 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8"] Apr 24 21:39:22.211285 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:22.211251 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" path="/var/lib/kubelet/pods/1b444c97-fd1e-4685-8d79-0790643b0ed3/volumes" Apr 24 21:39:24.960232 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:24.960191 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" 
podUID="7ea2ad55-ea51-4865-888c-7206e04f3c32" containerName="switch-graph-b868d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:25.571163 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:25.571132 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" Apr 24 21:39:25.571773 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:25.571746 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 21:39:29.959773 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:29.959736 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" podUID="7ea2ad55-ea51-4865-888c-7206e04f3c32" containerName="switch-graph-b868d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:29.960241 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:29.959848 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" Apr 24 21:39:34.959654 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:34.959615 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" podUID="7ea2ad55-ea51-4865-888c-7206e04f3c32" containerName="switch-graph-b868d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:35.572397 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:35.572339 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 21:39:39.960183 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:39.960140 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" podUID="7ea2ad55-ea51-4865-888c-7206e04f3c32" containerName="switch-graph-b868d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:44.960017 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:44.959973 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" podUID="7ea2ad55-ea51-4865-888c-7206e04f3c32" containerName="switch-graph-b868d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:45.572514 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:45.572474 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 21:39:47.905209 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:47.905181 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" Apr 24 21:39:48.071966 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:48.071871 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ea2ad55-ea51-4865-888c-7206e04f3c32-proxy-tls\") pod \"7ea2ad55-ea51-4865-888c-7206e04f3c32\" (UID: \"7ea2ad55-ea51-4865-888c-7206e04f3c32\") " Apr 24 21:39:48.071966 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:48.071918 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ea2ad55-ea51-4865-888c-7206e04f3c32-openshift-service-ca-bundle\") pod \"7ea2ad55-ea51-4865-888c-7206e04f3c32\" (UID: \"7ea2ad55-ea51-4865-888c-7206e04f3c32\") " Apr 24 21:39:48.072310 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:48.072277 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ea2ad55-ea51-4865-888c-7206e04f3c32-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "7ea2ad55-ea51-4865-888c-7206e04f3c32" (UID: "7ea2ad55-ea51-4865-888c-7206e04f3c32"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:48.073954 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:48.073933 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea2ad55-ea51-4865-888c-7206e04f3c32-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7ea2ad55-ea51-4865-888c-7206e04f3c32" (UID: "7ea2ad55-ea51-4865-888c-7206e04f3c32"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:39:48.172676 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:48.172633 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ea2ad55-ea51-4865-888c-7206e04f3c32-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:39:48.172676 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:48.172667 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ea2ad55-ea51-4865-888c-7206e04f3c32-openshift-service-ca-bundle\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:39:48.664488 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:48.664450 2573 generic.go:358] "Generic (PLEG): container finished" podID="7ea2ad55-ea51-4865-888c-7206e04f3c32" containerID="9dd4fb3edc0466dd87c779888e2057903a420f0a388e54e0dc79920a2100a706" exitCode=0 Apr 24 21:39:48.664759 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:48.664518 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" event={"ID":"7ea2ad55-ea51-4865-888c-7206e04f3c32","Type":"ContainerDied","Data":"9dd4fb3edc0466dd87c779888e2057903a420f0a388e54e0dc79920a2100a706"} Apr 24 21:39:48.664759 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:48.664520 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" Apr 24 21:39:48.664759 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:48.664544 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7" event={"ID":"7ea2ad55-ea51-4865-888c-7206e04f3c32","Type":"ContainerDied","Data":"7ff9766f40b2077c6b2a2d40fa4d6c8480c1118acce970930ef25b5a2bb06afe"} Apr 24 21:39:48.664759 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:48.664560 2573 scope.go:117] "RemoveContainer" containerID="9dd4fb3edc0466dd87c779888e2057903a420f0a388e54e0dc79920a2100a706" Apr 24 21:39:48.672350 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:48.672332 2573 scope.go:117] "RemoveContainer" containerID="9dd4fb3edc0466dd87c779888e2057903a420f0a388e54e0dc79920a2100a706" Apr 24 21:39:48.672639 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:39:48.672620 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd4fb3edc0466dd87c779888e2057903a420f0a388e54e0dc79920a2100a706\": container with ID starting with 9dd4fb3edc0466dd87c779888e2057903a420f0a388e54e0dc79920a2100a706 not found: ID does not exist" containerID="9dd4fb3edc0466dd87c779888e2057903a420f0a388e54e0dc79920a2100a706" Apr 24 21:39:48.672706 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:48.672649 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd4fb3edc0466dd87c779888e2057903a420f0a388e54e0dc79920a2100a706"} err="failed to get container status \"9dd4fb3edc0466dd87c779888e2057903a420f0a388e54e0dc79920a2100a706\": rpc error: code = NotFound desc = could not find container \"9dd4fb3edc0466dd87c779888e2057903a420f0a388e54e0dc79920a2100a706\": container with ID starting with 9dd4fb3edc0466dd87c779888e2057903a420f0a388e54e0dc79920a2100a706 not found: ID does not exist" Apr 24 21:39:48.680504 ip-10-0-128-21 kubenswrapper[2573]: I0424 
21:39:48.680475 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"] Apr 24 21:39:48.685070 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:48.685042 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7"] Apr 24 21:39:50.210894 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:50.210862 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ea2ad55-ea51-4865-888c-7206e04f3c32" path="/var/lib/kubelet/pods/7ea2ad55-ea51-4865-888c-7206e04f3c32/volumes" Apr 24 21:39:53.763019 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.762985 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"] Apr 24 21:39:53.763435 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.763223 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl" podUID="e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2" containerName="sequence-graph-3a238" containerID="cri-o://d24868ea1eef50138883172b969eb602dbf7e1f6017e560cd6ec93f51b413ee3" gracePeriod=30 Apr 24 21:39:53.887768 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.887735 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z"] Apr 24 21:39:53.888061 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.888014 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kserve-container" containerID="cri-o://7fcdf122f05c10aed6b9d548c891b2701d1993cb6a0d6ff8cf2539bae6c20a62" gracePeriod=30 Apr 24 21:39:53.888159 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.888059 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kube-rbac-proxy" containerID="cri-o://c79a62804e069e7da724c64913dd11187db25a282e5875d3db64d52b4feeb8a8" gracePeriod=30 Apr 24 21:39:53.935038 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.935002 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk"] Apr 24 21:39:53.935333 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.935322 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ea2ad55-ea51-4865-888c-7206e04f3c32" containerName="switch-graph-b868d" Apr 24 21:39:53.935395 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.935335 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea2ad55-ea51-4865-888c-7206e04f3c32" containerName="switch-graph-b868d" Apr 24 21:39:53.935395 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.935367 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kserve-container" Apr 24 21:39:53.935395 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.935375 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kserve-container" Apr 24 21:39:53.935395 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.935391 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kube-rbac-proxy" Apr 24 21:39:53.935518 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.935399 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kube-rbac-proxy" Apr 24 21:39:53.935518 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.935454 2573 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="7ea2ad55-ea51-4865-888c-7206e04f3c32" containerName="switch-graph-b868d" Apr 24 21:39:53.935518 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.935463 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kube-rbac-proxy" Apr 24 21:39:53.935518 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.935473 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b444c97-fd1e-4685-8d79-0790643b0ed3" containerName="kserve-container" Apr 24 21:39:53.940053 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.940035 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" Apr 24 21:39:53.942367 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.942335 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-c1a35-predictor-serving-cert\"" Apr 24 21:39:53.942470 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.942452 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-c1a35-kube-rbac-proxy-sar-config\"" Apr 24 21:39:53.948213 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:53.948182 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk"] Apr 24 21:39:54.130490 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:54.130400 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2gjl\" (UniqueName: \"kubernetes.io/projected/c2b3f2bd-2603-4bff-8cdc-d7717704beab-kube-api-access-r2gjl\") pod \"success-200-isvc-c1a35-predictor-869c6689b7-jfwbk\" (UID: \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\") " pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" Apr 24 21:39:54.130490 ip-10-0-128-21 kubenswrapper[2573]: I0424 
21:39:54.130456 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-c1a35-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2b3f2bd-2603-4bff-8cdc-d7717704beab-success-200-isvc-c1a35-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c1a35-predictor-869c6689b7-jfwbk\" (UID: \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\") " pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" Apr 24 21:39:54.130681 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:54.130527 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2b3f2bd-2603-4bff-8cdc-d7717704beab-proxy-tls\") pod \"success-200-isvc-c1a35-predictor-869c6689b7-jfwbk\" (UID: \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\") " pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" Apr 24 21:39:54.231288 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:54.231252 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2gjl\" (UniqueName: \"kubernetes.io/projected/c2b3f2bd-2603-4bff-8cdc-d7717704beab-kube-api-access-r2gjl\") pod \"success-200-isvc-c1a35-predictor-869c6689b7-jfwbk\" (UID: \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\") " pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" Apr 24 21:39:54.231480 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:54.231298 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-c1a35-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2b3f2bd-2603-4bff-8cdc-d7717704beab-success-200-isvc-c1a35-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c1a35-predictor-869c6689b7-jfwbk\" (UID: \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\") " pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" Apr 24 21:39:54.231480 ip-10-0-128-21 
kubenswrapper[2573]: I0424 21:39:54.231345 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2b3f2bd-2603-4bff-8cdc-d7717704beab-proxy-tls\") pod \"success-200-isvc-c1a35-predictor-869c6689b7-jfwbk\" (UID: \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\") " pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" Apr 24 21:39:54.231578 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:39:54.231478 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-serving-cert: secret "success-200-isvc-c1a35-predictor-serving-cert" not found Apr 24 21:39:54.231578 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:39:54.231541 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2b3f2bd-2603-4bff-8cdc-d7717704beab-proxy-tls podName:c2b3f2bd-2603-4bff-8cdc-d7717704beab nodeName:}" failed. No retries permitted until 2026-04-24 21:39:54.731521487 +0000 UTC m=+1417.137362916 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c2b3f2bd-2603-4bff-8cdc-d7717704beab-proxy-tls") pod "success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" (UID: "c2b3f2bd-2603-4bff-8cdc-d7717704beab") : secret "success-200-isvc-c1a35-predictor-serving-cert" not found Apr 24 21:39:54.232037 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:54.232016 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-c1a35-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2b3f2bd-2603-4bff-8cdc-d7717704beab-success-200-isvc-c1a35-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c1a35-predictor-869c6689b7-jfwbk\" (UID: \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\") " pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" Apr 24 21:39:54.249477 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:54.249447 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2gjl\" (UniqueName: \"kubernetes.io/projected/c2b3f2bd-2603-4bff-8cdc-d7717704beab-kube-api-access-r2gjl\") pod \"success-200-isvc-c1a35-predictor-869c6689b7-jfwbk\" (UID: \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\") " pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" Apr 24 21:39:54.688246 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:54.688201 2573 generic.go:358] "Generic (PLEG): container finished" podID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerID="c79a62804e069e7da724c64913dd11187db25a282e5875d3db64d52b4feeb8a8" exitCode=2 Apr 24 21:39:54.688426 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:54.688269 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" event={"ID":"ef252636-56e1-4ba4-8b22-135c16a6121b","Type":"ContainerDied","Data":"c79a62804e069e7da724c64913dd11187db25a282e5875d3db64d52b4feeb8a8"} Apr 24 21:39:54.735042 ip-10-0-128-21 kubenswrapper[2573]: I0424 
21:39:54.735012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2b3f2bd-2603-4bff-8cdc-d7717704beab-proxy-tls\") pod \"success-200-isvc-c1a35-predictor-869c6689b7-jfwbk\" (UID: \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\") " pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk"
Apr 24 21:39:54.737494 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:54.737468 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2b3f2bd-2603-4bff-8cdc-d7717704beab-proxy-tls\") pod \"success-200-isvc-c1a35-predictor-869c6689b7-jfwbk\" (UID: \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\") " pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk"
Apr 24 21:39:54.852911 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:54.852871 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk"
Apr 24 21:39:54.989567 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:54.989540 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk"]
Apr 24 21:39:54.992281 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:39:54.992253 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2b3f2bd_2603_4bff_8cdc_d7717704beab.slice/crio-dcaeb827984f48aa4f2da22f755e4340004c29b1704cde5150bda9427611c25c WatchSource:0}: Error finding container dcaeb827984f48aa4f2da22f755e4340004c29b1704cde5150bda9427611c25c: Status 404 returned error can't find the container with id dcaeb827984f48aa4f2da22f755e4340004c29b1704cde5150bda9427611c25c
Apr 24 21:39:55.572069 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:55.572026 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused"
Apr 24 21:39:55.693865 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:55.693823 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" event={"ID":"c2b3f2bd-2603-4bff-8cdc-d7717704beab","Type":"ContainerStarted","Data":"159b2494b78dfcd6bd5a312ceef053b9d2406023e2a08341b12f4513de8ea351"}
Apr 24 21:39:55.694045 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:55.693874 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" event={"ID":"c2b3f2bd-2603-4bff-8cdc-d7717704beab","Type":"ContainerStarted","Data":"3ff15ead7c34246d126d5d991c274c29451235d672dce97533cda03b84d4c269"}
Apr 24 21:39:55.694045 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:55.693889 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" event={"ID":"c2b3f2bd-2603-4bff-8cdc-d7717704beab","Type":"ContainerStarted","Data":"dcaeb827984f48aa4f2da22f755e4340004c29b1704cde5150bda9427611c25c"}
Apr 24 21:39:55.694045 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:55.693909 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk"
Apr 24 21:39:55.717689 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:55.717638 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" podStartSLOduration=2.717623346 podStartE2EDuration="2.717623346s" podCreationTimestamp="2026-04-24 21:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:39:55.71523094 +0000 UTC m=+1418.121072391" watchObservedRunningTime="2026-04-24 21:39:55.717623346 +0000 UTC m=+1418.123464875"
Apr 24 21:39:56.698027 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:56.697986 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk"
Apr 24 21:39:56.699457 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:56.699426 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 24 21:39:56.880342 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:56.880302 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.32:8643/healthz\": dial tcp 10.132.0.32:8643: connect: connection refused"
Apr 24 21:39:56.884651 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:56.884619 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 24 21:39:57.082995 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.082957 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl" podUID="e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2" containerName="sequence-graph-3a238" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:39:57.539898 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.539874 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z"
Apr 24 21:39:57.658915 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.658823 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef252636-56e1-4ba4-8b22-135c16a6121b-proxy-tls\") pod \"ef252636-56e1-4ba4-8b22-135c16a6121b\" (UID: \"ef252636-56e1-4ba4-8b22-135c16a6121b\") "
Apr 24 21:39:57.658915 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.658883 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-3a238-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ef252636-56e1-4ba4-8b22-135c16a6121b-success-200-isvc-3a238-kube-rbac-proxy-sar-config\") pod \"ef252636-56e1-4ba4-8b22-135c16a6121b\" (UID: \"ef252636-56e1-4ba4-8b22-135c16a6121b\") "
Apr 24 21:39:57.659109 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.658955 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nwtm\" (UniqueName: \"kubernetes.io/projected/ef252636-56e1-4ba4-8b22-135c16a6121b-kube-api-access-2nwtm\") pod \"ef252636-56e1-4ba4-8b22-135c16a6121b\" (UID: \"ef252636-56e1-4ba4-8b22-135c16a6121b\") "
Apr 24 21:39:57.659289 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.659262 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef252636-56e1-4ba4-8b22-135c16a6121b-success-200-isvc-3a238-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-3a238-kube-rbac-proxy-sar-config") pod "ef252636-56e1-4ba4-8b22-135c16a6121b" (UID: "ef252636-56e1-4ba4-8b22-135c16a6121b"). InnerVolumeSpecName "success-200-isvc-3a238-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:39:57.661031 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.661005 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef252636-56e1-4ba4-8b22-135c16a6121b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ef252636-56e1-4ba4-8b22-135c16a6121b" (UID: "ef252636-56e1-4ba4-8b22-135c16a6121b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:39:57.661235 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.661212 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef252636-56e1-4ba4-8b22-135c16a6121b-kube-api-access-2nwtm" (OuterVolumeSpecName: "kube-api-access-2nwtm") pod "ef252636-56e1-4ba4-8b22-135c16a6121b" (UID: "ef252636-56e1-4ba4-8b22-135c16a6121b"). InnerVolumeSpecName "kube-api-access-2nwtm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:39:57.702738 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.702706 2573 generic.go:358] "Generic (PLEG): container finished" podID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerID="7fcdf122f05c10aed6b9d548c891b2701d1993cb6a0d6ff8cf2539bae6c20a62" exitCode=0
Apr 24 21:39:57.703133 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.702777 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z"
Apr 24 21:39:57.703133 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.702792 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" event={"ID":"ef252636-56e1-4ba4-8b22-135c16a6121b","Type":"ContainerDied","Data":"7fcdf122f05c10aed6b9d548c891b2701d1993cb6a0d6ff8cf2539bae6c20a62"}
Apr 24 21:39:57.703133 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.702838 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z" event={"ID":"ef252636-56e1-4ba4-8b22-135c16a6121b","Type":"ContainerDied","Data":"a87eb842c1f3e75e7f06f819bba9685e46f9228f5482147d080a14c410bc0879"}
Apr 24 21:39:57.703133 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.702855 2573 scope.go:117] "RemoveContainer" containerID="c79a62804e069e7da724c64913dd11187db25a282e5875d3db64d52b4feeb8a8"
Apr 24 21:39:57.703484 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.703458 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 24 21:39:57.712290 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.712268 2573 scope.go:117] "RemoveContainer" containerID="7fcdf122f05c10aed6b9d548c891b2701d1993cb6a0d6ff8cf2539bae6c20a62"
Apr 24 21:39:57.719635 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.719609 2573 scope.go:117] "RemoveContainer" containerID="c79a62804e069e7da724c64913dd11187db25a282e5875d3db64d52b4feeb8a8"
Apr 24 21:39:57.719892 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:39:57.719869 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c79a62804e069e7da724c64913dd11187db25a282e5875d3db64d52b4feeb8a8\": container with ID starting with c79a62804e069e7da724c64913dd11187db25a282e5875d3db64d52b4feeb8a8 not found: ID does not exist" containerID="c79a62804e069e7da724c64913dd11187db25a282e5875d3db64d52b4feeb8a8"
Apr 24 21:39:57.719977 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.719906 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c79a62804e069e7da724c64913dd11187db25a282e5875d3db64d52b4feeb8a8"} err="failed to get container status \"c79a62804e069e7da724c64913dd11187db25a282e5875d3db64d52b4feeb8a8\": rpc error: code = NotFound desc = could not find container \"c79a62804e069e7da724c64913dd11187db25a282e5875d3db64d52b4feeb8a8\": container with ID starting with c79a62804e069e7da724c64913dd11187db25a282e5875d3db64d52b4feeb8a8 not found: ID does not exist"
Apr 24 21:39:57.719977 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.719929 2573 scope.go:117] "RemoveContainer" containerID="7fcdf122f05c10aed6b9d548c891b2701d1993cb6a0d6ff8cf2539bae6c20a62"
Apr 24 21:39:57.720160 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:39:57.720139 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fcdf122f05c10aed6b9d548c891b2701d1993cb6a0d6ff8cf2539bae6c20a62\": container with ID starting with 7fcdf122f05c10aed6b9d548c891b2701d1993cb6a0d6ff8cf2539bae6c20a62 not found: ID does not exist" containerID="7fcdf122f05c10aed6b9d548c891b2701d1993cb6a0d6ff8cf2539bae6c20a62"
Apr 24 21:39:57.720196 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.720167 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fcdf122f05c10aed6b9d548c891b2701d1993cb6a0d6ff8cf2539bae6c20a62"} err="failed to get container status \"7fcdf122f05c10aed6b9d548c891b2701d1993cb6a0d6ff8cf2539bae6c20a62\": rpc error: code = NotFound desc = could not find container \"7fcdf122f05c10aed6b9d548c891b2701d1993cb6a0d6ff8cf2539bae6c20a62\": container with ID starting with 7fcdf122f05c10aed6b9d548c891b2701d1993cb6a0d6ff8cf2539bae6c20a62 not found: ID does not exist"
Apr 24 21:39:57.724244 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.724223 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z"]
Apr 24 21:39:57.728218 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.728197 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z"]
Apr 24 21:39:57.759850 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.759809 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2nwtm\" (UniqueName: \"kubernetes.io/projected/ef252636-56e1-4ba4-8b22-135c16a6121b-kube-api-access-2nwtm\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:39:57.759850 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.759847 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef252636-56e1-4ba4-8b22-135c16a6121b-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:39:57.759850 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:57.759861 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-3a238-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ef252636-56e1-4ba4-8b22-135c16a6121b-success-200-isvc-3a238-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:39:58.210686 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:39:58.210648 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" path="/var/lib/kubelet/pods/ef252636-56e1-4ba4-8b22-135c16a6121b/volumes"
Apr 24 21:40:02.083757 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:02.083716 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl" podUID="e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2" containerName="sequence-graph-3a238" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:02.708128 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:02.708101 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk"
Apr 24 21:40:02.708714 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:02.708685 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 24 21:40:05.572885 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:05.572803 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"
Apr 24 21:40:07.083172 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:07.083128 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl" podUID="e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2" containerName="sequence-graph-3a238" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:07.083596 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:07.083243 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"
Apr 24 21:40:12.082924 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:12.082876 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl" podUID="e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2" containerName="sequence-graph-3a238" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:12.708757 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:12.708711 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 24 21:40:17.083562 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:17.083517 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl" podUID="e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2" containerName="sequence-graph-3a238" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:17.988862 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:17.988828 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"]
Apr 24 21:40:17.989279 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:17.989260 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kserve-container"
Apr 24 21:40:17.989382 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:17.989282 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kserve-container"
Apr 24 21:40:17.989382 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:17.989307 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kube-rbac-proxy"
Apr 24 21:40:17.989382 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:17.989316 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kube-rbac-proxy"
Apr 24 21:40:17.989534 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:17.989430 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kserve-container"
Apr 24 21:40:17.989534 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:17.989446 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef252636-56e1-4ba4-8b22-135c16a6121b" containerName="kube-rbac-proxy"
Apr 24 21:40:17.992464 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:17.992443 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"
Apr 24 21:40:17.994947 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:17.994924 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-f3231-serving-cert\""
Apr 24 21:40:17.995090 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:17.994945 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-f3231-kube-rbac-proxy-sar-config\""
Apr 24 21:40:18.001525 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:18.001498 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"]
Apr 24 21:40:18.017695 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:18.017661 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf598abc-362a-4194-9c7c-8f53960c1ec8-openshift-service-ca-bundle\") pod \"ensemble-graph-f3231-7b5968d9d6-rbjc2\" (UID: \"bf598abc-362a-4194-9c7c-8f53960c1ec8\") " pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"
Apr 24 21:40:18.017856 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:18.017722 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf598abc-362a-4194-9c7c-8f53960c1ec8-proxy-tls\") pod \"ensemble-graph-f3231-7b5968d9d6-rbjc2\" (UID: \"bf598abc-362a-4194-9c7c-8f53960c1ec8\") " pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"
Apr 24 21:40:18.118319 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:18.118285 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf598abc-362a-4194-9c7c-8f53960c1ec8-openshift-service-ca-bundle\") pod \"ensemble-graph-f3231-7b5968d9d6-rbjc2\" (UID: \"bf598abc-362a-4194-9c7c-8f53960c1ec8\") " pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"
Apr 24 21:40:18.118683 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:18.118335 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf598abc-362a-4194-9c7c-8f53960c1ec8-proxy-tls\") pod \"ensemble-graph-f3231-7b5968d9d6-rbjc2\" (UID: \"bf598abc-362a-4194-9c7c-8f53960c1ec8\") " pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"
Apr 24 21:40:18.118937 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:18.118915 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf598abc-362a-4194-9c7c-8f53960c1ec8-openshift-service-ca-bundle\") pod \"ensemble-graph-f3231-7b5968d9d6-rbjc2\" (UID: \"bf598abc-362a-4194-9c7c-8f53960c1ec8\") " pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"
Apr 24 21:40:18.120798 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:18.120781 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-f3231-serving-cert\""
Apr 24 21:40:18.128938 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:40:18.128914 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-f3231-serving-cert: secret "ensemble-graph-f3231-serving-cert" not found
Apr 24 21:40:18.129061 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:40:18.128980 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf598abc-362a-4194-9c7c-8f53960c1ec8-proxy-tls podName:bf598abc-362a-4194-9c7c-8f53960c1ec8 nodeName:}" failed. No retries permitted until 2026-04-24 21:40:18.628961246 +0000 UTC m=+1441.034802686 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/bf598abc-362a-4194-9c7c-8f53960c1ec8-proxy-tls") pod "ensemble-graph-f3231-7b5968d9d6-rbjc2" (UID: "bf598abc-362a-4194-9c7c-8f53960c1ec8") : secret "ensemble-graph-f3231-serving-cert" not found
Apr 24 21:40:18.723750 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:18.723697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf598abc-362a-4194-9c7c-8f53960c1ec8-proxy-tls\") pod \"ensemble-graph-f3231-7b5968d9d6-rbjc2\" (UID: \"bf598abc-362a-4194-9c7c-8f53960c1ec8\") " pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"
Apr 24 21:40:18.726190 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:18.726158 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf598abc-362a-4194-9c7c-8f53960c1ec8-proxy-tls\") pod \"ensemble-graph-f3231-7b5968d9d6-rbjc2\" (UID: \"bf598abc-362a-4194-9c7c-8f53960c1ec8\") " pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"
Apr 24 21:40:18.903198 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:18.903162 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"
Apr 24 21:40:19.023608 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:19.023567 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"]
Apr 24 21:40:19.026797 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:40:19.026763 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf598abc_362a_4194_9c7c_8f53960c1ec8.slice/crio-7420f10fda74d3df872636d392508d14c9d25c0746fb0fc2baf67f01382978a6 WatchSource:0}: Error finding container 7420f10fda74d3df872636d392508d14c9d25c0746fb0fc2baf67f01382978a6: Status 404 returned error can't find the container with id 7420f10fda74d3df872636d392508d14c9d25c0746fb0fc2baf67f01382978a6
Apr 24 21:40:19.774576 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:19.774537 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2" event={"ID":"bf598abc-362a-4194-9c7c-8f53960c1ec8","Type":"ContainerStarted","Data":"7512aafe4c8f0a0eebfa9c64c05ed2c35e8680b6ab4e22d451dd42b4cc15631f"}
Apr 24 21:40:19.774576 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:19.774574 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2" event={"ID":"bf598abc-362a-4194-9c7c-8f53960c1ec8","Type":"ContainerStarted","Data":"7420f10fda74d3df872636d392508d14c9d25c0746fb0fc2baf67f01382978a6"}
Apr 24 21:40:19.775169 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:19.774669 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"
Apr 24 21:40:19.792069 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:19.792018 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2" podStartSLOduration=2.7920031119999997 podStartE2EDuration="2.792003112s" podCreationTimestamp="2026-04-24 21:40:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:40:19.79075366 +0000 UTC m=+1442.196595125" watchObservedRunningTime="2026-04-24 21:40:19.792003112 +0000 UTC m=+1442.197844564"
Apr 24 21:40:22.083463 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:22.083422 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl" podUID="e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2" containerName="sequence-graph-3a238" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:22.709262 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:22.709220 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 24 21:40:23.787535 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:23.787500 2573 generic.go:358] "Generic (PLEG): container finished" podID="e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2" containerID="d24868ea1eef50138883172b969eb602dbf7e1f6017e560cd6ec93f51b413ee3" exitCode=0
Apr 24 21:40:23.787991 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:23.787576 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl" event={"ID":"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2","Type":"ContainerDied","Data":"d24868ea1eef50138883172b969eb602dbf7e1f6017e560cd6ec93f51b413ee3"}
Apr 24 21:40:23.908596 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:23.908568 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"
Apr 24 21:40:23.967258 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:23.967226 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-proxy-tls\") pod \"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2\" (UID: \"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2\") "
Apr 24 21:40:23.967447 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:23.967328 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-openshift-service-ca-bundle\") pod \"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2\" (UID: \"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2\") "
Apr 24 21:40:23.967701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:23.967680 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2" (UID: "e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:40:23.969317 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:23.969297 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2" (UID: "e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:40:24.068959 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:24.068877 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-openshift-service-ca-bundle\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:40:24.068959 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:24.068910 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:40:24.791815 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:24.791779 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl" event={"ID":"e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2","Type":"ContainerDied","Data":"f729af3cc5f34ab5dbf7babe591f247fc5687a263f2ac0b7dc21e754c69623b2"}
Apr 24 21:40:24.791815 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:24.791824 2573 scope.go:117] "RemoveContainer" containerID="d24868ea1eef50138883172b969eb602dbf7e1f6017e560cd6ec93f51b413ee3"
Apr 24 21:40:24.792349 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:24.791831 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"
Apr 24 21:40:24.810299 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:24.810256 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"]
Apr 24 21:40:24.814209 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:24.814174 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl"]
Apr 24 21:40:25.782842 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:25.782810 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"
Apr 24 21:40:26.211460 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:26.211381 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2" path="/var/lib/kubelet/pods/e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2/volumes"
Apr 24 21:40:28.044900 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.044817 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"]
Apr 24 21:40:28.045289 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.045029 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2" podUID="bf598abc-362a-4194-9c7c-8f53960c1ec8" containerName="ensemble-graph-f3231" containerID="cri-o://7512aafe4c8f0a0eebfa9c64c05ed2c35e8680b6ab4e22d451dd42b4cc15631f" gracePeriod=30
Apr 24 21:40:28.152768 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.152731 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"]
Apr 24 21:40:28.153275 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.153077 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kserve-container" containerID="cri-o://f9f0dd289a065cb30f370471118946370842152e0348d92eefbbabe78be638c4" gracePeriod=30
Apr 24 21:40:28.153275 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.153142 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kube-rbac-proxy" containerID="cri-o://2683894bd8e01a2c077438fcc1cf66f0f6de270736eb9cc888fced74fe322b1c" gracePeriod=30
Apr 24 21:40:28.194221 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.194174 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"]
Apr 24 21:40:28.194607 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.194589 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2" containerName="sequence-graph-3a238"
Apr 24 21:40:28.194607 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.194607 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2" containerName="sequence-graph-3a238"
Apr 24 21:40:28.194768 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.194670 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8bfc6b6-00e2-4dd7-9df2-44f98dbb24d2" containerName="sequence-graph-3a238"
Apr 24 21:40:28.199283 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.199256 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"
Apr 24 21:40:28.202200 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.202174 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-75b0d-predictor-serving-cert\""
Apr 24 21:40:28.202316 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.202261 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-75b0d-kube-rbac-proxy-sar-config\""
Apr 24 21:40:28.221267 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.221236 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"]
Apr 24 21:40:28.306799 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.306704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-75b0d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d6580f8e-1f40-4415-b2d8-36099f4708a8-success-200-isvc-75b0d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-75b0d-predictor-9549449fc-92fhr\" (UID: \"d6580f8e-1f40-4415-b2d8-36099f4708a8\") " pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"
Apr 24 21:40:28.306799 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.306754 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gngqj\" (UniqueName: \"kubernetes.io/projected/d6580f8e-1f40-4415-b2d8-36099f4708a8-kube-api-access-gngqj\") pod \"success-200-isvc-75b0d-predictor-9549449fc-92fhr\" (UID: \"d6580f8e-1f40-4415-b2d8-36099f4708a8\") " pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"
Apr 24 21:40:28.307026 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.306882 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6580f8e-1f40-4415-b2d8-36099f4708a8-proxy-tls\") pod \"success-200-isvc-75b0d-predictor-9549449fc-92fhr\" (UID: \"d6580f8e-1f40-4415-b2d8-36099f4708a8\") " pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"
Apr 24 21:40:28.407311 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.407278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-75b0d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d6580f8e-1f40-4415-b2d8-36099f4708a8-success-200-isvc-75b0d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-75b0d-predictor-9549449fc-92fhr\" (UID: \"d6580f8e-1f40-4415-b2d8-36099f4708a8\") " pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"
Apr 24 21:40:28.407528 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.407320 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gngqj\" (UniqueName: \"kubernetes.io/projected/d6580f8e-1f40-4415-b2d8-36099f4708a8-kube-api-access-gngqj\") pod \"success-200-isvc-75b0d-predictor-9549449fc-92fhr\" (UID: \"d6580f8e-1f40-4415-b2d8-36099f4708a8\") " pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"
Apr 24 21:40:28.407528 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.407398 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6580f8e-1f40-4415-b2d8-36099f4708a8-proxy-tls\") pod \"success-200-isvc-75b0d-predictor-9549449fc-92fhr\" (UID: \"d6580f8e-1f40-4415-b2d8-36099f4708a8\") " pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"
Apr 24 21:40:28.407528 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:40:28.407516 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-serving-cert: secret "success-200-isvc-75b0d-predictor-serving-cert" not found
Apr 24 21:40:28.407761 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:40:28.407611 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6580f8e-1f40-4415-b2d8-36099f4708a8-proxy-tls podName:d6580f8e-1f40-4415-b2d8-36099f4708a8 nodeName:}" failed. No retries permitted until 2026-04-24 21:40:28.907588005 +0000 UTC m=+1451.313429436 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d6580f8e-1f40-4415-b2d8-36099f4708a8-proxy-tls") pod "success-200-isvc-75b0d-predictor-9549449fc-92fhr" (UID: "d6580f8e-1f40-4415-b2d8-36099f4708a8") : secret "success-200-isvc-75b0d-predictor-serving-cert" not found
Apr 24 21:40:28.408077 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.408053 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-75b0d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d6580f8e-1f40-4415-b2d8-36099f4708a8-success-200-isvc-75b0d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-75b0d-predictor-9549449fc-92fhr\" (UID: \"d6580f8e-1f40-4415-b2d8-36099f4708a8\") " pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"
Apr 24 21:40:28.416941 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.416911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gngqj\" (UniqueName: \"kubernetes.io/projected/d6580f8e-1f40-4415-b2d8-36099f4708a8-kube-api-access-gngqj\") pod \"success-200-isvc-75b0d-predictor-9549449fc-92fhr\" (UID: \"d6580f8e-1f40-4415-b2d8-36099f4708a8\") " pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"
Apr 24 21:40:28.807103 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.807069 2573 generic.go:358] "Generic (PLEG): container finished" podID="13999f24-2195-46ee-a8de-d2687ebef412" containerID="2683894bd8e01a2c077438fcc1cf66f0f6de270736eb9cc888fced74fe322b1c" exitCode=2
Apr 24 21:40:28.807290 ip-10-0-128-21 kubenswrapper[2573]: I0424
21:40:28.807138 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" event={"ID":"13999f24-2195-46ee-a8de-d2687ebef412","Type":"ContainerDied","Data":"2683894bd8e01a2c077438fcc1cf66f0f6de270736eb9cc888fced74fe322b1c"} Apr 24 21:40:28.911267 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.911227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6580f8e-1f40-4415-b2d8-36099f4708a8-proxy-tls\") pod \"success-200-isvc-75b0d-predictor-9549449fc-92fhr\" (UID: \"d6580f8e-1f40-4415-b2d8-36099f4708a8\") " pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" Apr 24 21:40:28.913802 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:28.913772 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6580f8e-1f40-4415-b2d8-36099f4708a8-proxy-tls\") pod \"success-200-isvc-75b0d-predictor-9549449fc-92fhr\" (UID: \"d6580f8e-1f40-4415-b2d8-36099f4708a8\") " pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" Apr 24 21:40:29.117320 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:29.117218 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" Apr 24 21:40:29.247717 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:29.247685 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"] Apr 24 21:40:29.250975 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:40:29.250942 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6580f8e_1f40_4415_b2d8_36099f4708a8.slice/crio-af1bcc6726172dbdf72f9fc3dd023e50ededc1ed44b981d6d34e29c44bff4b1d WatchSource:0}: Error finding container af1bcc6726172dbdf72f9fc3dd023e50ededc1ed44b981d6d34e29c44bff4b1d: Status 404 returned error can't find the container with id af1bcc6726172dbdf72f9fc3dd023e50ededc1ed44b981d6d34e29c44bff4b1d Apr 24 21:40:29.811326 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:29.811283 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" event={"ID":"d6580f8e-1f40-4415-b2d8-36099f4708a8","Type":"ContainerStarted","Data":"d8c9cd7763d391ddeeafb5b2319dac2623b94b1e9f0dac8aa6aca25fc09ba563"} Apr 24 21:40:29.811326 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:29.811329 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" event={"ID":"d6580f8e-1f40-4415-b2d8-36099f4708a8","Type":"ContainerStarted","Data":"2b827d498b471e09f2325cad27c991615635610019a64885b86ba82958eb52c8"} Apr 24 21:40:29.811566 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:29.811343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" event={"ID":"d6580f8e-1f40-4415-b2d8-36099f4708a8","Type":"ContainerStarted","Data":"af1bcc6726172dbdf72f9fc3dd023e50ededc1ed44b981d6d34e29c44bff4b1d"} Apr 24 21:40:29.811566 ip-10-0-128-21 
kubenswrapper[2573]: I0424 21:40:29.811483 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" Apr 24 21:40:29.811661 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:29.811618 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" Apr 24 21:40:29.812824 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:29.812801 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 21:40:29.828830 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:29.828782 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" podStartSLOduration=1.828768818 podStartE2EDuration="1.828768818s" podCreationTimestamp="2026-04-24 21:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:40:29.828083657 +0000 UTC m=+1452.233925109" watchObservedRunningTime="2026-04-24 21:40:29.828768818 +0000 UTC m=+1452.234610268" Apr 24 21:40:30.567571 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:30.567525 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.35:8643/healthz\": dial tcp 10.132.0.35:8643: connect: connection refused" Apr 24 21:40:30.781585 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:30.781540 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2" podUID="bf598abc-362a-4194-9c7c-8f53960c1ec8" containerName="ensemble-graph-f3231" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:30.814887 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:30.814848 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 21:40:31.506201 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.506175 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" Apr 24 21:40:31.635707 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.635614 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13999f24-2195-46ee-a8de-d2687ebef412-proxy-tls\") pod \"13999f24-2195-46ee-a8de-d2687ebef412\" (UID: \"13999f24-2195-46ee-a8de-d2687ebef412\") " Apr 24 21:40:31.635707 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.635686 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9zlv\" (UniqueName: \"kubernetes.io/projected/13999f24-2195-46ee-a8de-d2687ebef412-kube-api-access-k9zlv\") pod \"13999f24-2195-46ee-a8de-d2687ebef412\" (UID: \"13999f24-2195-46ee-a8de-d2687ebef412\") " Apr 24 21:40:31.636186 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.635786 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-f3231-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13999f24-2195-46ee-a8de-d2687ebef412-success-200-isvc-f3231-kube-rbac-proxy-sar-config\") pod \"13999f24-2195-46ee-a8de-d2687ebef412\" (UID: 
\"13999f24-2195-46ee-a8de-d2687ebef412\") " Apr 24 21:40:31.636186 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.636139 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13999f24-2195-46ee-a8de-d2687ebef412-success-200-isvc-f3231-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-f3231-kube-rbac-proxy-sar-config") pod "13999f24-2195-46ee-a8de-d2687ebef412" (UID: "13999f24-2195-46ee-a8de-d2687ebef412"). InnerVolumeSpecName "success-200-isvc-f3231-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:40:31.637801 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.637777 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13999f24-2195-46ee-a8de-d2687ebef412-kube-api-access-k9zlv" (OuterVolumeSpecName: "kube-api-access-k9zlv") pod "13999f24-2195-46ee-a8de-d2687ebef412" (UID: "13999f24-2195-46ee-a8de-d2687ebef412"). InnerVolumeSpecName "kube-api-access-k9zlv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:40:31.637801 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.637787 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13999f24-2195-46ee-a8de-d2687ebef412-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "13999f24-2195-46ee-a8de-d2687ebef412" (UID: "13999f24-2195-46ee-a8de-d2687ebef412"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:40:31.737411 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.737373 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13999f24-2195-46ee-a8de-d2687ebef412-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:40:31.737411 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.737410 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k9zlv\" (UniqueName: \"kubernetes.io/projected/13999f24-2195-46ee-a8de-d2687ebef412-kube-api-access-k9zlv\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:40:31.737614 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.737425 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-f3231-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13999f24-2195-46ee-a8de-d2687ebef412-success-200-isvc-f3231-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:40:31.818557 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.818517 2573 generic.go:358] "Generic (PLEG): container finished" podID="13999f24-2195-46ee-a8de-d2687ebef412" containerID="f9f0dd289a065cb30f370471118946370842152e0348d92eefbbabe78be638c4" exitCode=0 Apr 24 21:40:31.818754 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.818588 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" event={"ID":"13999f24-2195-46ee-a8de-d2687ebef412","Type":"ContainerDied","Data":"f9f0dd289a065cb30f370471118946370842152e0348d92eefbbabe78be638c4"} Apr 24 21:40:31.818754 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.818597 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" Apr 24 21:40:31.818754 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.818622 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw" event={"ID":"13999f24-2195-46ee-a8de-d2687ebef412","Type":"ContainerDied","Data":"c7f1ce9675db0311288b3bb62c9f962cf35b5a6da5d438411841b11bddc99dc9"} Apr 24 21:40:31.818754 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.818642 2573 scope.go:117] "RemoveContainer" containerID="2683894bd8e01a2c077438fcc1cf66f0f6de270736eb9cc888fced74fe322b1c" Apr 24 21:40:31.829232 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.829208 2573 scope.go:117] "RemoveContainer" containerID="f9f0dd289a065cb30f370471118946370842152e0348d92eefbbabe78be638c4" Apr 24 21:40:31.837092 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.837073 2573 scope.go:117] "RemoveContainer" containerID="2683894bd8e01a2c077438fcc1cf66f0f6de270736eb9cc888fced74fe322b1c" Apr 24 21:40:31.837365 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:40:31.837331 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2683894bd8e01a2c077438fcc1cf66f0f6de270736eb9cc888fced74fe322b1c\": container with ID starting with 2683894bd8e01a2c077438fcc1cf66f0f6de270736eb9cc888fced74fe322b1c not found: ID does not exist" containerID="2683894bd8e01a2c077438fcc1cf66f0f6de270736eb9cc888fced74fe322b1c" Apr 24 21:40:31.837452 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.837381 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2683894bd8e01a2c077438fcc1cf66f0f6de270736eb9cc888fced74fe322b1c"} err="failed to get container status \"2683894bd8e01a2c077438fcc1cf66f0f6de270736eb9cc888fced74fe322b1c\": rpc error: code = NotFound desc = could not find container 
\"2683894bd8e01a2c077438fcc1cf66f0f6de270736eb9cc888fced74fe322b1c\": container with ID starting with 2683894bd8e01a2c077438fcc1cf66f0f6de270736eb9cc888fced74fe322b1c not found: ID does not exist" Apr 24 21:40:31.837452 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.837408 2573 scope.go:117] "RemoveContainer" containerID="f9f0dd289a065cb30f370471118946370842152e0348d92eefbbabe78be638c4" Apr 24 21:40:31.837669 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:40:31.837651 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9f0dd289a065cb30f370471118946370842152e0348d92eefbbabe78be638c4\": container with ID starting with f9f0dd289a065cb30f370471118946370842152e0348d92eefbbabe78be638c4 not found: ID does not exist" containerID="f9f0dd289a065cb30f370471118946370842152e0348d92eefbbabe78be638c4" Apr 24 21:40:31.837725 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.837678 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9f0dd289a065cb30f370471118946370842152e0348d92eefbbabe78be638c4"} err="failed to get container status \"f9f0dd289a065cb30f370471118946370842152e0348d92eefbbabe78be638c4\": rpc error: code = NotFound desc = could not find container \"f9f0dd289a065cb30f370471118946370842152e0348d92eefbbabe78be638c4\": container with ID starting with f9f0dd289a065cb30f370471118946370842152e0348d92eefbbabe78be638c4 not found: ID does not exist" Apr 24 21:40:31.842819 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.842793 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"] Apr 24 21:40:31.844128 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:31.844109 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw"] Apr 24 21:40:32.211461 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:32.211428 2573 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13999f24-2195-46ee-a8de-d2687ebef412" path="/var/lib/kubelet/pods/13999f24-2195-46ee-a8de-d2687ebef412/volumes" Apr 24 21:40:32.708769 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:32.708732 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 21:40:35.782300 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:35.782244 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2" podUID="bf598abc-362a-4194-9c7c-8f53960c1ec8" containerName="ensemble-graph-f3231" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:35.818981 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:35.818950 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" Apr 24 21:40:35.819544 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:35.819517 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 21:40:40.781574 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:40.781532 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2" podUID="bf598abc-362a-4194-9c7c-8f53960c1ec8" containerName="ensemble-graph-f3231" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:40.781970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:40.781668 2573 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2" Apr 24 21:40:42.710148 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:42.710116 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" Apr 24 21:40:45.781938 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:45.781898 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2" podUID="bf598abc-362a-4194-9c7c-8f53960c1ec8" containerName="ensemble-graph-f3231" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:45.819964 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:45.819924 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 21:40:50.781772 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:50.781731 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2" podUID="bf598abc-362a-4194-9c7c-8f53960c1ec8" containerName="ensemble-graph-f3231" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:53.933178 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:53.933140 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255"] Apr 24 21:40:53.933739 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:53.933670 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kserve-container" Apr 24 21:40:53.933739 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:53.933692 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kserve-container" Apr 24 21:40:53.933739 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:53.933730 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kube-rbac-proxy" Apr 24 21:40:53.933739 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:53.933738 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kube-rbac-proxy" Apr 24 21:40:53.933941 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:53.933820 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kube-rbac-proxy" Apr 24 21:40:53.933941 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:53.933835 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="13999f24-2195-46ee-a8de-d2687ebef412" containerName="kserve-container" Apr 24 21:40:53.936946 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:53.936924 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" Apr 24 21:40:53.939372 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:53.939337 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-c1a35-kube-rbac-proxy-sar-config\"" Apr 24 21:40:53.939372 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:53.939342 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-c1a35-serving-cert\"" Apr 24 21:40:53.944620 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:53.944599 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255"] Apr 24 21:40:54.031555 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:54.031518 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/daa90fc4-db12-4619-8ad2-1823d103548e-proxy-tls\") pod \"sequence-graph-c1a35-5b75598d54-4l255\" (UID: \"daa90fc4-db12-4619-8ad2-1823d103548e\") " pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" Apr 24 21:40:54.031555 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:54.031559 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daa90fc4-db12-4619-8ad2-1823d103548e-openshift-service-ca-bundle\") pod \"sequence-graph-c1a35-5b75598d54-4l255\" (UID: \"daa90fc4-db12-4619-8ad2-1823d103548e\") " pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" Apr 24 21:40:54.132737 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:54.132698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/daa90fc4-db12-4619-8ad2-1823d103548e-proxy-tls\") pod \"sequence-graph-c1a35-5b75598d54-4l255\" (UID: 
\"daa90fc4-db12-4619-8ad2-1823d103548e\") " pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" Apr 24 21:40:54.132737 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:54.132739 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daa90fc4-db12-4619-8ad2-1823d103548e-openshift-service-ca-bundle\") pod \"sequence-graph-c1a35-5b75598d54-4l255\" (UID: \"daa90fc4-db12-4619-8ad2-1823d103548e\") " pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" Apr 24 21:40:54.133348 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:54.133322 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daa90fc4-db12-4619-8ad2-1823d103548e-openshift-service-ca-bundle\") pod \"sequence-graph-c1a35-5b75598d54-4l255\" (UID: \"daa90fc4-db12-4619-8ad2-1823d103548e\") " pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" Apr 24 21:40:54.135179 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:54.135150 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/daa90fc4-db12-4619-8ad2-1823d103548e-proxy-tls\") pod \"sequence-graph-c1a35-5b75598d54-4l255\" (UID: \"daa90fc4-db12-4619-8ad2-1823d103548e\") " pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" Apr 24 21:40:54.247920 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:54.247891 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" Apr 24 21:40:54.373726 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:54.373701 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255"] Apr 24 21:40:54.376541 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:40:54.376505 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaa90fc4_db12_4619_8ad2_1823d103548e.slice/crio-b79e6d86937a536ad18742b5fb8ef30a8c239349b4ed7f3ce5463fb133fb1ee9 WatchSource:0}: Error finding container b79e6d86937a536ad18742b5fb8ef30a8c239349b4ed7f3ce5463fb133fb1ee9: Status 404 returned error can't find the container with id b79e6d86937a536ad18742b5fb8ef30a8c239349b4ed7f3ce5463fb133fb1ee9 Apr 24 21:40:54.902391 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:54.902339 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" event={"ID":"daa90fc4-db12-4619-8ad2-1823d103548e","Type":"ContainerStarted","Data":"da4dd831d5bc710470374409bafbdffb8aed2bd7dd786d62e3e47684378babad"} Apr 24 21:40:54.902595 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:54.902399 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" Apr 24 21:40:54.902595 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:54.902414 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" event={"ID":"daa90fc4-db12-4619-8ad2-1823d103548e","Type":"ContainerStarted","Data":"b79e6d86937a536ad18742b5fb8ef30a8c239349b4ed7f3ce5463fb133fb1ee9"} Apr 24 21:40:54.921250 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:54.921201 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" 
podStartSLOduration=1.921187295 podStartE2EDuration="1.921187295s" podCreationTimestamp="2026-04-24 21:40:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:40:54.919898527 +0000 UTC m=+1477.325739979" watchObservedRunningTime="2026-04-24 21:40:54.921187295 +0000 UTC m=+1477.327028797"
Apr 24 21:40:55.781526 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:55.781487 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2" podUID="bf598abc-362a-4194-9c7c-8f53960c1ec8" containerName="ensemble-graph-f3231" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:55.820093 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:55.820058 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused"
Apr 24 21:40:58.690370 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.690328 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"
Apr 24 21:40:58.874433 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.874324 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf598abc-362a-4194-9c7c-8f53960c1ec8-openshift-service-ca-bundle\") pod \"bf598abc-362a-4194-9c7c-8f53960c1ec8\" (UID: \"bf598abc-362a-4194-9c7c-8f53960c1ec8\") "
Apr 24 21:40:58.874433 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.874405 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf598abc-362a-4194-9c7c-8f53960c1ec8-proxy-tls\") pod \"bf598abc-362a-4194-9c7c-8f53960c1ec8\" (UID: \"bf598abc-362a-4194-9c7c-8f53960c1ec8\") "
Apr 24 21:40:58.874688 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.874661 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf598abc-362a-4194-9c7c-8f53960c1ec8-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "bf598abc-362a-4194-9c7c-8f53960c1ec8" (UID: "bf598abc-362a-4194-9c7c-8f53960c1ec8"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:40:58.876404 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.876373 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf598abc-362a-4194-9c7c-8f53960c1ec8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bf598abc-362a-4194-9c7c-8f53960c1ec8" (UID: "bf598abc-362a-4194-9c7c-8f53960c1ec8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:40:58.917068 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.917029 2573 generic.go:358] "Generic (PLEG): container finished" podID="bf598abc-362a-4194-9c7c-8f53960c1ec8" containerID="7512aafe4c8f0a0eebfa9c64c05ed2c35e8680b6ab4e22d451dd42b4cc15631f" exitCode=0
Apr 24 21:40:58.917238 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.917092 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"
Apr 24 21:40:58.917238 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.917114 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2" event={"ID":"bf598abc-362a-4194-9c7c-8f53960c1ec8","Type":"ContainerDied","Data":"7512aafe4c8f0a0eebfa9c64c05ed2c35e8680b6ab4e22d451dd42b4cc15631f"}
Apr 24 21:40:58.917238 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.917148 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2" event={"ID":"bf598abc-362a-4194-9c7c-8f53960c1ec8","Type":"ContainerDied","Data":"7420f10fda74d3df872636d392508d14c9d25c0746fb0fc2baf67f01382978a6"}
Apr 24 21:40:58.917238 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.917162 2573 scope.go:117] "RemoveContainer" containerID="7512aafe4c8f0a0eebfa9c64c05ed2c35e8680b6ab4e22d451dd42b4cc15631f"
Apr 24 21:40:58.925426 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.925406 2573 scope.go:117] "RemoveContainer" containerID="7512aafe4c8f0a0eebfa9c64c05ed2c35e8680b6ab4e22d451dd42b4cc15631f"
Apr 24 21:40:58.925669 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:40:58.925650 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7512aafe4c8f0a0eebfa9c64c05ed2c35e8680b6ab4e22d451dd42b4cc15631f\": container with ID starting with 7512aafe4c8f0a0eebfa9c64c05ed2c35e8680b6ab4e22d451dd42b4cc15631f not found: ID does not exist" containerID="7512aafe4c8f0a0eebfa9c64c05ed2c35e8680b6ab4e22d451dd42b4cc15631f"
Apr 24 21:40:58.925736 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.925677 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7512aafe4c8f0a0eebfa9c64c05ed2c35e8680b6ab4e22d451dd42b4cc15631f"} err="failed to get container status \"7512aafe4c8f0a0eebfa9c64c05ed2c35e8680b6ab4e22d451dd42b4cc15631f\": rpc error: code = NotFound desc = could not find container \"7512aafe4c8f0a0eebfa9c64c05ed2c35e8680b6ab4e22d451dd42b4cc15631f\": container with ID starting with 7512aafe4c8f0a0eebfa9c64c05ed2c35e8680b6ab4e22d451dd42b4cc15631f not found: ID does not exist"
Apr 24 21:40:58.937000 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.936974 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"]
Apr 24 21:40:58.940939 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.940915 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2"]
Apr 24 21:40:58.975823 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.975778 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf598abc-362a-4194-9c7c-8f53960c1ec8-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:40:58.975823 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:40:58.975819 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf598abc-362a-4194-9c7c-8f53960c1ec8-openshift-service-ca-bundle\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:41:00.210894 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:00.210856 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf598abc-362a-4194-9c7c-8f53960c1ec8" path="/var/lib/kubelet/pods/bf598abc-362a-4194-9c7c-8f53960c1ec8/volumes"
Apr 24 21:41:00.912839 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:00.912807 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255"
Apr 24 21:41:03.984701 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:03.984669 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255"]
Apr 24 21:41:03.985141 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:03.984940 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" podUID="daa90fc4-db12-4619-8ad2-1823d103548e" containerName="sequence-graph-c1a35" containerID="cri-o://da4dd831d5bc710470374409bafbdffb8aed2bd7dd786d62e3e47684378babad" gracePeriod=30
Apr 24 21:41:04.108760 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.108727 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk"]
Apr 24 21:41:04.109095 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.109061 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerName="kserve-container" containerID="cri-o://3ff15ead7c34246d126d5d991c274c29451235d672dce97533cda03b84d4c269" gracePeriod=30
Apr 24 21:41:04.109251 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.109090 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerName="kube-rbac-proxy" containerID="cri-o://159b2494b78dfcd6bd5a312ceef053b9d2406023e2a08341b12f4513de8ea351" gracePeriod=30
Apr 24 21:41:04.135104 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.135076 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"]
Apr 24 21:41:04.135466 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.135447 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf598abc-362a-4194-9c7c-8f53960c1ec8" containerName="ensemble-graph-f3231"
Apr 24 21:41:04.135466 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.135465 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf598abc-362a-4194-9c7c-8f53960c1ec8" containerName="ensemble-graph-f3231"
Apr 24 21:41:04.135660 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.135557 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf598abc-362a-4194-9c7c-8f53960c1ec8" containerName="ensemble-graph-f3231"
Apr 24 21:41:04.140016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.139996 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"
Apr 24 21:41:04.142188 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.142160 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-64806-predictor-serving-cert\""
Apr 24 21:41:04.142290 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.142242 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-64806-kube-rbac-proxy-sar-config\""
Apr 24 21:41:04.149082 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.149059 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"]
Apr 24 21:41:04.220216 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.220182 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-proxy-tls\") pod \"success-200-isvc-64806-predictor-6bdc69944f-rmdz9\" (UID: \"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\") " pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"
Apr 24 21:41:04.220407 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.220292 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmkd4\" (UniqueName: \"kubernetes.io/projected/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-kube-api-access-gmkd4\") pod \"success-200-isvc-64806-predictor-6bdc69944f-rmdz9\" (UID: \"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\") " pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"
Apr 24 21:41:04.220524 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.220428 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-64806-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-success-200-isvc-64806-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-64806-predictor-6bdc69944f-rmdz9\" (UID: \"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\") " pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"
Apr 24 21:41:04.321621 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.321540 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-64806-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-success-200-isvc-64806-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-64806-predictor-6bdc69944f-rmdz9\" (UID: \"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\") " pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"
Apr 24 21:41:04.321621 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.321594 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-proxy-tls\") pod \"success-200-isvc-64806-predictor-6bdc69944f-rmdz9\" (UID: \"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\") " pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"
Apr 24 21:41:04.321868 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.321631 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmkd4\" (UniqueName: \"kubernetes.io/projected/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-kube-api-access-gmkd4\") pod \"success-200-isvc-64806-predictor-6bdc69944f-rmdz9\" (UID: \"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\") " pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"
Apr 24 21:41:04.322229 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.322205 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-64806-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-success-200-isvc-64806-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-64806-predictor-6bdc69944f-rmdz9\" (UID: \"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\") " pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"
Apr 24 21:41:04.324152 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.324131 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-proxy-tls\") pod \"success-200-isvc-64806-predictor-6bdc69944f-rmdz9\" (UID: \"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\") " pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"
Apr 24 21:41:04.329008 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.328981 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmkd4\" (UniqueName: \"kubernetes.io/projected/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-kube-api-access-gmkd4\") pod \"success-200-isvc-64806-predictor-6bdc69944f-rmdz9\" (UID: \"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\") " pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"
Apr 24 21:41:04.452609 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.452567 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"
Apr 24 21:41:04.573487 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.573415 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"]
Apr 24 21:41:04.576693 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:41:04.576666 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode66a7a28_134b_4266_81b7_b0eb7ef91ae9.slice/crio-42fdb8644c67ed3e897ad8c7bb8ce117c18b2218d423b61c43ea676f1c82b2b6 WatchSource:0}: Error finding container 42fdb8644c67ed3e897ad8c7bb8ce117c18b2218d423b61c43ea676f1c82b2b6: Status 404 returned error can't find the container with id 42fdb8644c67ed3e897ad8c7bb8ce117c18b2218d423b61c43ea676f1c82b2b6
Apr 24 21:41:04.942564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.942475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" event={"ID":"e66a7a28-134b-4266-81b7-b0eb7ef91ae9","Type":"ContainerStarted","Data":"1fd7f544535a52bc4fd5bc8dfae50d47148058dc86a5365ae6d58cd2829c6221"}
Apr 24 21:41:04.942564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.942510 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" event={"ID":"e66a7a28-134b-4266-81b7-b0eb7ef91ae9","Type":"ContainerStarted","Data":"beeec34728626ab7a3fa62b6b9448df23d6e70c07a84da0464b8cfe67c6a0bee"}
Apr 24 21:41:04.942564 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.942523 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" event={"ID":"e66a7a28-134b-4266-81b7-b0eb7ef91ae9","Type":"ContainerStarted","Data":"42fdb8644c67ed3e897ad8c7bb8ce117c18b2218d423b61c43ea676f1c82b2b6"}
Apr 24 21:41:04.942829 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.942580 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"
Apr 24 21:41:04.944044 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.944021 2573 generic.go:358] "Generic (PLEG): container finished" podID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerID="159b2494b78dfcd6bd5a312ceef053b9d2406023e2a08341b12f4513de8ea351" exitCode=2
Apr 24 21:41:04.944154 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.944050 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" event={"ID":"c2b3f2bd-2603-4bff-8cdc-d7717704beab","Type":"ContainerDied","Data":"159b2494b78dfcd6bd5a312ceef053b9d2406023e2a08341b12f4513de8ea351"}
Apr 24 21:41:04.961028 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:04.960982 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" podStartSLOduration=0.960969904 podStartE2EDuration="960.969904ms" podCreationTimestamp="2026-04-24 21:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:41:04.958412712 +0000 UTC m=+1487.364254164" watchObservedRunningTime="2026-04-24 21:41:04.960969904 +0000 UTC m=+1487.366811356"
Apr 24 21:41:05.819894 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:05.819849 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused"
Apr 24 21:41:05.910628 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:05.910588 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" podUID="daa90fc4-db12-4619-8ad2-1823d103548e" containerName="sequence-graph-c1a35" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:41:05.947418 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:05.947388 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"
Apr 24 21:41:05.948798 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:05.948767 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 24 21:41:06.950822 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:06.950768 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 24 21:41:07.360585 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.360559 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk"
Apr 24 21:41:07.450000 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.449970 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2b3f2bd-2603-4bff-8cdc-d7717704beab-proxy-tls\") pod \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\" (UID: \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\") "
Apr 24 21:41:07.450155 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.450028 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-c1a35-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2b3f2bd-2603-4bff-8cdc-d7717704beab-success-200-isvc-c1a35-kube-rbac-proxy-sar-config\") pod \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\" (UID: \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\") "
Apr 24 21:41:07.450155 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.450138 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2gjl\" (UniqueName: \"kubernetes.io/projected/c2b3f2bd-2603-4bff-8cdc-d7717704beab-kube-api-access-r2gjl\") pod \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\" (UID: \"c2b3f2bd-2603-4bff-8cdc-d7717704beab\") "
Apr 24 21:41:07.450445 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.450417 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2b3f2bd-2603-4bff-8cdc-d7717704beab-success-200-isvc-c1a35-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-c1a35-kube-rbac-proxy-sar-config") pod "c2b3f2bd-2603-4bff-8cdc-d7717704beab" (UID: "c2b3f2bd-2603-4bff-8cdc-d7717704beab"). InnerVolumeSpecName "success-200-isvc-c1a35-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:41:07.452173 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.452105 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b3f2bd-2603-4bff-8cdc-d7717704beab-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c2b3f2bd-2603-4bff-8cdc-d7717704beab" (UID: "c2b3f2bd-2603-4bff-8cdc-d7717704beab"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:41:07.452173 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.452105 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b3f2bd-2603-4bff-8cdc-d7717704beab-kube-api-access-r2gjl" (OuterVolumeSpecName: "kube-api-access-r2gjl") pod "c2b3f2bd-2603-4bff-8cdc-d7717704beab" (UID: "c2b3f2bd-2603-4bff-8cdc-d7717704beab"). InnerVolumeSpecName "kube-api-access-r2gjl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:41:07.551650 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.551618 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-c1a35-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2b3f2bd-2603-4bff-8cdc-d7717704beab-success-200-isvc-c1a35-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:41:07.551650 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.551645 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r2gjl\" (UniqueName: \"kubernetes.io/projected/c2b3f2bd-2603-4bff-8cdc-d7717704beab-kube-api-access-r2gjl\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:41:07.551650 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.551655 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2b3f2bd-2603-4bff-8cdc-d7717704beab-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:41:07.954401 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.954367 2573 generic.go:358] "Generic (PLEG): container finished" podID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerID="3ff15ead7c34246d126d5d991c274c29451235d672dce97533cda03b84d4c269" exitCode=0
Apr 24 21:41:07.954401 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.954407 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" event={"ID":"c2b3f2bd-2603-4bff-8cdc-d7717704beab","Type":"ContainerDied","Data":"3ff15ead7c34246d126d5d991c274c29451235d672dce97533cda03b84d4c269"}
Apr 24 21:41:07.954870 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.954429 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk" event={"ID":"c2b3f2bd-2603-4bff-8cdc-d7717704beab","Type":"ContainerDied","Data":"dcaeb827984f48aa4f2da22f755e4340004c29b1704cde5150bda9427611c25c"}
Apr 24 21:41:07.954870 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.954439 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk"
Apr 24 21:41:07.954870 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.954445 2573 scope.go:117] "RemoveContainer" containerID="159b2494b78dfcd6bd5a312ceef053b9d2406023e2a08341b12f4513de8ea351"
Apr 24 21:41:07.963006 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.962988 2573 scope.go:117] "RemoveContainer" containerID="3ff15ead7c34246d126d5d991c274c29451235d672dce97533cda03b84d4c269"
Apr 24 21:41:07.969982 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.969965 2573 scope.go:117] "RemoveContainer" containerID="159b2494b78dfcd6bd5a312ceef053b9d2406023e2a08341b12f4513de8ea351"
Apr 24 21:41:07.970213 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:41:07.970189 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"159b2494b78dfcd6bd5a312ceef053b9d2406023e2a08341b12f4513de8ea351\": container with ID starting with 159b2494b78dfcd6bd5a312ceef053b9d2406023e2a08341b12f4513de8ea351 not found: ID does not exist" containerID="159b2494b78dfcd6bd5a312ceef053b9d2406023e2a08341b12f4513de8ea351"
Apr 24 21:41:07.970274 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.970225 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"159b2494b78dfcd6bd5a312ceef053b9d2406023e2a08341b12f4513de8ea351"} err="failed to get container status \"159b2494b78dfcd6bd5a312ceef053b9d2406023e2a08341b12f4513de8ea351\": rpc error: code = NotFound desc = could not find container \"159b2494b78dfcd6bd5a312ceef053b9d2406023e2a08341b12f4513de8ea351\": container with ID starting with 159b2494b78dfcd6bd5a312ceef053b9d2406023e2a08341b12f4513de8ea351 not found: ID does not exist"
Apr 24 21:41:07.970274 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.970248 2573 scope.go:117] "RemoveContainer" containerID="3ff15ead7c34246d126d5d991c274c29451235d672dce97533cda03b84d4c269"
Apr 24 21:41:07.970489 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:41:07.970469 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ff15ead7c34246d126d5d991c274c29451235d672dce97533cda03b84d4c269\": container with ID starting with 3ff15ead7c34246d126d5d991c274c29451235d672dce97533cda03b84d4c269 not found: ID does not exist" containerID="3ff15ead7c34246d126d5d991c274c29451235d672dce97533cda03b84d4c269"
Apr 24 21:41:07.970535 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.970495 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ff15ead7c34246d126d5d991c274c29451235d672dce97533cda03b84d4c269"} err="failed to get container status \"3ff15ead7c34246d126d5d991c274c29451235d672dce97533cda03b84d4c269\": rpc error: code = NotFound desc = could not find container \"3ff15ead7c34246d126d5d991c274c29451235d672dce97533cda03b84d4c269\": container with ID starting with 3ff15ead7c34246d126d5d991c274c29451235d672dce97533cda03b84d4c269 not found: ID does not exist"
Apr 24 21:41:07.986186 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:07.986159 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk"]
Apr 24 21:41:08.003016 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:08.002992 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk"]
Apr 24 21:41:08.212194 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:08.212117 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" path="/var/lib/kubelet/pods/c2b3f2bd-2603-4bff-8cdc-d7717704beab/volumes"
Apr 24 21:41:10.910337 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:10.910297 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" podUID="daa90fc4-db12-4619-8ad2-1823d103548e" containerName="sequence-graph-c1a35" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:41:11.955157 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:11.955130 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"
Apr 24 21:41:11.955765 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:11.955736 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 24 21:41:15.820521 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:15.820484 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"
Apr 24 21:41:15.910808 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:15.910770 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" podUID="daa90fc4-db12-4619-8ad2-1823d103548e" containerName="sequence-graph-c1a35" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:41:15.910951 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:15.910877 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255"
Apr 24 21:41:18.166272 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:18.166244 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 21:41:18.168638 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:18.168613 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 21:41:20.910867 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:20.910826 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" podUID="daa90fc4-db12-4619-8ad2-1823d103548e" containerName="sequence-graph-c1a35" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:41:21.956613 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:21.956563 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 24 21:41:25.910025 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:25.909988 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" podUID="daa90fc4-db12-4619-8ad2-1823d103548e" containerName="sequence-graph-c1a35" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:41:28.229408 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.229371 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"]
Apr 24 21:41:28.229835 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.229692 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerName="kserve-container"
Apr 24 21:41:28.229835 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.229703 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerName="kserve-container"
Apr 24 21:41:28.229835 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.229726 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerName="kube-rbac-proxy"
Apr 24 21:41:28.229835 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.229732 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerName="kube-rbac-proxy"
Apr 24 21:41:28.229835 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.229784 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerName="kserve-container"
Apr 24 21:41:28.229835 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.229794 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2b3f2bd-2603-4bff-8cdc-d7717704beab" containerName="kube-rbac-proxy"
Apr 24 21:41:28.234387 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.234345 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"
Apr 24 21:41:28.237026 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.236937 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-75b0d-serving-cert\""
Apr 24 21:41:28.237152 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.237043 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-75b0d-kube-rbac-proxy-sar-config\""
Apr 24 21:41:28.239170 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.239144 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"]
Apr 24 21:41:28.329311 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.329276 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee921bcc-b5c8-44b2-b035-839e57c39143-proxy-tls\") pod \"ensemble-graph-75b0d-699b59b6f-snfrm\" (UID: \"ee921bcc-b5c8-44b2-b035-839e57c39143\") " pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"
Apr 24 21:41:28.329497 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.329348 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee921bcc-b5c8-44b2-b035-839e57c39143-openshift-service-ca-bundle\") pod \"ensemble-graph-75b0d-699b59b6f-snfrm\" (UID: \"ee921bcc-b5c8-44b2-b035-839e57c39143\") " pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"
Apr 24 21:41:28.430399 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.430350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee921bcc-b5c8-44b2-b035-839e57c39143-proxy-tls\") pod \"ensemble-graph-75b0d-699b59b6f-snfrm\" (UID: \"ee921bcc-b5c8-44b2-b035-839e57c39143\") " pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"
Apr 24 21:41:28.430572 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.430416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee921bcc-b5c8-44b2-b035-839e57c39143-openshift-service-ca-bundle\") pod \"ensemble-graph-75b0d-699b59b6f-snfrm\" (UID: \"ee921bcc-b5c8-44b2-b035-839e57c39143\") " pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"
Apr 24 21:41:28.431023 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.431006 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee921bcc-b5c8-44b2-b035-839e57c39143-openshift-service-ca-bundle\") pod \"ensemble-graph-75b0d-699b59b6f-snfrm\" (UID: \"ee921bcc-b5c8-44b2-b035-839e57c39143\") " pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"
Apr 24 21:41:28.432675 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.432654 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee921bcc-b5c8-44b2-b035-839e57c39143-proxy-tls\") pod \"ensemble-graph-75b0d-699b59b6f-snfrm\" (UID: \"ee921bcc-b5c8-44b2-b035-839e57c39143\") " pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"
Apr 24 21:41:28.546428 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.546310 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"
Apr 24 21:41:28.667074 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:28.667038 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"]
Apr 24 21:41:29.022414 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:29.022375 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" event={"ID":"ee921bcc-b5c8-44b2-b035-839e57c39143","Type":"ContainerStarted","Data":"20a07c4f0b355e0195adaf12894abd6f18b117a54fbaa8ff6ebee0084c5d4d24"}
Apr 24 21:41:29.022414 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:29.022416 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" event={"ID":"ee921bcc-b5c8-44b2-b035-839e57c39143","Type":"ContainerStarted","Data":"601b2fec2abc79bd60aa664100d3fbd04215bd33d3cfc09dd1a1ef032f902ca2"}
Apr 24 21:41:29.022668 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:29.022514 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"
Apr 24 21:41:29.037861 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:29.037809 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" podStartSLOduration=1.037787236 podStartE2EDuration="1.037787236s" podCreationTimestamp="2026-04-24 21:41:28 +0000 UTC"
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:41:29.036952887 +0000 UTC m=+1511.442794341" watchObservedRunningTime="2026-04-24 21:41:29.037787236 +0000 UTC m=+1511.443628689" Apr 24 21:41:30.910304 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:30.910266 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" podUID="daa90fc4-db12-4619-8ad2-1823d103548e" containerName="sequence-graph-c1a35" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:31.956037 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:31.955994 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 21:41:34.010771 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:41:34.010737 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaa90fc4_db12_4619_8ad2_1823d103548e.slice/crio-da4dd831d5bc710470374409bafbdffb8aed2bd7dd786d62e3e47684378babad.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaa90fc4_db12_4619_8ad2_1823d103548e.slice/crio-conmon-da4dd831d5bc710470374409bafbdffb8aed2bd7dd786d62e3e47684378babad.scope\": RecentStats: unable to find data in memory cache]" Apr 24 21:41:34.011069 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:41:34.010784 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaa90fc4_db12_4619_8ad2_1823d103548e.slice/crio-da4dd831d5bc710470374409bafbdffb8aed2bd7dd786d62e3e47684378babad.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaa90fc4_db12_4619_8ad2_1823d103548e.slice/crio-conmon-da4dd831d5bc710470374409bafbdffb8aed2bd7dd786d62e3e47684378babad.scope\": RecentStats: unable to find data in memory cache]" Apr 24 21:41:34.011069 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:41:34.010737 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaa90fc4_db12_4619_8ad2_1823d103548e.slice/crio-da4dd831d5bc710470374409bafbdffb8aed2bd7dd786d62e3e47684378babad.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaa90fc4_db12_4619_8ad2_1823d103548e.slice/crio-conmon-da4dd831d5bc710470374409bafbdffb8aed2bd7dd786d62e3e47684378babad.scope\": RecentStats: unable to find data in memory cache]" Apr 24 21:41:34.040379 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:34.040328 2573 generic.go:358] "Generic (PLEG): container finished" podID="daa90fc4-db12-4619-8ad2-1823d103548e" containerID="da4dd831d5bc710470374409bafbdffb8aed2bd7dd786d62e3e47684378babad" exitCode=0 Apr 24 21:41:34.040547 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:34.040404 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" event={"ID":"daa90fc4-db12-4619-8ad2-1823d103548e","Type":"ContainerDied","Data":"da4dd831d5bc710470374409bafbdffb8aed2bd7dd786d62e3e47684378babad"} Apr 24 21:41:34.137563 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:34.137540 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" Apr 24 21:41:34.286629 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:34.286508 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/daa90fc4-db12-4619-8ad2-1823d103548e-proxy-tls\") pod \"daa90fc4-db12-4619-8ad2-1823d103548e\" (UID: \"daa90fc4-db12-4619-8ad2-1823d103548e\") " Apr 24 21:41:34.286629 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:34.286579 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daa90fc4-db12-4619-8ad2-1823d103548e-openshift-service-ca-bundle\") pod \"daa90fc4-db12-4619-8ad2-1823d103548e\" (UID: \"daa90fc4-db12-4619-8ad2-1823d103548e\") " Apr 24 21:41:34.286997 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:34.286972 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daa90fc4-db12-4619-8ad2-1823d103548e-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "daa90fc4-db12-4619-8ad2-1823d103548e" (UID: "daa90fc4-db12-4619-8ad2-1823d103548e"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:41:34.288761 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:34.288738 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daa90fc4-db12-4619-8ad2-1823d103548e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "daa90fc4-db12-4619-8ad2-1823d103548e" (UID: "daa90fc4-db12-4619-8ad2-1823d103548e"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:41:34.387381 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:34.387316 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/daa90fc4-db12-4619-8ad2-1823d103548e-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:41:34.387381 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:34.387368 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daa90fc4-db12-4619-8ad2-1823d103548e-openshift-service-ca-bundle\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:41:35.032792 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:35.032762 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" Apr 24 21:41:35.044748 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:35.044718 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" event={"ID":"daa90fc4-db12-4619-8ad2-1823d103548e","Type":"ContainerDied","Data":"b79e6d86937a536ad18742b5fb8ef30a8c239349b4ed7f3ce5463fb133fb1ee9"} Apr 24 21:41:35.044748 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:35.044739 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255" Apr 24 21:41:35.044960 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:35.044759 2573 scope.go:117] "RemoveContainer" containerID="da4dd831d5bc710470374409bafbdffb8aed2bd7dd786d62e3e47684378babad" Apr 24 21:41:35.066149 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:35.066121 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255"] Apr 24 21:41:35.072118 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:35.072085 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255"] Apr 24 21:41:36.210900 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:36.210863 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daa90fc4-db12-4619-8ad2-1823d103548e" path="/var/lib/kubelet/pods/daa90fc4-db12-4619-8ad2-1823d103548e/volumes" Apr 24 21:41:41.955970 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:41.955927 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 21:41:51.956536 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:41:51.956501 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" Apr 24 21:42:04.177118 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.177083 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"] Apr 24 21:42:04.177578 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.177463 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="daa90fc4-db12-4619-8ad2-1823d103548e" containerName="sequence-graph-c1a35" 
Apr 24 21:42:04.177578 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.177476 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa90fc4-db12-4619-8ad2-1823d103548e" containerName="sequence-graph-c1a35"
Apr 24 21:42:04.177578 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.177545 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="daa90fc4-db12-4619-8ad2-1823d103548e" containerName="sequence-graph-c1a35"
Apr 24 21:42:04.180615 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.180595 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"
Apr 24 21:42:04.182776 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.182753 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-64806-kube-rbac-proxy-sar-config\""
Apr 24 21:42:04.182872 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.182791 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-64806-serving-cert\""
Apr 24 21:42:04.189846 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.189827 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"]
Apr 24 21:42:04.241723 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.241687 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d311244-c2f3-4350-a0d4-51588e56c67c-proxy-tls\") pod \"sequence-graph-64806-67786fdb9f-r75d4\" (UID: \"2d311244-c2f3-4350-a0d4-51588e56c67c\") " pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"
Apr 24 21:42:04.241874 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.241840 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d311244-c2f3-4350-a0d4-51588e56c67c-openshift-service-ca-bundle\") pod \"sequence-graph-64806-67786fdb9f-r75d4\" (UID: \"2d311244-c2f3-4350-a0d4-51588e56c67c\") " pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"
Apr 24 21:42:04.342242 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.342200 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d311244-c2f3-4350-a0d4-51588e56c67c-proxy-tls\") pod \"sequence-graph-64806-67786fdb9f-r75d4\" (UID: \"2d311244-c2f3-4350-a0d4-51588e56c67c\") " pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"
Apr 24 21:42:04.342448 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.342318 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d311244-c2f3-4350-a0d4-51588e56c67c-openshift-service-ca-bundle\") pod \"sequence-graph-64806-67786fdb9f-r75d4\" (UID: \"2d311244-c2f3-4350-a0d4-51588e56c67c\") " pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"
Apr 24 21:42:04.342448 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:42:04.342381 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-64806-serving-cert: secret "sequence-graph-64806-serving-cert" not found
Apr 24 21:42:04.342528 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:42:04.342454 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d311244-c2f3-4350-a0d4-51588e56c67c-proxy-tls podName:2d311244-c2f3-4350-a0d4-51588e56c67c nodeName:}" failed. No retries permitted until 2026-04-24 21:42:04.842436513 +0000 UTC m=+1547.248277944 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2d311244-c2f3-4350-a0d4-51588e56c67c-proxy-tls") pod "sequence-graph-64806-67786fdb9f-r75d4" (UID: "2d311244-c2f3-4350-a0d4-51588e56c67c") : secret "sequence-graph-64806-serving-cert" not found
Apr 24 21:42:04.342915 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.342897 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d311244-c2f3-4350-a0d4-51588e56c67c-openshift-service-ca-bundle\") pod \"sequence-graph-64806-67786fdb9f-r75d4\" (UID: \"2d311244-c2f3-4350-a0d4-51588e56c67c\") " pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"
Apr 24 21:42:04.847208 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.847170 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d311244-c2f3-4350-a0d4-51588e56c67c-proxy-tls\") pod \"sequence-graph-64806-67786fdb9f-r75d4\" (UID: \"2d311244-c2f3-4350-a0d4-51588e56c67c\") " pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"
Apr 24 21:42:04.849726 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:04.849694 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d311244-c2f3-4350-a0d4-51588e56c67c-proxy-tls\") pod \"sequence-graph-64806-67786fdb9f-r75d4\" (UID: \"2d311244-c2f3-4350-a0d4-51588e56c67c\") " pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"
Apr 24 21:42:05.091672 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:05.091634 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"
Apr 24 21:42:05.216881 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:05.216734 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"]
Apr 24 21:42:05.219115 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:42:05.219084 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d311244_c2f3_4350_a0d4_51588e56c67c.slice/crio-5ada125b317d1c0e4af39888fddb4ed365ef220fb6670e4a66dd191ea6725275 WatchSource:0}: Error finding container 5ada125b317d1c0e4af39888fddb4ed365ef220fb6670e4a66dd191ea6725275: Status 404 returned error can't find the container with id 5ada125b317d1c0e4af39888fddb4ed365ef220fb6670e4a66dd191ea6725275
Apr 24 21:42:06.153748 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:06.153713 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" event={"ID":"2d311244-c2f3-4350-a0d4-51588e56c67c","Type":"ContainerStarted","Data":"547cc7bed4fca935936e3d7a230d4590decdc6672598a15cabfbbb99f6935dd7"}
Apr 24 21:42:06.153748 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:06.153750 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" event={"ID":"2d311244-c2f3-4350-a0d4-51588e56c67c","Type":"ContainerStarted","Data":"5ada125b317d1c0e4af39888fddb4ed365ef220fb6670e4a66dd191ea6725275"}
Apr 24 21:42:06.153962 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:06.153835 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"
Apr 24 21:42:06.170090 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:06.170040 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" podStartSLOduration=2.170025084 podStartE2EDuration="2.170025084s" podCreationTimestamp="2026-04-24 21:42:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:42:06.168497916 +0000 UTC m=+1548.574339368" watchObservedRunningTime="2026-04-24 21:42:06.170025084 +0000 UTC m=+1548.575866534"
Apr 24 21:42:12.162630 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:42:12.162600 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"
Apr 24 21:46:18.190862 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:46:18.190832 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 21:46:18.192881 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:46:18.192856 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 21:49:42.913596 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:42.913561 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"]
Apr 24 21:49:42.916095 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:42.913866 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" podUID="ee921bcc-b5c8-44b2-b035-839e57c39143" containerName="ensemble-graph-75b0d" containerID="cri-o://20a07c4f0b355e0195adaf12894abd6f18b117a54fbaa8ff6ebee0084c5d4d24" gracePeriod=30
Apr 24 21:49:43.015686 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.015651 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"]
Apr 24 21:49:43.016033 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.016002 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kserve-container" containerID="cri-o://2b827d498b471e09f2325cad27c991615635610019a64885b86ba82958eb52c8" gracePeriod=30
Apr 24 21:49:43.016143 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.016075 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kube-rbac-proxy" containerID="cri-o://d8c9cd7763d391ddeeafb5b2319dac2623b94b1e9f0dac8aa6aca25fc09ba563" gracePeriod=30
Apr 24 21:49:43.105628 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.105593 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"]
Apr 24 21:49:43.108992 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.108973 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"
Apr 24 21:49:43.111229 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.111208 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-4efea-predictor-serving-cert\""
Apr 24 21:49:43.111401 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.111255 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-4efea-kube-rbac-proxy-sar-config\""
Apr 24 21:49:43.125681 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.125656 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"]
Apr 24 21:49:43.223021 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.222987 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pnvs\" (UniqueName: \"kubernetes.io/projected/b77acea4-fae8-402c-b34d-8ff4efaa4d78-kube-api-access-5pnvs\") pod \"success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f\" (UID: \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\") " pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"
Apr 24 21:49:43.223193 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.223041 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-4efea-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b77acea4-fae8-402c-b34d-8ff4efaa4d78-success-200-isvc-4efea-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f\" (UID: \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\") " pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"
Apr 24 21:49:43.223193 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.223100 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b77acea4-fae8-402c-b34d-8ff4efaa4d78-proxy-tls\") pod \"success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f\" (UID: \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\") " pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"
Apr 24 21:49:43.323826 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.323768 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b77acea4-fae8-402c-b34d-8ff4efaa4d78-proxy-tls\") pod \"success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f\" (UID: \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\") " pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"
Apr 24 21:49:43.323991 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.323871 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pnvs\" (UniqueName: \"kubernetes.io/projected/b77acea4-fae8-402c-b34d-8ff4efaa4d78-kube-api-access-5pnvs\") pod \"success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f\" (UID: \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\") " pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"
Apr 24 21:49:43.323991 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.323900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-4efea-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b77acea4-fae8-402c-b34d-8ff4efaa4d78-success-200-isvc-4efea-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f\" (UID: \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\") " pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"
Apr 24 21:49:43.324543 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.324523 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-4efea-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b77acea4-fae8-402c-b34d-8ff4efaa4d78-success-200-isvc-4efea-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f\" (UID: \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\") " pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"
Apr 24 21:49:43.326215 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.326189 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b77acea4-fae8-402c-b34d-8ff4efaa4d78-proxy-tls\") pod \"success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f\" (UID: \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\") " pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"
Apr 24 21:49:43.333184 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.333158 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pnvs\" (UniqueName: \"kubernetes.io/projected/b77acea4-fae8-402c-b34d-8ff4efaa4d78-kube-api-access-5pnvs\") pod \"success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f\" (UID: \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\") " pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"
Apr 24 21:49:43.420238 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.420182 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"
Apr 24 21:49:43.547055 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.547023 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"]
Apr 24 21:49:43.551085 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:49:43.551058 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb77acea4_fae8_402c_b34d_8ff4efaa4d78.slice/crio-3125ae6e9754b0eed166447eb50f6419edf90efe9b35405bde1b8581f9811258 WatchSource:0}: Error finding container 3125ae6e9754b0eed166447eb50f6419edf90efe9b35405bde1b8581f9811258: Status 404 returned error can't find the container with id 3125ae6e9754b0eed166447eb50f6419edf90efe9b35405bde1b8581f9811258
Apr 24 21:49:43.552697 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.552674 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:49:43.629620 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.629595 2573 generic.go:358] "Generic (PLEG): container finished" podID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerID="d8c9cd7763d391ddeeafb5b2319dac2623b94b1e9f0dac8aa6aca25fc09ba563" exitCode=2
Apr 24 21:49:43.629700 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.629652 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" event={"ID":"d6580f8e-1f40-4415-b2d8-36099f4708a8","Type":"ContainerDied","Data":"d8c9cd7763d391ddeeafb5b2319dac2623b94b1e9f0dac8aa6aca25fc09ba563"}
Apr 24 21:49:43.631014 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.630994 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" event={"ID":"b77acea4-fae8-402c-b34d-8ff4efaa4d78","Type":"ContainerStarted","Data":"8ef75dd7a1f9c44020f11c43ab4bf8f2a0e91ceea9ea4feafa6dde12c7558ac3"}
Apr 24 21:49:43.631106 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:43.631021 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" event={"ID":"b77acea4-fae8-402c-b34d-8ff4efaa4d78","Type":"ContainerStarted","Data":"3125ae6e9754b0eed166447eb50f6419edf90efe9b35405bde1b8581f9811258"}
Apr 24 21:49:44.635840 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:44.635793 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" event={"ID":"b77acea4-fae8-402c-b34d-8ff4efaa4d78","Type":"ContainerStarted","Data":"b572cb2e58682cc1f45f20d04f5531505667e27e201686b711ed5695816f033c"}
Apr 24 21:49:44.636194 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:44.635893 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"
Apr 24 21:49:44.655404 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:44.655331 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" podStartSLOduration=1.655318182 podStartE2EDuration="1.655318182s" podCreationTimestamp="2026-04-24 21:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:49:44.653457872 +0000 UTC m=+2007.059299326" watchObservedRunningTime="2026-04-24 21:49:44.655318182 +0000 UTC m=+2007.061159634"
Apr 24 21:49:45.030192 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:45.030151 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" podUID="ee921bcc-b5c8-44b2-b035-839e57c39143" containerName="ensemble-graph-75b0d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:49:45.640012 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:45.639983 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"
Apr 24 21:49:45.641333 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:45.641305 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 24 21:49:45.815151 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:45.815105 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused"
Apr 24 21:49:45.820076 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:45.820042 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused"
Apr 24 21:49:46.176724 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.176702 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"
Apr 24 21:49:46.351186 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.351092 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gngqj\" (UniqueName: \"kubernetes.io/projected/d6580f8e-1f40-4415-b2d8-36099f4708a8-kube-api-access-gngqj\") pod \"d6580f8e-1f40-4415-b2d8-36099f4708a8\" (UID: \"d6580f8e-1f40-4415-b2d8-36099f4708a8\") "
Apr 24 21:49:46.351186 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.351184 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-75b0d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d6580f8e-1f40-4415-b2d8-36099f4708a8-success-200-isvc-75b0d-kube-rbac-proxy-sar-config\") pod \"d6580f8e-1f40-4415-b2d8-36099f4708a8\" (UID: \"d6580f8e-1f40-4415-b2d8-36099f4708a8\") "
Apr 24 21:49:46.351397 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.351209 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6580f8e-1f40-4415-b2d8-36099f4708a8-proxy-tls\") pod \"d6580f8e-1f40-4415-b2d8-36099f4708a8\" (UID: \"d6580f8e-1f40-4415-b2d8-36099f4708a8\") "
Apr 24 21:49:46.351672 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.351628 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6580f8e-1f40-4415-b2d8-36099f4708a8-success-200-isvc-75b0d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-75b0d-kube-rbac-proxy-sar-config") pod "d6580f8e-1f40-4415-b2d8-36099f4708a8" (UID: "d6580f8e-1f40-4415-b2d8-36099f4708a8"). InnerVolumeSpecName "success-200-isvc-75b0d-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:49:46.353776 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.353747 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6580f8e-1f40-4415-b2d8-36099f4708a8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d6580f8e-1f40-4415-b2d8-36099f4708a8" (UID: "d6580f8e-1f40-4415-b2d8-36099f4708a8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:49:46.353928 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.353812 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6580f8e-1f40-4415-b2d8-36099f4708a8-kube-api-access-gngqj" (OuterVolumeSpecName: "kube-api-access-gngqj") pod "d6580f8e-1f40-4415-b2d8-36099f4708a8" (UID: "d6580f8e-1f40-4415-b2d8-36099f4708a8"). InnerVolumeSpecName "kube-api-access-gngqj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:49:46.451946 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.451887 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-75b0d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d6580f8e-1f40-4415-b2d8-36099f4708a8-success-200-isvc-75b0d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:49:46.451946 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.451939 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6580f8e-1f40-4415-b2d8-36099f4708a8-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:49:46.451946 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.451952 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gngqj\" (UniqueName: \"kubernetes.io/projected/d6580f8e-1f40-4415-b2d8-36099f4708a8-kube-api-access-gngqj\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:49:46.644424 
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.644320 2573 generic.go:358] "Generic (PLEG): container finished" podID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerID="2b827d498b471e09f2325cad27c991615635610019a64885b86ba82958eb52c8" exitCode=0 Apr 24 21:49:46.644857 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.644416 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" event={"ID":"d6580f8e-1f40-4415-b2d8-36099f4708a8","Type":"ContainerDied","Data":"2b827d498b471e09f2325cad27c991615635610019a64885b86ba82958eb52c8"} Apr 24 21:49:46.644857 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.644450 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" Apr 24 21:49:46.644857 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.644467 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr" event={"ID":"d6580f8e-1f40-4415-b2d8-36099f4708a8","Type":"ContainerDied","Data":"af1bcc6726172dbdf72f9fc3dd023e50ededc1ed44b981d6d34e29c44bff4b1d"} Apr 24 21:49:46.644857 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.644491 2573 scope.go:117] "RemoveContainer" containerID="d8c9cd7763d391ddeeafb5b2319dac2623b94b1e9f0dac8aa6aca25fc09ba563" Apr 24 21:49:46.645057 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.644963 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 21:49:46.653515 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.653496 2573 scope.go:117] "RemoveContainer" containerID="2b827d498b471e09f2325cad27c991615635610019a64885b86ba82958eb52c8" Apr 24 
21:49:46.661078 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.661058 2573 scope.go:117] "RemoveContainer" containerID="d8c9cd7763d391ddeeafb5b2319dac2623b94b1e9f0dac8aa6aca25fc09ba563" Apr 24 21:49:46.661423 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:49:46.661341 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8c9cd7763d391ddeeafb5b2319dac2623b94b1e9f0dac8aa6aca25fc09ba563\": container with ID starting with d8c9cd7763d391ddeeafb5b2319dac2623b94b1e9f0dac8aa6aca25fc09ba563 not found: ID does not exist" containerID="d8c9cd7763d391ddeeafb5b2319dac2623b94b1e9f0dac8aa6aca25fc09ba563" Apr 24 21:49:46.661423 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.661385 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8c9cd7763d391ddeeafb5b2319dac2623b94b1e9f0dac8aa6aca25fc09ba563"} err="failed to get container status \"d8c9cd7763d391ddeeafb5b2319dac2623b94b1e9f0dac8aa6aca25fc09ba563\": rpc error: code = NotFound desc = could not find container \"d8c9cd7763d391ddeeafb5b2319dac2623b94b1e9f0dac8aa6aca25fc09ba563\": container with ID starting with d8c9cd7763d391ddeeafb5b2319dac2623b94b1e9f0dac8aa6aca25fc09ba563 not found: ID does not exist" Apr 24 21:49:46.661423 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.661403 2573 scope.go:117] "RemoveContainer" containerID="2b827d498b471e09f2325cad27c991615635610019a64885b86ba82958eb52c8" Apr 24 21:49:46.661643 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:49:46.661625 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b827d498b471e09f2325cad27c991615635610019a64885b86ba82958eb52c8\": container with ID starting with 2b827d498b471e09f2325cad27c991615635610019a64885b86ba82958eb52c8 not found: ID does not exist" containerID="2b827d498b471e09f2325cad27c991615635610019a64885b86ba82958eb52c8" Apr 24 21:49:46.661679 
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.661650 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b827d498b471e09f2325cad27c991615635610019a64885b86ba82958eb52c8"} err="failed to get container status \"2b827d498b471e09f2325cad27c991615635610019a64885b86ba82958eb52c8\": rpc error: code = NotFound desc = could not find container \"2b827d498b471e09f2325cad27c991615635610019a64885b86ba82958eb52c8\": container with ID starting with 2b827d498b471e09f2325cad27c991615635610019a64885b86ba82958eb52c8 not found: ID does not exist" Apr 24 21:49:46.665042 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.665015 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"] Apr 24 21:49:46.668438 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:46.668415 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr"] Apr 24 21:49:48.212144 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:48.212109 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" path="/var/lib/kubelet/pods/d6580f8e-1f40-4415-b2d8-36099f4708a8/volumes" Apr 24 21:49:50.030633 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:50.030592 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" podUID="ee921bcc-b5c8-44b2-b035-839e57c39143" containerName="ensemble-graph-75b0d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:49:51.649617 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:51.649585 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" Apr 24 21:49:51.650222 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:51.650192 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 21:49:55.029807 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:55.029768 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" podUID="ee921bcc-b5c8-44b2-b035-839e57c39143" containerName="ensemble-graph-75b0d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:49:55.030232 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:49:55.029872 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" Apr 24 21:50:00.030714 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:00.030673 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" podUID="ee921bcc-b5c8-44b2-b035-839e57c39143" containerName="ensemble-graph-75b0d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:50:01.650856 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:01.650813 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 21:50:05.030439 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:05.030393 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" podUID="ee921bcc-b5c8-44b2-b035-839e57c39143" containerName="ensemble-graph-75b0d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:50:10.029941 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:10.029900 2573 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" podUID="ee921bcc-b5c8-44b2-b035-839e57c39143" containerName="ensemble-graph-75b0d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:50:11.650316 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:11.650277 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 21:50:13.103444 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.103418 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" Apr 24 21:50:13.163654 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.163617 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee921bcc-b5c8-44b2-b035-839e57c39143-proxy-tls\") pod \"ee921bcc-b5c8-44b2-b035-839e57c39143\" (UID: \"ee921bcc-b5c8-44b2-b035-839e57c39143\") " Apr 24 21:50:13.163654 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.163657 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee921bcc-b5c8-44b2-b035-839e57c39143-openshift-service-ca-bundle\") pod \"ee921bcc-b5c8-44b2-b035-839e57c39143\" (UID: \"ee921bcc-b5c8-44b2-b035-839e57c39143\") " Apr 24 21:50:13.164089 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.164061 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee921bcc-b5c8-44b2-b035-839e57c39143-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "ee921bcc-b5c8-44b2-b035-839e57c39143" (UID: 
"ee921bcc-b5c8-44b2-b035-839e57c39143"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:50:13.165783 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.165760 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee921bcc-b5c8-44b2-b035-839e57c39143-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ee921bcc-b5c8-44b2-b035-839e57c39143" (UID: "ee921bcc-b5c8-44b2-b035-839e57c39143"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:13.264685 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.264631 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee921bcc-b5c8-44b2-b035-839e57c39143-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.264685 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.264679 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee921bcc-b5c8-44b2-b035-839e57c39143-openshift-service-ca-bundle\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.734931 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.734893 2573 generic.go:358] "Generic (PLEG): container finished" podID="ee921bcc-b5c8-44b2-b035-839e57c39143" containerID="20a07c4f0b355e0195adaf12894abd6f18b117a54fbaa8ff6ebee0084c5d4d24" exitCode=0 Apr 24 21:50:13.735099 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.734959 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" Apr 24 21:50:13.735099 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.734962 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" event={"ID":"ee921bcc-b5c8-44b2-b035-839e57c39143","Type":"ContainerDied","Data":"20a07c4f0b355e0195adaf12894abd6f18b117a54fbaa8ff6ebee0084c5d4d24"} Apr 24 21:50:13.735099 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.735055 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm" event={"ID":"ee921bcc-b5c8-44b2-b035-839e57c39143","Type":"ContainerDied","Data":"601b2fec2abc79bd60aa664100d3fbd04215bd33d3cfc09dd1a1ef032f902ca2"} Apr 24 21:50:13.735099 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.735072 2573 scope.go:117] "RemoveContainer" containerID="20a07c4f0b355e0195adaf12894abd6f18b117a54fbaa8ff6ebee0084c5d4d24" Apr 24 21:50:13.744916 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.744889 2573 scope.go:117] "RemoveContainer" containerID="20a07c4f0b355e0195adaf12894abd6f18b117a54fbaa8ff6ebee0084c5d4d24" Apr 24 21:50:13.745241 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:50:13.745198 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20a07c4f0b355e0195adaf12894abd6f18b117a54fbaa8ff6ebee0084c5d4d24\": container with ID starting with 20a07c4f0b355e0195adaf12894abd6f18b117a54fbaa8ff6ebee0084c5d4d24 not found: ID does not exist" containerID="20a07c4f0b355e0195adaf12894abd6f18b117a54fbaa8ff6ebee0084c5d4d24" Apr 24 21:50:13.745325 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.745233 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a07c4f0b355e0195adaf12894abd6f18b117a54fbaa8ff6ebee0084c5d4d24"} err="failed to get container status 
\"20a07c4f0b355e0195adaf12894abd6f18b117a54fbaa8ff6ebee0084c5d4d24\": rpc error: code = NotFound desc = could not find container \"20a07c4f0b355e0195adaf12894abd6f18b117a54fbaa8ff6ebee0084c5d4d24\": container with ID starting with 20a07c4f0b355e0195adaf12894abd6f18b117a54fbaa8ff6ebee0084c5d4d24 not found: ID does not exist" Apr 24 21:50:13.759851 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.759783 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"] Apr 24 21:50:13.761562 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:13.761539 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm"] Apr 24 21:50:14.210783 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:14.210700 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee921bcc-b5c8-44b2-b035-839e57c39143" path="/var/lib/kubelet/pods/ee921bcc-b5c8-44b2-b035-839e57c39143/volumes" Apr 24 21:50:18.866508 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:18.866468 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"] Apr 24 21:50:18.867072 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:18.866702 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" podUID="2d311244-c2f3-4350-a0d4-51588e56c67c" containerName="sequence-graph-64806" containerID="cri-o://547cc7bed4fca935936e3d7a230d4590decdc6672598a15cabfbbb99f6935dd7" gracePeriod=30 Apr 24 21:50:18.994415 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:18.994382 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"] Apr 24 21:50:18.994705 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:18.994675 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kserve-container" containerID="cri-o://beeec34728626ab7a3fa62b6b9448df23d6e70c07a84da0464b8cfe67c6a0bee" gracePeriod=30 Apr 24 21:50:18.994930 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:18.994749 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kube-rbac-proxy" containerID="cri-o://1fd7f544535a52bc4fd5bc8dfae50d47148058dc86a5365ae6d58cd2829c6221" gracePeriod=30 Apr 24 21:50:19.022738 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.022703 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl"] Apr 24 21:50:19.023093 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.023079 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kube-rbac-proxy" Apr 24 21:50:19.023136 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.023095 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kube-rbac-proxy" Apr 24 21:50:19.023136 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.023103 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee921bcc-b5c8-44b2-b035-839e57c39143" containerName="ensemble-graph-75b0d" Apr 24 21:50:19.023136 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.023109 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee921bcc-b5c8-44b2-b035-839e57c39143" containerName="ensemble-graph-75b0d" Apr 24 21:50:19.023136 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.023119 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" 
containerName="kserve-container" Apr 24 21:50:19.023136 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.023125 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kserve-container" Apr 24 21:50:19.023284 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.023185 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee921bcc-b5c8-44b2-b035-839e57c39143" containerName="ensemble-graph-75b0d" Apr 24 21:50:19.023284 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.023195 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kserve-container" Apr 24 21:50:19.023284 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.023203 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6580f8e-1f40-4415-b2d8-36099f4708a8" containerName="kube-rbac-proxy" Apr 24 21:50:19.027638 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.027616 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:19.030052 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.030025 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-98aac-kube-rbac-proxy-sar-config\"" Apr 24 21:50:19.030182 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.030024 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-98aac-predictor-serving-cert\"" Apr 24 21:50:19.039437 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.039411 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl"] Apr 24 21:50:19.114720 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.114688 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9qgz\" (UniqueName: \"kubernetes.io/projected/88b92159-1eab-4488-b61c-80aceb69ca9e-kube-api-access-l9qgz\") pod \"success-200-isvc-98aac-predictor-7df4fb9989-xwljl\" (UID: \"88b92159-1eab-4488-b61c-80aceb69ca9e\") " pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:19.114873 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.114747 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88b92159-1eab-4488-b61c-80aceb69ca9e-proxy-tls\") pod \"success-200-isvc-98aac-predictor-7df4fb9989-xwljl\" (UID: \"88b92159-1eab-4488-b61c-80aceb69ca9e\") " pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:19.114873 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.114798 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-98aac-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/88b92159-1eab-4488-b61c-80aceb69ca9e-success-200-isvc-98aac-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-98aac-predictor-7df4fb9989-xwljl\" (UID: \"88b92159-1eab-4488-b61c-80aceb69ca9e\") " pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:19.215421 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.215312 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9qgz\" (UniqueName: \"kubernetes.io/projected/88b92159-1eab-4488-b61c-80aceb69ca9e-kube-api-access-l9qgz\") pod \"success-200-isvc-98aac-predictor-7df4fb9989-xwljl\" (UID: \"88b92159-1eab-4488-b61c-80aceb69ca9e\") " pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:19.215421 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.215388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88b92159-1eab-4488-b61c-80aceb69ca9e-proxy-tls\") pod \"success-200-isvc-98aac-predictor-7df4fb9989-xwljl\" (UID: \"88b92159-1eab-4488-b61c-80aceb69ca9e\") " pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:19.215668 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.215433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-98aac-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/88b92159-1eab-4488-b61c-80aceb69ca9e-success-200-isvc-98aac-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-98aac-predictor-7df4fb9989-xwljl\" (UID: \"88b92159-1eab-4488-b61c-80aceb69ca9e\") " pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:19.215668 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:50:19.215540 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-98aac-predictor-serving-cert: secret "success-200-isvc-98aac-predictor-serving-cert" not found Apr 
24 21:50:19.215668 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:50:19.215612 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88b92159-1eab-4488-b61c-80aceb69ca9e-proxy-tls podName:88b92159-1eab-4488-b61c-80aceb69ca9e nodeName:}" failed. No retries permitted until 2026-04-24 21:50:19.715592123 +0000 UTC m=+2042.121433553 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/88b92159-1eab-4488-b61c-80aceb69ca9e-proxy-tls") pod "success-200-isvc-98aac-predictor-7df4fb9989-xwljl" (UID: "88b92159-1eab-4488-b61c-80aceb69ca9e") : secret "success-200-isvc-98aac-predictor-serving-cert" not found Apr 24 21:50:19.216051 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.216033 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-98aac-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/88b92159-1eab-4488-b61c-80aceb69ca9e-success-200-isvc-98aac-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-98aac-predictor-7df4fb9989-xwljl\" (UID: \"88b92159-1eab-4488-b61c-80aceb69ca9e\") " pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:19.224249 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.224212 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9qgz\" (UniqueName: \"kubernetes.io/projected/88b92159-1eab-4488-b61c-80aceb69ca9e-kube-api-access-l9qgz\") pod \"success-200-isvc-98aac-predictor-7df4fb9989-xwljl\" (UID: \"88b92159-1eab-4488-b61c-80aceb69ca9e\") " pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:19.721697 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.721657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88b92159-1eab-4488-b61c-80aceb69ca9e-proxy-tls\") pod \"success-200-isvc-98aac-predictor-7df4fb9989-xwljl\" 
(UID: \"88b92159-1eab-4488-b61c-80aceb69ca9e\") " pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:19.724204 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.724178 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88b92159-1eab-4488-b61c-80aceb69ca9e-proxy-tls\") pod \"success-200-isvc-98aac-predictor-7df4fb9989-xwljl\" (UID: \"88b92159-1eab-4488-b61c-80aceb69ca9e\") " pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:19.758179 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.758147 2573 generic.go:358] "Generic (PLEG): container finished" podID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerID="1fd7f544535a52bc4fd5bc8dfae50d47148058dc86a5365ae6d58cd2829c6221" exitCode=2 Apr 24 21:50:19.758341 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.758211 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" event={"ID":"e66a7a28-134b-4266-81b7-b0eb7ef91ae9","Type":"ContainerDied","Data":"1fd7f544535a52bc4fd5bc8dfae50d47148058dc86a5365ae6d58cd2829c6221"} Apr 24 21:50:19.939670 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:19.939623 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:20.066054 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:20.066026 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl"] Apr 24 21:50:20.067945 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:50:20.067914 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88b92159_1eab_4488_b61c_80aceb69ca9e.slice/crio-130d1626e42580f7ea9ec72de540b8814ebc3564fab6c9684aaa812911590ee3 WatchSource:0}: Error finding container 130d1626e42580f7ea9ec72de540b8814ebc3564fab6c9684aaa812911590ee3: Status 404 returned error can't find the container with id 130d1626e42580f7ea9ec72de540b8814ebc3564fab6c9684aaa812911590ee3 Apr 24 21:50:20.763078 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:20.763039 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" event={"ID":"88b92159-1eab-4488-b61c-80aceb69ca9e","Type":"ContainerStarted","Data":"f10f286f38ff6a404ea7c003b983b0269313f9f46411364ff17fcd1c2f8f9f77"} Apr 24 21:50:20.763078 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:20.763081 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" event={"ID":"88b92159-1eab-4488-b61c-80aceb69ca9e","Type":"ContainerStarted","Data":"1177b8c1589a4a6692623d1f5571d08b9b03fdc09dc3fa97f64ea0b25d502ba6"} Apr 24 21:50:20.763294 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:20.763091 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" event={"ID":"88b92159-1eab-4488-b61c-80aceb69ca9e","Type":"ContainerStarted","Data":"130d1626e42580f7ea9ec72de540b8814ebc3564fab6c9684aaa812911590ee3"} Apr 24 21:50:20.763294 ip-10-0-128-21 
kubenswrapper[2573]: I0424 21:50:20.763119 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:20.780163 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:20.780114 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" podStartSLOduration=1.78010058 podStartE2EDuration="1.78010058s" podCreationTimestamp="2026-04-24 21:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:50:20.77922395 +0000 UTC m=+2043.185065404" watchObservedRunningTime="2026-04-24 21:50:20.78010058 +0000 UTC m=+2043.185942032" Apr 24 21:50:21.650467 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:21.650424 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 21:50:21.766309 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:21.766274 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:21.767438 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:21.767412 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 24 21:50:21.951115 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:21.951010 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 24 21:50:21.956183 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:21.956157 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 21:50:22.151621 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.151596 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" Apr 24 21:50:22.160312 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.160278 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" podUID="2d311244-c2f3-4350-a0d4-51588e56c67c" containerName="sequence-graph-64806" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:50:22.243204 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.243173 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmkd4\" (UniqueName: \"kubernetes.io/projected/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-kube-api-access-gmkd4\") pod \"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\" (UID: \"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\") " Apr 24 21:50:22.243384 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.243264 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-64806-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-success-200-isvc-64806-kube-rbac-proxy-sar-config\") pod 
\"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\" (UID: \"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\") " Apr 24 21:50:22.243384 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.243293 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-proxy-tls\") pod \"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\" (UID: \"e66a7a28-134b-4266-81b7-b0eb7ef91ae9\") " Apr 24 21:50:22.243629 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.243607 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-success-200-isvc-64806-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-64806-kube-rbac-proxy-sar-config") pod "e66a7a28-134b-4266-81b7-b0eb7ef91ae9" (UID: "e66a7a28-134b-4266-81b7-b0eb7ef91ae9"). InnerVolumeSpecName "success-200-isvc-64806-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:50:22.245370 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.245338 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e66a7a28-134b-4266-81b7-b0eb7ef91ae9" (UID: "e66a7a28-134b-4266-81b7-b0eb7ef91ae9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:22.245467 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.245446 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-kube-api-access-gmkd4" (OuterVolumeSpecName: "kube-api-access-gmkd4") pod "e66a7a28-134b-4266-81b7-b0eb7ef91ae9" (UID: "e66a7a28-134b-4266-81b7-b0eb7ef91ae9"). InnerVolumeSpecName "kube-api-access-gmkd4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:50:22.344202 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.344166 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:50:22.344202 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.344198 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gmkd4\" (UniqueName: \"kubernetes.io/projected/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-kube-api-access-gmkd4\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:50:22.344202 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.344210 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-64806-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e66a7a28-134b-4266-81b7-b0eb7ef91ae9-success-200-isvc-64806-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:50:22.770196 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.770165 2573 generic.go:358] "Generic (PLEG): container finished" podID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerID="beeec34728626ab7a3fa62b6b9448df23d6e70c07a84da0464b8cfe67c6a0bee" exitCode=0 Apr 24 21:50:22.770764 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.770235 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" Apr 24 21:50:22.770764 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.770254 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" event={"ID":"e66a7a28-134b-4266-81b7-b0eb7ef91ae9","Type":"ContainerDied","Data":"beeec34728626ab7a3fa62b6b9448df23d6e70c07a84da0464b8cfe67c6a0bee"} Apr 24 21:50:22.770764 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.770312 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9" event={"ID":"e66a7a28-134b-4266-81b7-b0eb7ef91ae9","Type":"ContainerDied","Data":"42fdb8644c67ed3e897ad8c7bb8ce117c18b2218d423b61c43ea676f1c82b2b6"} Apr 24 21:50:22.770764 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.770337 2573 scope.go:117] "RemoveContainer" containerID="1fd7f544535a52bc4fd5bc8dfae50d47148058dc86a5365ae6d58cd2829c6221" Apr 24 21:50:22.770764 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.770739 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 24 21:50:22.778983 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.778967 2573 scope.go:117] "RemoveContainer" containerID="beeec34728626ab7a3fa62b6b9448df23d6e70c07a84da0464b8cfe67c6a0bee" Apr 24 21:50:22.785799 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.785782 2573 scope.go:117] "RemoveContainer" containerID="1fd7f544535a52bc4fd5bc8dfae50d47148058dc86a5365ae6d58cd2829c6221" Apr 24 21:50:22.786054 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:50:22.786036 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1fd7f544535a52bc4fd5bc8dfae50d47148058dc86a5365ae6d58cd2829c6221\": container with ID starting with 1fd7f544535a52bc4fd5bc8dfae50d47148058dc86a5365ae6d58cd2829c6221 not found: ID does not exist" containerID="1fd7f544535a52bc4fd5bc8dfae50d47148058dc86a5365ae6d58cd2829c6221" Apr 24 21:50:22.786103 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.786063 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fd7f544535a52bc4fd5bc8dfae50d47148058dc86a5365ae6d58cd2829c6221"} err="failed to get container status \"1fd7f544535a52bc4fd5bc8dfae50d47148058dc86a5365ae6d58cd2829c6221\": rpc error: code = NotFound desc = could not find container \"1fd7f544535a52bc4fd5bc8dfae50d47148058dc86a5365ae6d58cd2829c6221\": container with ID starting with 1fd7f544535a52bc4fd5bc8dfae50d47148058dc86a5365ae6d58cd2829c6221 not found: ID does not exist" Apr 24 21:50:22.786103 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.786080 2573 scope.go:117] "RemoveContainer" containerID="beeec34728626ab7a3fa62b6b9448df23d6e70c07a84da0464b8cfe67c6a0bee" Apr 24 21:50:22.786306 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:50:22.786286 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beeec34728626ab7a3fa62b6b9448df23d6e70c07a84da0464b8cfe67c6a0bee\": container with ID starting with beeec34728626ab7a3fa62b6b9448df23d6e70c07a84da0464b8cfe67c6a0bee not found: ID does not exist" containerID="beeec34728626ab7a3fa62b6b9448df23d6e70c07a84da0464b8cfe67c6a0bee" Apr 24 21:50:22.786351 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.786314 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beeec34728626ab7a3fa62b6b9448df23d6e70c07a84da0464b8cfe67c6a0bee"} err="failed to get container status \"beeec34728626ab7a3fa62b6b9448df23d6e70c07a84da0464b8cfe67c6a0bee\": rpc error: code = NotFound desc = could not find container 
\"beeec34728626ab7a3fa62b6b9448df23d6e70c07a84da0464b8cfe67c6a0bee\": container with ID starting with beeec34728626ab7a3fa62b6b9448df23d6e70c07a84da0464b8cfe67c6a0bee not found: ID does not exist" Apr 24 21:50:22.792802 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.792775 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"] Apr 24 21:50:22.796063 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:22.796043 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9"] Apr 24 21:50:24.211413 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:24.211341 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" path="/var/lib/kubelet/pods/e66a7a28-134b-4266-81b7-b0eb7ef91ae9/volumes" Apr 24 21:50:27.161041 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:27.160998 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" podUID="2d311244-c2f3-4350-a0d4-51588e56c67c" containerName="sequence-graph-64806" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:50:27.776074 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:27.776046 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:50:27.776569 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:27.776542 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 24 21:50:31.650961 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:31.650934 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" Apr 24 21:50:32.160836 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:32.160789 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" podUID="2d311244-c2f3-4350-a0d4-51588e56c67c" containerName="sequence-graph-64806" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:50:32.161018 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:32.160898 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" Apr 24 21:50:37.161503 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:37.161414 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" podUID="2d311244-c2f3-4350-a0d4-51588e56c67c" containerName="sequence-graph-64806" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:50:37.777443 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:37.777394 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 24 21:50:42.160899 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:42.160853 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" podUID="2d311244-c2f3-4350-a0d4-51588e56c67c" containerName="sequence-graph-64806" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:50:43.132500 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.132462 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89"] Apr 24 21:50:43.132843 ip-10-0-128-21 
kubenswrapper[2573]: I0424 21:50:43.132830 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kube-rbac-proxy" Apr 24 21:50:43.132894 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.132845 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kube-rbac-proxy" Apr 24 21:50:43.132894 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.132856 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kserve-container" Apr 24 21:50:43.132894 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.132862 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kserve-container" Apr 24 21:50:43.132990 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.132925 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kube-rbac-proxy" Apr 24 21:50:43.132990 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.132935 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e66a7a28-134b-4266-81b7-b0eb7ef91ae9" containerName="kserve-container" Apr 24 21:50:43.137259 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.137235 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" Apr 24 21:50:43.139575 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.139553 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-4efea-kube-rbac-proxy-sar-config\"" Apr 24 21:50:43.139690 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.139572 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-4efea-serving-cert\"" Apr 24 21:50:43.147229 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.147199 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89"] Apr 24 21:50:43.326581 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.326528 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b374bd5-2f71-4887-b183-48690a08cf17-openshift-service-ca-bundle\") pod \"splitter-graph-4efea-5bb974bbff-2dr89\" (UID: \"7b374bd5-2f71-4887-b183-48690a08cf17\") " pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" Apr 24 21:50:43.326581 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.326587 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b374bd5-2f71-4887-b183-48690a08cf17-proxy-tls\") pod \"splitter-graph-4efea-5bb974bbff-2dr89\" (UID: \"7b374bd5-2f71-4887-b183-48690a08cf17\") " pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" Apr 24 21:50:43.427060 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.426954 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b374bd5-2f71-4887-b183-48690a08cf17-openshift-service-ca-bundle\") pod 
\"splitter-graph-4efea-5bb974bbff-2dr89\" (UID: \"7b374bd5-2f71-4887-b183-48690a08cf17\") " pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" Apr 24 21:50:43.427060 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.427010 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b374bd5-2f71-4887-b183-48690a08cf17-proxy-tls\") pod \"splitter-graph-4efea-5bb974bbff-2dr89\" (UID: \"7b374bd5-2f71-4887-b183-48690a08cf17\") " pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" Apr 24 21:50:43.427659 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.427635 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b374bd5-2f71-4887-b183-48690a08cf17-openshift-service-ca-bundle\") pod \"splitter-graph-4efea-5bb974bbff-2dr89\" (UID: \"7b374bd5-2f71-4887-b183-48690a08cf17\") " pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" Apr 24 21:50:43.429561 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.429540 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b374bd5-2f71-4887-b183-48690a08cf17-proxy-tls\") pod \"splitter-graph-4efea-5bb974bbff-2dr89\" (UID: \"7b374bd5-2f71-4887-b183-48690a08cf17\") " pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" Apr 24 21:50:43.447897 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.447860 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" Apr 24 21:50:43.577371 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.577308 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89"] Apr 24 21:50:43.579762 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:50:43.579721 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b374bd5_2f71_4887_b183_48690a08cf17.slice/crio-138ab2f988280bc0c6b76c1bcb2bf1c05117df9719e46ad39c04fd1e2b643c89 WatchSource:0}: Error finding container 138ab2f988280bc0c6b76c1bcb2bf1c05117df9719e46ad39c04fd1e2b643c89: Status 404 returned error can't find the container with id 138ab2f988280bc0c6b76c1bcb2bf1c05117df9719e46ad39c04fd1e2b643c89 Apr 24 21:50:43.839825 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.839782 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" event={"ID":"7b374bd5-2f71-4887-b183-48690a08cf17","Type":"ContainerStarted","Data":"99408c32dd12b1d6230d57c41517cb59d400fb4d9e36d8f32f85ccb7b4f55cba"} Apr 24 21:50:43.839825 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.839828 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" event={"ID":"7b374bd5-2f71-4887-b183-48690a08cf17","Type":"ContainerStarted","Data":"138ab2f988280bc0c6b76c1bcb2bf1c05117df9719e46ad39c04fd1e2b643c89"} Apr 24 21:50:43.840065 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.839884 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" Apr 24 21:50:43.855783 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:43.855728 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" 
podStartSLOduration=0.855710981 podStartE2EDuration="855.710981ms" podCreationTimestamp="2026-04-24 21:50:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:50:43.854090602 +0000 UTC m=+2066.259932054" watchObservedRunningTime="2026-04-24 21:50:43.855710981 +0000 UTC m=+2066.261552432" Apr 24 21:50:47.160776 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:47.160725 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" podUID="2d311244-c2f3-4350-a0d4-51588e56c67c" containerName="sequence-graph-64806" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:50:47.777507 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:47.777459 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 24 21:50:49.017560 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.017529 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" Apr 24 21:50:49.080493 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.080460 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d311244-c2f3-4350-a0d4-51588e56c67c-proxy-tls\") pod \"2d311244-c2f3-4350-a0d4-51588e56c67c\" (UID: \"2d311244-c2f3-4350-a0d4-51588e56c67c\") " Apr 24 21:50:49.080668 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.080517 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d311244-c2f3-4350-a0d4-51588e56c67c-openshift-service-ca-bundle\") pod \"2d311244-c2f3-4350-a0d4-51588e56c67c\" (UID: \"2d311244-c2f3-4350-a0d4-51588e56c67c\") " Apr 24 21:50:49.080875 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.080850 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d311244-c2f3-4350-a0d4-51588e56c67c-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "2d311244-c2f3-4350-a0d4-51588e56c67c" (UID: "2d311244-c2f3-4350-a0d4-51588e56c67c"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:50:49.082666 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.082638 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d311244-c2f3-4350-a0d4-51588e56c67c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2d311244-c2f3-4350-a0d4-51588e56c67c" (UID: "2d311244-c2f3-4350-a0d4-51588e56c67c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:49.182145 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.182045 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d311244-c2f3-4350-a0d4-51588e56c67c-openshift-service-ca-bundle\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:50:49.182145 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.182082 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d311244-c2f3-4350-a0d4-51588e56c67c-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:50:49.849424 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.849389 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" Apr 24 21:50:49.860832 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.860793 2573 generic.go:358] "Generic (PLEG): container finished" podID="2d311244-c2f3-4350-a0d4-51588e56c67c" containerID="547cc7bed4fca935936e3d7a230d4590decdc6672598a15cabfbbb99f6935dd7" exitCode=0 Apr 24 21:50:49.860995 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.860860 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" Apr 24 21:50:49.860995 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.860879 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" event={"ID":"2d311244-c2f3-4350-a0d4-51588e56c67c","Type":"ContainerDied","Data":"547cc7bed4fca935936e3d7a230d4590decdc6672598a15cabfbbb99f6935dd7"} Apr 24 21:50:49.860995 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.860914 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4" event={"ID":"2d311244-c2f3-4350-a0d4-51588e56c67c","Type":"ContainerDied","Data":"5ada125b317d1c0e4af39888fddb4ed365ef220fb6670e4a66dd191ea6725275"} Apr 24 21:50:49.860995 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.860933 2573 scope.go:117] "RemoveContainer" containerID="547cc7bed4fca935936e3d7a230d4590decdc6672598a15cabfbbb99f6935dd7" Apr 24 21:50:49.870537 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.870518 2573 scope.go:117] "RemoveContainer" containerID="547cc7bed4fca935936e3d7a230d4590decdc6672598a15cabfbbb99f6935dd7" Apr 24 21:50:49.870935 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:50:49.870903 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547cc7bed4fca935936e3d7a230d4590decdc6672598a15cabfbbb99f6935dd7\": container with ID starting with 547cc7bed4fca935936e3d7a230d4590decdc6672598a15cabfbbb99f6935dd7 not found: ID does not exist" containerID="547cc7bed4fca935936e3d7a230d4590decdc6672598a15cabfbbb99f6935dd7" Apr 24 21:50:49.871018 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.870948 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547cc7bed4fca935936e3d7a230d4590decdc6672598a15cabfbbb99f6935dd7"} err="failed to get container status 
\"547cc7bed4fca935936e3d7a230d4590decdc6672598a15cabfbbb99f6935dd7\": rpc error: code = NotFound desc = could not find container \"547cc7bed4fca935936e3d7a230d4590decdc6672598a15cabfbbb99f6935dd7\": container with ID starting with 547cc7bed4fca935936e3d7a230d4590decdc6672598a15cabfbbb99f6935dd7 not found: ID does not exist" Apr 24 21:50:49.884215 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.884107 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"] Apr 24 21:50:49.886469 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:49.886430 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4"] Apr 24 21:50:50.212116 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:50.212029 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d311244-c2f3-4350-a0d4-51588e56c67c" path="/var/lib/kubelet/pods/2d311244-c2f3-4350-a0d4-51588e56c67c/volumes" Apr 24 21:50:53.208985 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.208900 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89"] Apr 24 21:50:53.209394 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.209141 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" podUID="7b374bd5-2f71-4887-b183-48690a08cf17" containerName="splitter-graph-4efea" containerID="cri-o://99408c32dd12b1d6230d57c41517cb59d400fb4d9e36d8f32f85ccb7b4f55cba" gracePeriod=30 Apr 24 21:50:53.328245 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.328212 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"] Apr 24 21:50:53.328629 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.328593 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kserve-container" containerID="cri-o://8ef75dd7a1f9c44020f11c43ab4bf8f2a0e91ceea9ea4feafa6dde12c7558ac3" gracePeriod=30 Apr 24 21:50:53.328786 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.328606 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kube-rbac-proxy" containerID="cri-o://b572cb2e58682cc1f45f20d04f5531505667e27e201686b711ed5695816f033c" gracePeriod=30 Apr 24 21:50:53.351236 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.351204 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj"] Apr 24 21:50:53.351606 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.351591 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d311244-c2f3-4350-a0d4-51588e56c67c" containerName="sequence-graph-64806" Apr 24 21:50:53.351675 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.351609 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d311244-c2f3-4350-a0d4-51588e56c67c" containerName="sequence-graph-64806" Apr 24 21:50:53.351718 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.351690 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d311244-c2f3-4350-a0d4-51588e56c67c" containerName="sequence-graph-64806" Apr 24 21:50:53.356561 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.356536 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:50:53.358760 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.358735 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-12404-predictor-serving-cert\"" Apr 24 21:50:53.358887 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.358739 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-12404-kube-rbac-proxy-sar-config\"" Apr 24 21:50:53.363849 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.363815 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj"] Apr 24 21:50:53.422121 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.422089 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5accb5ac-4323-44b4-8f66-aa7380c8d59e-proxy-tls\") pod \"success-200-isvc-12404-predictor-564cf4d979-76fsj\" (UID: \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\") " pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:50:53.422337 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.422168 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-12404-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5accb5ac-4323-44b4-8f66-aa7380c8d59e-success-200-isvc-12404-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-12404-predictor-564cf4d979-76fsj\" (UID: \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\") " pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:50:53.422337 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.422220 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sk746\" (UniqueName: \"kubernetes.io/projected/5accb5ac-4323-44b4-8f66-aa7380c8d59e-kube-api-access-sk746\") pod \"success-200-isvc-12404-predictor-564cf4d979-76fsj\" (UID: \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\") " pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:50:53.523078 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.523037 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5accb5ac-4323-44b4-8f66-aa7380c8d59e-proxy-tls\") pod \"success-200-isvc-12404-predictor-564cf4d979-76fsj\" (UID: \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\") " pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:50:53.523281 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.523102 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-12404-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5accb5ac-4323-44b4-8f66-aa7380c8d59e-success-200-isvc-12404-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-12404-predictor-564cf4d979-76fsj\" (UID: \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\") " pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:50:53.523281 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.523131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sk746\" (UniqueName: \"kubernetes.io/projected/5accb5ac-4323-44b4-8f66-aa7380c8d59e-kube-api-access-sk746\") pod \"success-200-isvc-12404-predictor-564cf4d979-76fsj\" (UID: \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\") " pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:50:53.523281 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:50:53.523182 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-12404-predictor-serving-cert: secret "success-200-isvc-12404-predictor-serving-cert" 
not found Apr 24 21:50:53.523281 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:50:53.523263 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5accb5ac-4323-44b4-8f66-aa7380c8d59e-proxy-tls podName:5accb5ac-4323-44b4-8f66-aa7380c8d59e nodeName:}" failed. No retries permitted until 2026-04-24 21:50:54.023243171 +0000 UTC m=+2076.429084602 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5accb5ac-4323-44b4-8f66-aa7380c8d59e-proxy-tls") pod "success-200-isvc-12404-predictor-564cf4d979-76fsj" (UID: "5accb5ac-4323-44b4-8f66-aa7380c8d59e") : secret "success-200-isvc-12404-predictor-serving-cert" not found Apr 24 21:50:53.523775 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.523753 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-12404-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5accb5ac-4323-44b4-8f66-aa7380c8d59e-success-200-isvc-12404-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-12404-predictor-564cf4d979-76fsj\" (UID: \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\") " pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:50:53.532233 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.532195 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk746\" (UniqueName: \"kubernetes.io/projected/5accb5ac-4323-44b4-8f66-aa7380c8d59e-kube-api-access-sk746\") pod \"success-200-isvc-12404-predictor-564cf4d979-76fsj\" (UID: \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\") " pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:50:53.880379 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.880266 2573 generic.go:358] "Generic (PLEG): container finished" podID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerID="b572cb2e58682cc1f45f20d04f5531505667e27e201686b711ed5695816f033c" exitCode=2 Apr 24 21:50:53.880379 
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:53.880325 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" event={"ID":"b77acea4-fae8-402c-b34d-8ff4efaa4d78","Type":"ContainerDied","Data":"b572cb2e58682cc1f45f20d04f5531505667e27e201686b711ed5695816f033c"} Apr 24 21:50:54.028742 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:54.028707 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5accb5ac-4323-44b4-8f66-aa7380c8d59e-proxy-tls\") pod \"success-200-isvc-12404-predictor-564cf4d979-76fsj\" (UID: \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\") " pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:50:54.031206 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:54.031171 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5accb5ac-4323-44b4-8f66-aa7380c8d59e-proxy-tls\") pod \"success-200-isvc-12404-predictor-564cf4d979-76fsj\" (UID: \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\") " pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:50:54.268768 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:54.268727 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:50:54.401039 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:54.400964 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj"] Apr 24 21:50:54.403970 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:50:54.403925 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5accb5ac_4323_44b4_8f66_aa7380c8d59e.slice/crio-ed56689989d0467ad9f296feb0d70e175b3e70bdaf63a96733e543d458ac9861 WatchSource:0}: Error finding container ed56689989d0467ad9f296feb0d70e175b3e70bdaf63a96733e543d458ac9861: Status 404 returned error can't find the container with id ed56689989d0467ad9f296feb0d70e175b3e70bdaf63a96733e543d458ac9861 Apr 24 21:50:54.848008 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:54.847913 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" podUID="7b374bd5-2f71-4887-b183-48690a08cf17" containerName="splitter-graph-4efea" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:50:54.885582 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:54.885543 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" event={"ID":"5accb5ac-4323-44b4-8f66-aa7380c8d59e","Type":"ContainerStarted","Data":"27b8558182aeaf7177f34cc147d2869bf05b27a30098257e38052001e6040f48"} Apr 24 21:50:54.885582 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:54.885588 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" event={"ID":"5accb5ac-4323-44b4-8f66-aa7380c8d59e","Type":"ContainerStarted","Data":"f39b8278046b98684a5d8378ce73992a65cdb4a5fa2edc3297d76a31fa96d93e"} Apr 24 21:50:54.885783 ip-10-0-128-21 
kubenswrapper[2573]: I0424 21:50:54.885602 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" event={"ID":"5accb5ac-4323-44b4-8f66-aa7380c8d59e","Type":"ContainerStarted","Data":"ed56689989d0467ad9f296feb0d70e175b3e70bdaf63a96733e543d458ac9861"} Apr 24 21:50:54.885783 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:54.885656 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:50:54.902193 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:54.902142 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" podStartSLOduration=1.9021249679999999 podStartE2EDuration="1.902124968s" podCreationTimestamp="2026-04-24 21:50:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:50:54.901047344 +0000 UTC m=+2077.306888831" watchObservedRunningTime="2026-04-24 21:50:54.902124968 +0000 UTC m=+2077.307966421" Apr 24 21:50:55.889349 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:55.889314 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:50:55.890993 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:55.890951 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 21:50:56.645340 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.645295 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.43:8643/healthz\": dial tcp 10.132.0.43:8643: connect: connection refused" Apr 24 21:50:56.892165 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.892127 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" Apr 24 21:50:56.894473 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.894445 2573 generic.go:358] "Generic (PLEG): container finished" podID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerID="8ef75dd7a1f9c44020f11c43ab4bf8f2a0e91ceea9ea4feafa6dde12c7558ac3" exitCode=0 Apr 24 21:50:56.894624 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.894559 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" Apr 24 21:50:56.894624 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.894571 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" event={"ID":"b77acea4-fae8-402c-b34d-8ff4efaa4d78","Type":"ContainerDied","Data":"8ef75dd7a1f9c44020f11c43ab4bf8f2a0e91ceea9ea4feafa6dde12c7558ac3"} Apr 24 21:50:56.894624 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.894617 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f" event={"ID":"b77acea4-fae8-402c-b34d-8ff4efaa4d78","Type":"ContainerDied","Data":"3125ae6e9754b0eed166447eb50f6419edf90efe9b35405bde1b8581f9811258"} Apr 24 21:50:56.894809 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.894637 2573 scope.go:117] "RemoveContainer" containerID="b572cb2e58682cc1f45f20d04f5531505667e27e201686b711ed5695816f033c" Apr 24 21:50:56.895261 ip-10-0-128-21 
kubenswrapper[2573]: I0424 21:50:56.895231 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 21:50:56.903056 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.903035 2573 scope.go:117] "RemoveContainer" containerID="8ef75dd7a1f9c44020f11c43ab4bf8f2a0e91ceea9ea4feafa6dde12c7558ac3" Apr 24 21:50:56.911840 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.911818 2573 scope.go:117] "RemoveContainer" containerID="b572cb2e58682cc1f45f20d04f5531505667e27e201686b711ed5695816f033c" Apr 24 21:50:56.912206 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:50:56.912178 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b572cb2e58682cc1f45f20d04f5531505667e27e201686b711ed5695816f033c\": container with ID starting with b572cb2e58682cc1f45f20d04f5531505667e27e201686b711ed5695816f033c not found: ID does not exist" containerID="b572cb2e58682cc1f45f20d04f5531505667e27e201686b711ed5695816f033c" Apr 24 21:50:56.912306 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.912219 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b572cb2e58682cc1f45f20d04f5531505667e27e201686b711ed5695816f033c"} err="failed to get container status \"b572cb2e58682cc1f45f20d04f5531505667e27e201686b711ed5695816f033c\": rpc error: code = NotFound desc = could not find container \"b572cb2e58682cc1f45f20d04f5531505667e27e201686b711ed5695816f033c\": container with ID starting with b572cb2e58682cc1f45f20d04f5531505667e27e201686b711ed5695816f033c not found: ID does not exist" Apr 24 21:50:56.912306 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.912243 2573 scope.go:117] "RemoveContainer" 
containerID="8ef75dd7a1f9c44020f11c43ab4bf8f2a0e91ceea9ea4feafa6dde12c7558ac3" Apr 24 21:50:56.912638 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:50:56.912614 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef75dd7a1f9c44020f11c43ab4bf8f2a0e91ceea9ea4feafa6dde12c7558ac3\": container with ID starting with 8ef75dd7a1f9c44020f11c43ab4bf8f2a0e91ceea9ea4feafa6dde12c7558ac3 not found: ID does not exist" containerID="8ef75dd7a1f9c44020f11c43ab4bf8f2a0e91ceea9ea4feafa6dde12c7558ac3" Apr 24 21:50:56.912757 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.912642 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef75dd7a1f9c44020f11c43ab4bf8f2a0e91ceea9ea4feafa6dde12c7558ac3"} err="failed to get container status \"8ef75dd7a1f9c44020f11c43ab4bf8f2a0e91ceea9ea4feafa6dde12c7558ac3\": rpc error: code = NotFound desc = could not find container \"8ef75dd7a1f9c44020f11c43ab4bf8f2a0e91ceea9ea4feafa6dde12c7558ac3\": container with ID starting with 8ef75dd7a1f9c44020f11c43ab4bf8f2a0e91ceea9ea4feafa6dde12c7558ac3 not found: ID does not exist" Apr 24 21:50:56.952096 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.952058 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pnvs\" (UniqueName: \"kubernetes.io/projected/b77acea4-fae8-402c-b34d-8ff4efaa4d78-kube-api-access-5pnvs\") pod \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\" (UID: \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\") " Apr 24 21:50:56.952096 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.952100 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-4efea-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b77acea4-fae8-402c-b34d-8ff4efaa4d78-success-200-isvc-4efea-kube-rbac-proxy-sar-config\") pod \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\" (UID: \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\") 
" Apr 24 21:50:56.952375 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.952128 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b77acea4-fae8-402c-b34d-8ff4efaa4d78-proxy-tls\") pod \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\" (UID: \"b77acea4-fae8-402c-b34d-8ff4efaa4d78\") " Apr 24 21:50:56.952628 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.952602 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b77acea4-fae8-402c-b34d-8ff4efaa4d78-success-200-isvc-4efea-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-4efea-kube-rbac-proxy-sar-config") pod "b77acea4-fae8-402c-b34d-8ff4efaa4d78" (UID: "b77acea4-fae8-402c-b34d-8ff4efaa4d78"). InnerVolumeSpecName "success-200-isvc-4efea-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:50:56.954406 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.954383 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b77acea4-fae8-402c-b34d-8ff4efaa4d78-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b77acea4-fae8-402c-b34d-8ff4efaa4d78" (UID: "b77acea4-fae8-402c-b34d-8ff4efaa4d78"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:56.954406 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:56.954395 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b77acea4-fae8-402c-b34d-8ff4efaa4d78-kube-api-access-5pnvs" (OuterVolumeSpecName: "kube-api-access-5pnvs") pod "b77acea4-fae8-402c-b34d-8ff4efaa4d78" (UID: "b77acea4-fae8-402c-b34d-8ff4efaa4d78"). InnerVolumeSpecName "kube-api-access-5pnvs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:50:57.053326 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:57.053265 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5pnvs\" (UniqueName: \"kubernetes.io/projected/b77acea4-fae8-402c-b34d-8ff4efaa4d78-kube-api-access-5pnvs\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:50:57.053326 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:57.053321 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-4efea-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b77acea4-fae8-402c-b34d-8ff4efaa4d78-success-200-isvc-4efea-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:50:57.053326 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:57.053333 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b77acea4-fae8-402c-b34d-8ff4efaa4d78-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 21:50:57.216749 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:57.216716 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"] Apr 24 21:50:57.219866 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:57.219831 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f"] Apr 24 21:50:57.776785 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:57.776743 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 24 21:50:58.211206 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:58.211107 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" path="/var/lib/kubelet/pods/b77acea4-fae8-402c-b34d-8ff4efaa4d78/volumes" Apr 24 21:50:59.848613 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:50:59.848566 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" podUID="7b374bd5-2f71-4887-b183-48690a08cf17" containerName="splitter-graph-4efea" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:51:01.899756 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:01.899724 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" Apr 24 21:51:01.900327 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:01.900297 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 21:51:04.847327 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:04.847285 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" podUID="7b374bd5-2f71-4887-b183-48690a08cf17" containerName="splitter-graph-4efea" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:51:04.847716 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:04.847425 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" Apr 24 21:51:07.778139 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:07.778104 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 21:51:09.847079 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:09.847043 2573 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" podUID="7b374bd5-2f71-4887-b183-48690a08cf17" containerName="splitter-graph-4efea" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:51:11.900511 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:11.900468 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 21:51:14.847748 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:14.847709 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" podUID="7b374bd5-2f71-4887-b183-48690a08cf17" containerName="splitter-graph-4efea" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:51:18.218921 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:18.218889 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log" Apr 24 21:51:18.220958 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:18.220933 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log" Apr 24 21:51:19.066289 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.066255 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc"] Apr 24 21:51:19.066649 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.066635 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kube-rbac-proxy" Apr 24 21:51:19.066693 
ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.066652 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kube-rbac-proxy" Apr 24 21:51:19.066693 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.066667 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kserve-container" Apr 24 21:51:19.066693 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.066673 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kserve-container" Apr 24 21:51:19.066796 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.066731 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kube-rbac-proxy" Apr 24 21:51:19.066796 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.066740 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b77acea4-fae8-402c-b34d-8ff4efaa4d78" containerName="kserve-container" Apr 24 21:51:19.071080 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.071055 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" Apr 24 21:51:19.073566 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.073540 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-98aac-serving-cert\"" Apr 24 21:51:19.073679 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.073585 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-98aac-kube-rbac-proxy-sar-config\"" Apr 24 21:51:19.079074 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.079049 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc"] Apr 24 21:51:19.145890 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.145854 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1535524-2257-4d12-a251-848068a29cc7-openshift-service-ca-bundle\") pod \"switch-graph-98aac-579b965bb4-xcntc\" (UID: \"d1535524-2257-4d12-a251-848068a29cc7\") " pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" Apr 24 21:51:19.146079 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.145919 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1535524-2257-4d12-a251-848068a29cc7-proxy-tls\") pod \"switch-graph-98aac-579b965bb4-xcntc\" (UID: \"d1535524-2257-4d12-a251-848068a29cc7\") " pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" Apr 24 21:51:19.246838 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.246801 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1535524-2257-4d12-a251-848068a29cc7-openshift-service-ca-bundle\") pod \"switch-graph-98aac-579b965bb4-xcntc\" 
(UID: \"d1535524-2257-4d12-a251-848068a29cc7\") " pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" Apr 24 21:51:19.247216 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.246863 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1535524-2257-4d12-a251-848068a29cc7-proxy-tls\") pod \"switch-graph-98aac-579b965bb4-xcntc\" (UID: \"d1535524-2257-4d12-a251-848068a29cc7\") " pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" Apr 24 21:51:19.247216 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:51:19.246959 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-98aac-serving-cert: secret "switch-graph-98aac-serving-cert" not found Apr 24 21:51:19.247216 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:51:19.247025 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1535524-2257-4d12-a251-848068a29cc7-proxy-tls podName:d1535524-2257-4d12-a251-848068a29cc7 nodeName:}" failed. No retries permitted until 2026-04-24 21:51:19.747008267 +0000 UTC m=+2102.152849697 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d1535524-2257-4d12-a251-848068a29cc7-proxy-tls") pod "switch-graph-98aac-579b965bb4-xcntc" (UID: "d1535524-2257-4d12-a251-848068a29cc7") : secret "switch-graph-98aac-serving-cert" not found
Apr 24 21:51:19.247507 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.247488 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1535524-2257-4d12-a251-848068a29cc7-openshift-service-ca-bundle\") pod \"switch-graph-98aac-579b965bb4-xcntc\" (UID: \"d1535524-2257-4d12-a251-848068a29cc7\") " pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc"
Apr 24 21:51:19.751930 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.751885 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1535524-2257-4d12-a251-848068a29cc7-proxy-tls\") pod \"switch-graph-98aac-579b965bb4-xcntc\" (UID: \"d1535524-2257-4d12-a251-848068a29cc7\") " pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc"
Apr 24 21:51:19.754519 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.754482 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1535524-2257-4d12-a251-848068a29cc7-proxy-tls\") pod \"switch-graph-98aac-579b965bb4-xcntc\" (UID: \"d1535524-2257-4d12-a251-848068a29cc7\") " pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc"
Apr 24 21:51:19.848228 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.848186 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" podUID="7b374bd5-2f71-4887-b183-48690a08cf17" containerName="splitter-graph-4efea" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:51:19.982653 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:19.982620 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc"
Apr 24 21:51:20.108376 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:20.108331 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc"]
Apr 24 21:51:20.978343 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:20.978307 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" event={"ID":"d1535524-2257-4d12-a251-848068a29cc7","Type":"ContainerStarted","Data":"0a2818a8c1374720a3239f83c1c823460f64c271ccbc9f0ab3cfda14b6cce5be"}
Apr 24 21:51:20.978343 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:20.978344 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" event={"ID":"d1535524-2257-4d12-a251-848068a29cc7","Type":"ContainerStarted","Data":"5075d8c5eabcfbeba59fc01b5eb6a2c34afc43729be0e91eed623a518cace08b"}
Apr 24 21:51:20.978845 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:20.978453 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc"
Apr 24 21:51:20.995705 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:20.995649 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" podStartSLOduration=1.9956343890000001 podStartE2EDuration="1.995634389s" podCreationTimestamp="2026-04-24 21:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:51:20.993484753 +0000 UTC m=+2103.399326205" watchObservedRunningTime="2026-04-24 21:51:20.995634389 +0000 UTC m=+2103.401475841"
Apr 24 21:51:21.900222 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:21.900182 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 21:51:23.252817 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:51:23.252777 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b374bd5_2f71_4887_b183_48690a08cf17.slice/crio-99408c32dd12b1d6230d57c41517cb59d400fb4d9e36d8f32f85ccb7b4f55cba.scope\": RecentStats: unable to find data in memory cache]"
Apr 24 21:51:23.359459 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:23.359434 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89"
Apr 24 21:51:23.484363 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:23.484328 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b374bd5-2f71-4887-b183-48690a08cf17-proxy-tls\") pod \"7b374bd5-2f71-4887-b183-48690a08cf17\" (UID: \"7b374bd5-2f71-4887-b183-48690a08cf17\") "
Apr 24 21:51:23.484537 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:23.484460 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b374bd5-2f71-4887-b183-48690a08cf17-openshift-service-ca-bundle\") pod \"7b374bd5-2f71-4887-b183-48690a08cf17\" (UID: \"7b374bd5-2f71-4887-b183-48690a08cf17\") "
Apr 24 21:51:23.484839 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:23.484808 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b374bd5-2f71-4887-b183-48690a08cf17-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "7b374bd5-2f71-4887-b183-48690a08cf17" (UID: "7b374bd5-2f71-4887-b183-48690a08cf17"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:51:23.486557 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:23.486528 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b374bd5-2f71-4887-b183-48690a08cf17-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7b374bd5-2f71-4887-b183-48690a08cf17" (UID: "7b374bd5-2f71-4887-b183-48690a08cf17"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:51:23.585829 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:23.585775 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b374bd5-2f71-4887-b183-48690a08cf17-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:51:23.585829 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:23.585821 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b374bd5-2f71-4887-b183-48690a08cf17-openshift-service-ca-bundle\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 21:51:23.990229 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:23.990190 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b374bd5-2f71-4887-b183-48690a08cf17" containerID="99408c32dd12b1d6230d57c41517cb59d400fb4d9e36d8f32f85ccb7b4f55cba" exitCode=0
Apr 24 21:51:23.990436 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:23.990274 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89"
Apr 24 21:51:23.990436 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:23.990273 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" event={"ID":"7b374bd5-2f71-4887-b183-48690a08cf17","Type":"ContainerDied","Data":"99408c32dd12b1d6230d57c41517cb59d400fb4d9e36d8f32f85ccb7b4f55cba"}
Apr 24 21:51:23.990436 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:23.990383 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89" event={"ID":"7b374bd5-2f71-4887-b183-48690a08cf17","Type":"ContainerDied","Data":"138ab2f988280bc0c6b76c1bcb2bf1c05117df9719e46ad39c04fd1e2b643c89"}
Apr 24 21:51:23.990436 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:23.990401 2573 scope.go:117] "RemoveContainer" containerID="99408c32dd12b1d6230d57c41517cb59d400fb4d9e36d8f32f85ccb7b4f55cba"
Apr 24 21:51:23.998511 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:23.998489 2573 scope.go:117] "RemoveContainer" containerID="99408c32dd12b1d6230d57c41517cb59d400fb4d9e36d8f32f85ccb7b4f55cba"
Apr 24 21:51:23.998859 ip-10-0-128-21 kubenswrapper[2573]: E0424 21:51:23.998839 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99408c32dd12b1d6230d57c41517cb59d400fb4d9e36d8f32f85ccb7b4f55cba\": container with ID starting with 99408c32dd12b1d6230d57c41517cb59d400fb4d9e36d8f32f85ccb7b4f55cba not found: ID does not exist" containerID="99408c32dd12b1d6230d57c41517cb59d400fb4d9e36d8f32f85ccb7b4f55cba"
Apr 24 21:51:23.998909 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:23.998870 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99408c32dd12b1d6230d57c41517cb59d400fb4d9e36d8f32f85ccb7b4f55cba"} err="failed to get container status \"99408c32dd12b1d6230d57c41517cb59d400fb4d9e36d8f32f85ccb7b4f55cba\": rpc error: code = NotFound desc = could not find container \"99408c32dd12b1d6230d57c41517cb59d400fb4d9e36d8f32f85ccb7b4f55cba\": container with ID starting with 99408c32dd12b1d6230d57c41517cb59d400fb4d9e36d8f32f85ccb7b4f55cba not found: ID does not exist"
Apr 24 21:51:24.012717 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:24.012679 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89"]
Apr 24 21:51:24.014438 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:24.014412 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89"]
Apr 24 21:51:24.211868 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:24.211832 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b374bd5-2f71-4887-b183-48690a08cf17" path="/var/lib/kubelet/pods/7b374bd5-2f71-4887-b183-48690a08cf17/volumes"
Apr 24 21:51:26.988661 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:26.988633 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc"
Apr 24 21:51:31.900428 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:31.900388 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 21:51:41.901519 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:41.901487 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj"
Apr 24 21:51:53.416890 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.416856 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"]
Apr 24 21:51:53.417275 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.417208 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b374bd5-2f71-4887-b183-48690a08cf17" containerName="splitter-graph-4efea"
Apr 24 21:51:53.417275 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.417218 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b374bd5-2f71-4887-b183-48690a08cf17" containerName="splitter-graph-4efea"
Apr 24 21:51:53.417275 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.417273 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b374bd5-2f71-4887-b183-48690a08cf17" containerName="splitter-graph-4efea"
Apr 24 21:51:53.420076 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.420053 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"
Apr 24 21:51:53.422297 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.422279 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-12404-serving-cert\""
Apr 24 21:51:53.422416 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.422281 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-12404-kube-rbac-proxy-sar-config\""
Apr 24 21:51:53.427777 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.427744 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"]
Apr 24 21:51:53.521161 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.521132 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58718984-9436-4079-8b1d-dfe5d1aab879-openshift-service-ca-bundle\") pod \"splitter-graph-12404-554567d986-qpd8n\" (UID: \"58718984-9436-4079-8b1d-dfe5d1aab879\") " pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"
Apr 24 21:51:53.521324 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.521183 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58718984-9436-4079-8b1d-dfe5d1aab879-proxy-tls\") pod \"splitter-graph-12404-554567d986-qpd8n\" (UID: \"58718984-9436-4079-8b1d-dfe5d1aab879\") " pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"
Apr 24 21:51:53.621955 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.621918 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58718984-9436-4079-8b1d-dfe5d1aab879-openshift-service-ca-bundle\") pod \"splitter-graph-12404-554567d986-qpd8n\" (UID: \"58718984-9436-4079-8b1d-dfe5d1aab879\") " pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"
Apr 24 21:51:53.622136 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.621986 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58718984-9436-4079-8b1d-dfe5d1aab879-proxy-tls\") pod \"splitter-graph-12404-554567d986-qpd8n\" (UID: \"58718984-9436-4079-8b1d-dfe5d1aab879\") " pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"
Apr 24 21:51:53.622731 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.622713 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58718984-9436-4079-8b1d-dfe5d1aab879-openshift-service-ca-bundle\") pod \"splitter-graph-12404-554567d986-qpd8n\" (UID: \"58718984-9436-4079-8b1d-dfe5d1aab879\") " pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"
Apr 24 21:51:53.624481 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.624461 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58718984-9436-4079-8b1d-dfe5d1aab879-proxy-tls\") pod \"splitter-graph-12404-554567d986-qpd8n\" (UID: \"58718984-9436-4079-8b1d-dfe5d1aab879\") " pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"
Apr 24 21:51:53.731921 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.731877 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"
Apr 24 21:51:53.854929 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:53.854899 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"]
Apr 24 21:51:53.857596 ip-10-0-128-21 kubenswrapper[2573]: W0424 21:51:53.857568 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58718984_9436_4079_8b1d_dfe5d1aab879.slice/crio-22e4bb286cc6d7a7d00616e810d42e4f3775d1b8343752609d5579e5e82387a0 WatchSource:0}: Error finding container 22e4bb286cc6d7a7d00616e810d42e4f3775d1b8343752609d5579e5e82387a0: Status 404 returned error can't find the container with id 22e4bb286cc6d7a7d00616e810d42e4f3775d1b8343752609d5579e5e82387a0
Apr 24 21:51:54.091752 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:54.091653 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n" event={"ID":"58718984-9436-4079-8b1d-dfe5d1aab879","Type":"ContainerStarted","Data":"59810fb959438696d5cd88d052b70c2157074d32b68443ed399a512b489474d8"}
Apr 24 21:51:54.091752 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:54.091691 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n" event={"ID":"58718984-9436-4079-8b1d-dfe5d1aab879","Type":"ContainerStarted","Data":"22e4bb286cc6d7a7d00616e810d42e4f3775d1b8343752609d5579e5e82387a0"}
Apr 24 21:51:54.091990 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:54.091787 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"
Apr 24 21:51:54.108450 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:51:54.108394 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n" podStartSLOduration=1.108374636 podStartE2EDuration="1.108374636s" podCreationTimestamp="2026-04-24 21:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:51:54.106579412 +0000 UTC m=+2136.512420881" watchObservedRunningTime="2026-04-24 21:51:54.108374636 +0000 UTC m=+2136.514216088"
Apr 24 21:52:00.100917 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:52:00.100882 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"
Apr 24 21:56:18.240505 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:56:18.240474 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 21:56:18.242461 ip-10-0-128-21 kubenswrapper[2573]: I0424 21:56:18.242431 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 22:00:08.080472 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:08.080380 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"]
Apr 24 22:00:08.080979 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:08.080686 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n" podUID="58718984-9436-4079-8b1d-dfe5d1aab879" containerName="splitter-graph-12404" containerID="cri-o://59810fb959438696d5cd88d052b70c2157074d32b68443ed399a512b489474d8" gracePeriod=30
Apr 24 22:00:08.217958 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:08.217925 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj"]
Apr 24 22:00:08.218254 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:08.218232 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kserve-container" containerID="cri-o://f39b8278046b98684a5d8378ce73992a65cdb4a5fa2edc3297d76a31fa96d93e" gracePeriod=30
Apr 24 22:00:08.218316 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:08.218244 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kube-rbac-proxy" containerID="cri-o://27b8558182aeaf7177f34cc147d2869bf05b27a30098257e38052001e6040f48" gracePeriod=30
Apr 24 22:00:08.706193 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:08.706095 2573 generic.go:358] "Generic (PLEG): container finished" podID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerID="27b8558182aeaf7177f34cc147d2869bf05b27a30098257e38052001e6040f48" exitCode=2
Apr 24 22:00:08.706193 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:08.706162 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" event={"ID":"5accb5ac-4323-44b4-8f66-aa7380c8d59e","Type":"ContainerDied","Data":"27b8558182aeaf7177f34cc147d2869bf05b27a30098257e38052001e6040f48"}
Apr 24 22:00:10.099098 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:10.099058 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n" podUID="58718984-9436-4079-8b1d-dfe5d1aab879" containerName="splitter-graph-12404" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:00:11.896173 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:11.896131 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused"
Apr 24 22:00:11.900626 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:11.900598 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 22:00:15.099205 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.099166 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n" podUID="58718984-9436-4079-8b1d-dfe5d1aab879" containerName="splitter-graph-12404" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:00:15.171494 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.171469 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj"
Apr 24 22:00:15.233017 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.232979 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk746\" (UniqueName: \"kubernetes.io/projected/5accb5ac-4323-44b4-8f66-aa7380c8d59e-kube-api-access-sk746\") pod \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\" (UID: \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\") "
Apr 24 22:00:15.233228 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.233032 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5accb5ac-4323-44b4-8f66-aa7380c8d59e-proxy-tls\") pod \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\" (UID: \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\") "
Apr 24 22:00:15.233228 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.233078 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-12404-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5accb5ac-4323-44b4-8f66-aa7380c8d59e-success-200-isvc-12404-kube-rbac-proxy-sar-config\") pod \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\" (UID: \"5accb5ac-4323-44b4-8f66-aa7380c8d59e\") "
Apr 24 22:00:15.233531 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.233498 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5accb5ac-4323-44b4-8f66-aa7380c8d59e-success-200-isvc-12404-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-12404-kube-rbac-proxy-sar-config") pod "5accb5ac-4323-44b4-8f66-aa7380c8d59e" (UID: "5accb5ac-4323-44b4-8f66-aa7380c8d59e"). InnerVolumeSpecName "success-200-isvc-12404-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:00:15.235157 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.235134 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5accb5ac-4323-44b4-8f66-aa7380c8d59e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5accb5ac-4323-44b4-8f66-aa7380c8d59e" (UID: "5accb5ac-4323-44b4-8f66-aa7380c8d59e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:00:15.235242 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.235222 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5accb5ac-4323-44b4-8f66-aa7380c8d59e-kube-api-access-sk746" (OuterVolumeSpecName: "kube-api-access-sk746") pod "5accb5ac-4323-44b4-8f66-aa7380c8d59e" (UID: "5accb5ac-4323-44b4-8f66-aa7380c8d59e"). InnerVolumeSpecName "kube-api-access-sk746". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:00:15.333769 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.333673 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sk746\" (UniqueName: \"kubernetes.io/projected/5accb5ac-4323-44b4-8f66-aa7380c8d59e-kube-api-access-sk746\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 22:00:15.333769 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.333710 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5accb5ac-4323-44b4-8f66-aa7380c8d59e-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 22:00:15.333769 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.333728 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-12404-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5accb5ac-4323-44b4-8f66-aa7380c8d59e-success-200-isvc-12404-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 22:00:15.730605 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.730569 2573 generic.go:358] "Generic (PLEG): container finished" podID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerID="f39b8278046b98684a5d8378ce73992a65cdb4a5fa2edc3297d76a31fa96d93e" exitCode=0
Apr 24 22:00:15.730811 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.730654 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj"
Apr 24 22:00:15.730811 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.730656 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" event={"ID":"5accb5ac-4323-44b4-8f66-aa7380c8d59e","Type":"ContainerDied","Data":"f39b8278046b98684a5d8378ce73992a65cdb4a5fa2edc3297d76a31fa96d93e"}
Apr 24 22:00:15.730811 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.730694 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj" event={"ID":"5accb5ac-4323-44b4-8f66-aa7380c8d59e","Type":"ContainerDied","Data":"ed56689989d0467ad9f296feb0d70e175b3e70bdaf63a96733e543d458ac9861"}
Apr 24 22:00:15.730811 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.730713 2573 scope.go:117] "RemoveContainer" containerID="27b8558182aeaf7177f34cc147d2869bf05b27a30098257e38052001e6040f48"
Apr 24 22:00:15.739303 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.739280 2573 scope.go:117] "RemoveContainer" containerID="f39b8278046b98684a5d8378ce73992a65cdb4a5fa2edc3297d76a31fa96d93e"
Apr 24 22:00:15.746960 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.746940 2573 scope.go:117] "RemoveContainer" containerID="27b8558182aeaf7177f34cc147d2869bf05b27a30098257e38052001e6040f48"
Apr 24 22:00:15.747239 ip-10-0-128-21 kubenswrapper[2573]: E0424 22:00:15.747218 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27b8558182aeaf7177f34cc147d2869bf05b27a30098257e38052001e6040f48\": container with ID starting with 27b8558182aeaf7177f34cc147d2869bf05b27a30098257e38052001e6040f48 not found: ID does not exist" containerID="27b8558182aeaf7177f34cc147d2869bf05b27a30098257e38052001e6040f48"
Apr 24 22:00:15.747282 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.747250 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b8558182aeaf7177f34cc147d2869bf05b27a30098257e38052001e6040f48"} err="failed to get container status \"27b8558182aeaf7177f34cc147d2869bf05b27a30098257e38052001e6040f48\": rpc error: code = NotFound desc = could not find container \"27b8558182aeaf7177f34cc147d2869bf05b27a30098257e38052001e6040f48\": container with ID starting with 27b8558182aeaf7177f34cc147d2869bf05b27a30098257e38052001e6040f48 not found: ID does not exist"
Apr 24 22:00:15.747282 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.747269 2573 scope.go:117] "RemoveContainer" containerID="f39b8278046b98684a5d8378ce73992a65cdb4a5fa2edc3297d76a31fa96d93e"
Apr 24 22:00:15.747622 ip-10-0-128-21 kubenswrapper[2573]: E0424 22:00:15.747603 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f39b8278046b98684a5d8378ce73992a65cdb4a5fa2edc3297d76a31fa96d93e\": container with ID starting with f39b8278046b98684a5d8378ce73992a65cdb4a5fa2edc3297d76a31fa96d93e not found: ID does not exist" containerID="f39b8278046b98684a5d8378ce73992a65cdb4a5fa2edc3297d76a31fa96d93e"
Apr 24 22:00:15.747702 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.747624 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f39b8278046b98684a5d8378ce73992a65cdb4a5fa2edc3297d76a31fa96d93e"} err="failed to get container status \"f39b8278046b98684a5d8378ce73992a65cdb4a5fa2edc3297d76a31fa96d93e\": rpc error: code = NotFound desc = could not find container \"f39b8278046b98684a5d8378ce73992a65cdb4a5fa2edc3297d76a31fa96d93e\": container with ID starting with f39b8278046b98684a5d8378ce73992a65cdb4a5fa2edc3297d76a31fa96d93e not found: ID does not exist"
Apr 24 22:00:15.756279 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.756245 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj"]
Apr 24 22:00:15.764610 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:15.764582 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj"]
Apr 24 22:00:16.210916 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:16.210837 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" path="/var/lib/kubelet/pods/5accb5ac-4323-44b4-8f66-aa7380c8d59e/volumes"
Apr 24 22:00:20.098964 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:20.098924 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n" podUID="58718984-9436-4079-8b1d-dfe5d1aab879" containerName="splitter-graph-12404" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:00:20.099341 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:20.099040 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"
Apr 24 22:00:25.098973 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:25.098929 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n" podUID="58718984-9436-4079-8b1d-dfe5d1aab879" containerName="splitter-graph-12404" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:00:30.098801 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:30.098762 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n" podUID="58718984-9436-4079-8b1d-dfe5d1aab879" containerName="splitter-graph-12404" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:00:35.099178 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:35.099138 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n" podUID="58718984-9436-4079-8b1d-dfe5d1aab879" containerName="splitter-graph-12404" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:00:38.232073 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.232049 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"
Apr 24 22:00:38.296471 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.296436 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58718984-9436-4079-8b1d-dfe5d1aab879-proxy-tls\") pod \"58718984-9436-4079-8b1d-dfe5d1aab879\" (UID: \"58718984-9436-4079-8b1d-dfe5d1aab879\") "
Apr 24 22:00:38.296649 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.296507 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58718984-9436-4079-8b1d-dfe5d1aab879-openshift-service-ca-bundle\") pod \"58718984-9436-4079-8b1d-dfe5d1aab879\" (UID: \"58718984-9436-4079-8b1d-dfe5d1aab879\") "
Apr 24 22:00:38.296928 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.296903 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58718984-9436-4079-8b1d-dfe5d1aab879-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "58718984-9436-4079-8b1d-dfe5d1aab879" (UID: "58718984-9436-4079-8b1d-dfe5d1aab879"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:00:38.298561 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.298533 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58718984-9436-4079-8b1d-dfe5d1aab879-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "58718984-9436-4079-8b1d-dfe5d1aab879" (UID: "58718984-9436-4079-8b1d-dfe5d1aab879"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:00:38.397077 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.396979 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58718984-9436-4079-8b1d-dfe5d1aab879-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 22:00:38.397077 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.397011 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58718984-9436-4079-8b1d-dfe5d1aab879-openshift-service-ca-bundle\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\""
Apr 24 22:00:38.805810 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.805768 2573 generic.go:358] "Generic (PLEG): container finished" podID="58718984-9436-4079-8b1d-dfe5d1aab879" containerID="59810fb959438696d5cd88d052b70c2157074d32b68443ed399a512b489474d8" exitCode=0
Apr 24 22:00:38.805963 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.805862 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"
Apr 24 22:00:38.805963 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.805852 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n" event={"ID":"58718984-9436-4079-8b1d-dfe5d1aab879","Type":"ContainerDied","Data":"59810fb959438696d5cd88d052b70c2157074d32b68443ed399a512b489474d8"}
Apr 24 22:00:38.806051 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.805965 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n" event={"ID":"58718984-9436-4079-8b1d-dfe5d1aab879","Type":"ContainerDied","Data":"22e4bb286cc6d7a7d00616e810d42e4f3775d1b8343752609d5579e5e82387a0"}
Apr 24 22:00:38.806051 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.805982 2573 scope.go:117] "RemoveContainer" containerID="59810fb959438696d5cd88d052b70c2157074d32b68443ed399a512b489474d8"
Apr 24 22:00:38.813959 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.813935 2573 scope.go:117] "RemoveContainer" containerID="59810fb959438696d5cd88d052b70c2157074d32b68443ed399a512b489474d8"
Apr 24 22:00:38.814253 ip-10-0-128-21 kubenswrapper[2573]: E0424 22:00:38.814225 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59810fb959438696d5cd88d052b70c2157074d32b68443ed399a512b489474d8\": container with ID starting with 59810fb959438696d5cd88d052b70c2157074d32b68443ed399a512b489474d8 not found: ID does not exist" containerID="59810fb959438696d5cd88d052b70c2157074d32b68443ed399a512b489474d8"
Apr 24 22:00:38.814332 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.814267 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59810fb959438696d5cd88d052b70c2157074d32b68443ed399a512b489474d8"} err="failed to get container status \"59810fb959438696d5cd88d052b70c2157074d32b68443ed399a512b489474d8\": rpc error: code = NotFound desc = could not find container \"59810fb959438696d5cd88d052b70c2157074d32b68443ed399a512b489474d8\": container with ID starting with 59810fb959438696d5cd88d052b70c2157074d32b68443ed399a512b489474d8 not found: ID does not exist"
Apr 24 22:00:38.827774 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.827742 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"]
Apr 24 22:00:38.831512 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:38.831486 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n"]
Apr 24 22:00:40.210911 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:00:40.210876 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58718984-9436-4079-8b1d-dfe5d1aab879" path="/var/lib/kubelet/pods/58718984-9436-4079-8b1d-dfe5d1aab879/volumes"
Apr 24 22:01:18.260303 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:01:18.260266 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 22:01:18.262845 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:01:18.262811 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 22:06:18.280716 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:06:18.280686 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log"
Apr 24 22:06:18.285173 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:06:18.285147 2573 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log" Apr 24 22:07:38.354516 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:38.354460 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc"] Apr 24 22:07:38.355127 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:38.354762 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" podUID="d1535524-2257-4d12-a251-848068a29cc7" containerName="switch-graph-98aac" containerID="cri-o://0a2818a8c1374720a3239f83c1c823460f64c271ccbc9f0ab3cfda14b6cce5be" gracePeriod=30 Apr 24 22:07:38.474933 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:38.474894 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl"] Apr 24 22:07:38.475214 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:38.475185 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerName="kserve-container" containerID="cri-o://1177b8c1589a4a6692623d1f5571d08b9b03fdc09dc3fa97f64ea0b25d502ba6" gracePeriod=30 Apr 24 22:07:38.475329 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:38.475210 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerName="kube-rbac-proxy" containerID="cri-o://f10f286f38ff6a404ea7c003b983b0269313f9f46411364ff17fcd1c2f8f9f77" gracePeriod=30 Apr 24 22:07:39.033989 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:39.033952 2573 generic.go:358] "Generic (PLEG): container finished" podID="88b92159-1eab-4488-b61c-80aceb69ca9e" 
containerID="f10f286f38ff6a404ea7c003b983b0269313f9f46411364ff17fcd1c2f8f9f77" exitCode=2 Apr 24 22:07:39.034164 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:39.034027 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" event={"ID":"88b92159-1eab-4488-b61c-80aceb69ca9e","Type":"ContainerDied","Data":"f10f286f38ff6a404ea7c003b983b0269313f9f46411364ff17fcd1c2f8f9f77"} Apr 24 22:07:41.712621 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:41.712593 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 22:07:41.785217 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:41.785119 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-98aac-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/88b92159-1eab-4488-b61c-80aceb69ca9e-success-200-isvc-98aac-kube-rbac-proxy-sar-config\") pod \"88b92159-1eab-4488-b61c-80aceb69ca9e\" (UID: \"88b92159-1eab-4488-b61c-80aceb69ca9e\") " Apr 24 22:07:41.785217 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:41.785169 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9qgz\" (UniqueName: \"kubernetes.io/projected/88b92159-1eab-4488-b61c-80aceb69ca9e-kube-api-access-l9qgz\") pod \"88b92159-1eab-4488-b61c-80aceb69ca9e\" (UID: \"88b92159-1eab-4488-b61c-80aceb69ca9e\") " Apr 24 22:07:41.785217 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:41.785201 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88b92159-1eab-4488-b61c-80aceb69ca9e-proxy-tls\") pod \"88b92159-1eab-4488-b61c-80aceb69ca9e\" (UID: \"88b92159-1eab-4488-b61c-80aceb69ca9e\") " Apr 24 22:07:41.785547 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:41.785473 2573 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b92159-1eab-4488-b61c-80aceb69ca9e-success-200-isvc-98aac-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-98aac-kube-rbac-proxy-sar-config") pod "88b92159-1eab-4488-b61c-80aceb69ca9e" (UID: "88b92159-1eab-4488-b61c-80aceb69ca9e"). InnerVolumeSpecName "success-200-isvc-98aac-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:07:41.787337 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:41.787303 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b92159-1eab-4488-b61c-80aceb69ca9e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "88b92159-1eab-4488-b61c-80aceb69ca9e" (UID: "88b92159-1eab-4488-b61c-80aceb69ca9e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:07:41.787477 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:41.787452 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b92159-1eab-4488-b61c-80aceb69ca9e-kube-api-access-l9qgz" (OuterVolumeSpecName: "kube-api-access-l9qgz") pod "88b92159-1eab-4488-b61c-80aceb69ca9e" (UID: "88b92159-1eab-4488-b61c-80aceb69ca9e"). InnerVolumeSpecName "kube-api-access-l9qgz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:07:41.886090 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:41.886048 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-98aac-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/88b92159-1eab-4488-b61c-80aceb69ca9e-success-200-isvc-98aac-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 22:07:41.886090 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:41.886084 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l9qgz\" (UniqueName: \"kubernetes.io/projected/88b92159-1eab-4488-b61c-80aceb69ca9e-kube-api-access-l9qgz\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 22:07:41.886090 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:41.886096 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88b92159-1eab-4488-b61c-80aceb69ca9e-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 22:07:41.986215 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:41.986176 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" podUID="d1535524-2257-4d12-a251-848068a29cc7" containerName="switch-graph-98aac" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:07:42.044576 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:42.044476 2573 generic.go:358] "Generic (PLEG): container finished" podID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerID="1177b8c1589a4a6692623d1f5571d08b9b03fdc09dc3fa97f64ea0b25d502ba6" exitCode=0 Apr 24 22:07:42.044576 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:42.044525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" 
event={"ID":"88b92159-1eab-4488-b61c-80aceb69ca9e","Type":"ContainerDied","Data":"1177b8c1589a4a6692623d1f5571d08b9b03fdc09dc3fa97f64ea0b25d502ba6"} Apr 24 22:07:42.044576 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:42.044547 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" Apr 24 22:07:42.044576 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:42.044577 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl" event={"ID":"88b92159-1eab-4488-b61c-80aceb69ca9e","Type":"ContainerDied","Data":"130d1626e42580f7ea9ec72de540b8814ebc3564fab6c9684aaa812911590ee3"} Apr 24 22:07:42.044897 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:42.044599 2573 scope.go:117] "RemoveContainer" containerID="f10f286f38ff6a404ea7c003b983b0269313f9f46411364ff17fcd1c2f8f9f77" Apr 24 22:07:42.053329 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:42.053311 2573 scope.go:117] "RemoveContainer" containerID="1177b8c1589a4a6692623d1f5571d08b9b03fdc09dc3fa97f64ea0b25d502ba6" Apr 24 22:07:42.061048 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:42.061027 2573 scope.go:117] "RemoveContainer" containerID="f10f286f38ff6a404ea7c003b983b0269313f9f46411364ff17fcd1c2f8f9f77" Apr 24 22:07:42.061347 ip-10-0-128-21 kubenswrapper[2573]: E0424 22:07:42.061325 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f10f286f38ff6a404ea7c003b983b0269313f9f46411364ff17fcd1c2f8f9f77\": container with ID starting with f10f286f38ff6a404ea7c003b983b0269313f9f46411364ff17fcd1c2f8f9f77 not found: ID does not exist" containerID="f10f286f38ff6a404ea7c003b983b0269313f9f46411364ff17fcd1c2f8f9f77" Apr 24 22:07:42.061416 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:42.061371 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f10f286f38ff6a404ea7c003b983b0269313f9f46411364ff17fcd1c2f8f9f77"} err="failed to get container status \"f10f286f38ff6a404ea7c003b983b0269313f9f46411364ff17fcd1c2f8f9f77\": rpc error: code = NotFound desc = could not find container \"f10f286f38ff6a404ea7c003b983b0269313f9f46411364ff17fcd1c2f8f9f77\": container with ID starting with f10f286f38ff6a404ea7c003b983b0269313f9f46411364ff17fcd1c2f8f9f77 not found: ID does not exist" Apr 24 22:07:42.061416 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:42.061389 2573 scope.go:117] "RemoveContainer" containerID="1177b8c1589a4a6692623d1f5571d08b9b03fdc09dc3fa97f64ea0b25d502ba6" Apr 24 22:07:42.061640 ip-10-0-128-21 kubenswrapper[2573]: E0424 22:07:42.061623 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1177b8c1589a4a6692623d1f5571d08b9b03fdc09dc3fa97f64ea0b25d502ba6\": container with ID starting with 1177b8c1589a4a6692623d1f5571d08b9b03fdc09dc3fa97f64ea0b25d502ba6 not found: ID does not exist" containerID="1177b8c1589a4a6692623d1f5571d08b9b03fdc09dc3fa97f64ea0b25d502ba6" Apr 24 22:07:42.061689 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:42.061644 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1177b8c1589a4a6692623d1f5571d08b9b03fdc09dc3fa97f64ea0b25d502ba6"} err="failed to get container status \"1177b8c1589a4a6692623d1f5571d08b9b03fdc09dc3fa97f64ea0b25d502ba6\": rpc error: code = NotFound desc = could not find container \"1177b8c1589a4a6692623d1f5571d08b9b03fdc09dc3fa97f64ea0b25d502ba6\": container with ID starting with 1177b8c1589a4a6692623d1f5571d08b9b03fdc09dc3fa97f64ea0b25d502ba6 not found: ID does not exist" Apr 24 22:07:42.067603 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:42.067576 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl"] Apr 24 22:07:42.071117 
ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:42.071089 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl"] Apr 24 22:07:42.210163 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:42.210129 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" path="/var/lib/kubelet/pods/88b92159-1eab-4488-b61c-80aceb69ca9e/volumes" Apr 24 22:07:46.985676 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:46.985633 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" podUID="d1535524-2257-4d12-a251-848068a29cc7" containerName="switch-graph-98aac" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:07:51.986216 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:51.986167 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" podUID="d1535524-2257-4d12-a251-848068a29cc7" containerName="switch-graph-98aac" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:07:51.986619 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:51.986276 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" Apr 24 22:07:54.621787 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:54.621738 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-98aac-579b965bb4-xcntc_d1535524-2257-4d12-a251-848068a29cc7/switch-graph-98aac/0.log" Apr 24 22:07:55.423045 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:55.422993 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-98aac-579b965bb4-xcntc_d1535524-2257-4d12-a251-848068a29cc7/switch-graph-98aac/0.log" Apr 24 22:07:56.255463 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:56.255432 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-98aac-579b965bb4-xcntc_d1535524-2257-4d12-a251-848068a29cc7/switch-graph-98aac/0.log" Apr 24 22:07:56.985527 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:56.985486 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" podUID="d1535524-2257-4d12-a251-848068a29cc7" containerName="switch-graph-98aac" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:07:57.087930 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:57.087898 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-98aac-579b965bb4-xcntc_d1535524-2257-4d12-a251-848068a29cc7/switch-graph-98aac/0.log" Apr 24 22:07:57.871892 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:57.871802 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-98aac-579b965bb4-xcntc_d1535524-2257-4d12-a251-848068a29cc7/switch-graph-98aac/0.log" Apr 24 22:07:58.712113 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:58.712082 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-98aac-579b965bb4-xcntc_d1535524-2257-4d12-a251-848068a29cc7/switch-graph-98aac/0.log" Apr 24 22:07:59.508726 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:07:59.508694 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-98aac-579b965bb4-xcntc_d1535524-2257-4d12-a251-848068a29cc7/switch-graph-98aac/0.log" Apr 24 22:08:00.285242 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:00.285211 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-98aac-579b965bb4-xcntc_d1535524-2257-4d12-a251-848068a29cc7/switch-graph-98aac/0.log" Apr 24 22:08:01.140391 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:01.140324 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_switch-graph-98aac-579b965bb4-xcntc_d1535524-2257-4d12-a251-848068a29cc7/switch-graph-98aac/0.log" Apr 24 22:08:01.976531 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:01.976493 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-98aac-579b965bb4-xcntc_d1535524-2257-4d12-a251-848068a29cc7/switch-graph-98aac/0.log" Apr 24 22:08:01.985544 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:01.985511 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" podUID="d1535524-2257-4d12-a251-848068a29cc7" containerName="switch-graph-98aac" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:08:02.793839 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:02.793810 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-98aac-579b965bb4-xcntc_d1535524-2257-4d12-a251-848068a29cc7/switch-graph-98aac/0.log" Apr 24 22:08:03.563255 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:03.563223 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-98aac-579b965bb4-xcntc_d1535524-2257-4d12-a251-848068a29cc7/switch-graph-98aac/0.log" Apr 24 22:08:06.986438 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:06.986401 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" podUID="d1535524-2257-4d12-a251-848068a29cc7" containerName="switch-graph-98aac" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:08:08.503282 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:08.503254 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" Apr 24 22:08:08.527508 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:08.527427 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-sbxgn_38a98561-29f3-47af-9151-b0d0095b287e/global-pull-secret-syncer/0.log" Apr 24 22:08:08.618627 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:08.618518 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1535524-2257-4d12-a251-848068a29cc7-openshift-service-ca-bundle\") pod \"d1535524-2257-4d12-a251-848068a29cc7\" (UID: \"d1535524-2257-4d12-a251-848068a29cc7\") " Apr 24 22:08:08.618779 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:08.618669 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1535524-2257-4d12-a251-848068a29cc7-proxy-tls\") pod \"d1535524-2257-4d12-a251-848068a29cc7\" (UID: \"d1535524-2257-4d12-a251-848068a29cc7\") " Apr 24 22:08:08.618976 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:08.618949 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1535524-2257-4d12-a251-848068a29cc7-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d1535524-2257-4d12-a251-848068a29cc7" (UID: "d1535524-2257-4d12-a251-848068a29cc7"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:08:08.620869 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:08.620839 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1535524-2257-4d12-a251-848068a29cc7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d1535524-2257-4d12-a251-848068a29cc7" (UID: "d1535524-2257-4d12-a251-848068a29cc7"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:08:08.624540 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:08.624515 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ln7wc_7660fe75-2b1f-42c3-8bcf-b3fcc97a90ea/konnectivity-agent/0.log" Apr 24 22:08:08.678519 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:08.678485 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-21.ec2.internal_46129ef0396a1fbd001318ae09f161a9/haproxy/0.log" Apr 24 22:08:08.720314 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:08.720277 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1535524-2257-4d12-a251-848068a29cc7-proxy-tls\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 22:08:08.720314 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:08.720310 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1535524-2257-4d12-a251-848068a29cc7-openshift-service-ca-bundle\") on node \"ip-10-0-128-21.ec2.internal\" DevicePath \"\"" Apr 24 22:08:09.120985 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:09.120954 2573 generic.go:358] "Generic (PLEG): container finished" podID="d1535524-2257-4d12-a251-848068a29cc7" containerID="0a2818a8c1374720a3239f83c1c823460f64c271ccbc9f0ab3cfda14b6cce5be" exitCode=0 Apr 24 22:08:09.121172 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:09.121023 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" event={"ID":"d1535524-2257-4d12-a251-848068a29cc7","Type":"ContainerDied","Data":"0a2818a8c1374720a3239f83c1c823460f64c271ccbc9f0ab3cfda14b6cce5be"} Apr 24 22:08:09.121172 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:09.121034 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" Apr 24 22:08:09.121172 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:09.121049 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc" event={"ID":"d1535524-2257-4d12-a251-848068a29cc7","Type":"ContainerDied","Data":"5075d8c5eabcfbeba59fc01b5eb6a2c34afc43729be0e91eed623a518cace08b"} Apr 24 22:08:09.121172 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:09.121076 2573 scope.go:117] "RemoveContainer" containerID="0a2818a8c1374720a3239f83c1c823460f64c271ccbc9f0ab3cfda14b6cce5be" Apr 24 22:08:09.129230 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:09.129212 2573 scope.go:117] "RemoveContainer" containerID="0a2818a8c1374720a3239f83c1c823460f64c271ccbc9f0ab3cfda14b6cce5be" Apr 24 22:08:09.129554 ip-10-0-128-21 kubenswrapper[2573]: E0424 22:08:09.129531 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2818a8c1374720a3239f83c1c823460f64c271ccbc9f0ab3cfda14b6cce5be\": container with ID starting with 0a2818a8c1374720a3239f83c1c823460f64c271ccbc9f0ab3cfda14b6cce5be not found: ID does not exist" containerID="0a2818a8c1374720a3239f83c1c823460f64c271ccbc9f0ab3cfda14b6cce5be" Apr 24 22:08:09.129634 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:09.129563 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2818a8c1374720a3239f83c1c823460f64c271ccbc9f0ab3cfda14b6cce5be"} err="failed to get container status \"0a2818a8c1374720a3239f83c1c823460f64c271ccbc9f0ab3cfda14b6cce5be\": rpc error: code = NotFound desc = could not find container \"0a2818a8c1374720a3239f83c1c823460f64c271ccbc9f0ab3cfda14b6cce5be\": container with ID starting with 0a2818a8c1374720a3239f83c1c823460f64c271ccbc9f0ab3cfda14b6cce5be not found: ID does not exist" Apr 24 22:08:09.141248 ip-10-0-128-21 kubenswrapper[2573]: I0424 
22:08:09.141220 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc"] Apr 24 22:08:09.145266 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:09.145241 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc"] Apr 24 22:08:10.210671 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:10.210635 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1535524-2257-4d12-a251-848068a29cc7" path="/var/lib/kubelet/pods/d1535524-2257-4d12-a251-848068a29cc7/volumes" Apr 24 22:08:12.158796 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:12.158765 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-47dvk_32f8c25b-fb1f-4a40-b2ee-4f7db45184f1/cluster-monitoring-operator/0.log" Apr 24 22:08:12.310089 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:12.310055 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lmlrf_ab6c5ffc-45c6-4018-bff6-cb0476ccbcda/node-exporter/0.log" Apr 24 22:08:12.335337 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:12.335304 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lmlrf_ab6c5ffc-45c6-4018-bff6-cb0476ccbcda/kube-rbac-proxy/0.log" Apr 24 22:08:12.357919 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:12.357893 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lmlrf_ab6c5ffc-45c6-4018-bff6-cb0476ccbcda/init-textfile/0.log" Apr 24 22:08:14.200394 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:14.200349 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-6jmdp_fe0af93f-e6da-459a-b345-6cf8c4bcff2f/networking-console-plugin/0.log" Apr 24 22:08:14.641797 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:14.641760 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/1.log" Apr 24 22:08:14.646717 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:14.646692 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t99mx_f9353274-ce1e-479b-a277-0a36a39b6fb2/console-operator/2.log" Apr 24 22:08:15.414545 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.414502 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-lh2r9_62a47369-6a4f-4ac0-ae3b-559fb4cadc0d/volume-data-source-validator/0.log" Apr 24 22:08:15.703694 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.703611 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz"] Apr 24 22:08:15.703977 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.703964 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerName="kube-rbac-proxy" Apr 24 22:08:15.704035 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.703978 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerName="kube-rbac-proxy" Apr 24 22:08:15.704035 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.703993 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kserve-container" Apr 24 22:08:15.704035 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.703999 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kserve-container" Apr 24 22:08:15.704035 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.704007 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="58718984-9436-4079-8b1d-dfe5d1aab879" containerName="splitter-graph-12404" Apr 24 22:08:15.704035 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.704013 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="58718984-9436-4079-8b1d-dfe5d1aab879" containerName="splitter-graph-12404" Apr 24 22:08:15.704035 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.704022 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerName="kserve-container" Apr 24 22:08:15.704035 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.704028 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerName="kserve-container" Apr 24 22:08:15.704035 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.704036 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kube-rbac-proxy" Apr 24 22:08:15.704312 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.704041 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kube-rbac-proxy" Apr 24 22:08:15.704312 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.704047 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1535524-2257-4d12-a251-848068a29cc7" containerName="switch-graph-98aac" Apr 24 22:08:15.704312 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.704052 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1535524-2257-4d12-a251-848068a29cc7" containerName="switch-graph-98aac" Apr 24 22:08:15.704312 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.704097 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerName="kserve-container" Apr 24 22:08:15.704312 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.704106 2573 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="58718984-9436-4079-8b1d-dfe5d1aab879" containerName="splitter-graph-12404" Apr 24 22:08:15.704312 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.704114 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kube-rbac-proxy" Apr 24 22:08:15.704312 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.704120 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5accb5ac-4323-44b4-8f66-aa7380c8d59e" containerName="kserve-container" Apr 24 22:08:15.704312 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.704127 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="88b92159-1eab-4488-b61c-80aceb69ca9e" containerName="kube-rbac-proxy" Apr 24 22:08:15.704312 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.704133 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1535524-2257-4d12-a251-848068a29cc7" containerName="switch-graph-98aac" Apr 24 22:08:15.708291 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.708265 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:15.711474 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.711449 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-bccfr\"/\"default-dockercfg-vcvfc\"" Apr 24 22:08:15.711609 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.711449 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-bccfr\"/\"openshift-service-ca.crt\"" Apr 24 22:08:15.711609 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.711449 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-bccfr\"/\"kube-root-ca.crt\"" Apr 24 22:08:15.716948 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.716925 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz"] Apr 24 22:08:15.881130 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.881086 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e610cefe-11a8-4780-8c61-102028354a87-sys\") pod \"perf-node-gather-daemonset-lfngz\" (UID: \"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:15.881130 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.881132 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e610cefe-11a8-4780-8c61-102028354a87-proc\") pod \"perf-node-gather-daemonset-lfngz\" (UID: \"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:15.881391 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.881204 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nvklj\" (UniqueName: \"kubernetes.io/projected/e610cefe-11a8-4780-8c61-102028354a87-kube-api-access-nvklj\") pod \"perf-node-gather-daemonset-lfngz\" (UID: \"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:15.881391 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.881264 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e610cefe-11a8-4780-8c61-102028354a87-lib-modules\") pod \"perf-node-gather-daemonset-lfngz\" (UID: \"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:15.881391 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.881286 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e610cefe-11a8-4780-8c61-102028354a87-podres\") pod \"perf-node-gather-daemonset-lfngz\" (UID: \"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:15.982197 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.982162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e610cefe-11a8-4780-8c61-102028354a87-sys\") pod \"perf-node-gather-daemonset-lfngz\" (UID: \"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:15.982197 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.982199 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e610cefe-11a8-4780-8c61-102028354a87-proc\") pod \"perf-node-gather-daemonset-lfngz\" (UID: \"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 
22:08:15.982482 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.982258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvklj\" (UniqueName: \"kubernetes.io/projected/e610cefe-11a8-4780-8c61-102028354a87-kube-api-access-nvklj\") pod \"perf-node-gather-daemonset-lfngz\" (UID: \"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:15.982482 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.982281 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e610cefe-11a8-4780-8c61-102028354a87-proc\") pod \"perf-node-gather-daemonset-lfngz\" (UID: \"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:15.982482 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.982308 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e610cefe-11a8-4780-8c61-102028354a87-lib-modules\") pod \"perf-node-gather-daemonset-lfngz\" (UID: \"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:15.982482 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.982328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e610cefe-11a8-4780-8c61-102028354a87-podres\") pod \"perf-node-gather-daemonset-lfngz\" (UID: \"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:15.982482 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.982334 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e610cefe-11a8-4780-8c61-102028354a87-sys\") pod \"perf-node-gather-daemonset-lfngz\" (UID: 
\"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:15.982714 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.982488 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e610cefe-11a8-4780-8c61-102028354a87-lib-modules\") pod \"perf-node-gather-daemonset-lfngz\" (UID: \"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:15.982714 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.982489 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e610cefe-11a8-4780-8c61-102028354a87-podres\") pod \"perf-node-gather-daemonset-lfngz\" (UID: \"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:15.990490 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:15.990461 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvklj\" (UniqueName: \"kubernetes.io/projected/e610cefe-11a8-4780-8c61-102028354a87-kube-api-access-nvklj\") pod \"perf-node-gather-daemonset-lfngz\" (UID: \"e610cefe-11a8-4780-8c61-102028354a87\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:16.019191 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:16.019153 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:16.108887 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:16.108854 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nvpkh_42e99775-4de5-4bed-b01a-a3218d41d996/dns/0.log" Apr 24 22:08:16.132883 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:16.132859 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nvpkh_42e99775-4de5-4bed-b01a-a3218d41d996/kube-rbac-proxy/0.log" Apr 24 22:08:16.143637 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:16.143608 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz"] Apr 24 22:08:16.147369 ip-10-0-128-21 kubenswrapper[2573]: W0424 22:08:16.147328 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode610cefe_11a8_4780_8c61_102028354a87.slice/crio-d67c51d17729f0c9e5b501e5972efba9266ce2e18073cf7ad21b32aecc48c149 WatchSource:0}: Error finding container d67c51d17729f0c9e5b501e5972efba9266ce2e18073cf7ad21b32aecc48c149: Status 404 returned error can't find the container with id d67c51d17729f0c9e5b501e5972efba9266ce2e18073cf7ad21b32aecc48c149 Apr 24 22:08:16.148841 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:16.148824 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:08:16.231678 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:16.231652 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cstvp_31fbbb71-5394-4f60-8de2-cc5dc970ab35/dns-node-resolver/0.log" Apr 24 22:08:16.670987 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:16.670954 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-689c867f4b-rpl44_3fed8766-c4ad-4312-ba21-25369a24b276/registry/0.log" Apr 24 22:08:16.713847 ip-10-0-128-21 
kubenswrapper[2573]: I0424 22:08:16.713809 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9crbq_88a73d58-a99e-49c1-9821-a06593a8b35e/node-ca/0.log" Apr 24 22:08:17.147566 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:17.147535 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" event={"ID":"e610cefe-11a8-4780-8c61-102028354a87","Type":"ContainerStarted","Data":"a3df40e69f56b10560fc32ce8982de4493ae9656e0d29317baf512ec537df0c3"} Apr 24 22:08:17.147566 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:17.147571 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" event={"ID":"e610cefe-11a8-4780-8c61-102028354a87","Type":"ContainerStarted","Data":"d67c51d17729f0c9e5b501e5972efba9266ce2e18073cf7ad21b32aecc48c149"} Apr 24 22:08:17.147792 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:17.147603 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:17.165587 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:17.165522 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" podStartSLOduration=2.165503928 podStartE2EDuration="2.165503928s" podCreationTimestamp="2026-04-24 22:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:08:17.164684297 +0000 UTC m=+3119.570525748" watchObservedRunningTime="2026-04-24 22:08:17.165503928 +0000 UTC m=+3119.571345380" Apr 24 22:08:17.424947 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:17.424866 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-74fb58c7f4-9dgzg_5f412e6f-9e0c-44f5-b798-012969c57865/router/0.log" Apr 24 
22:08:17.734645 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:17.734618 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-m4spp_78bcd5aa-9db8-46f4-ba2e-f6f3d929aa77/serve-healthcheck-canary/0.log" Apr 24 22:08:18.127055 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:18.126967 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-dbr6c_36a65b6d-1c50-425c-911a-eb5c1059cd95/insights-operator/0.log" Apr 24 22:08:18.127995 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:18.127976 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-dbr6c_36a65b6d-1c50-425c-911a-eb5c1059cd95/insights-operator/1.log" Apr 24 22:08:18.148699 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:18.148672 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7bz7w_7c6a0742-062f-4e70-97ca-7f2a1248b077/kube-rbac-proxy/0.log" Apr 24 22:08:18.168545 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:18.168518 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7bz7w_7c6a0742-062f-4e70-97ca-7f2a1248b077/exporter/0.log" Apr 24 22:08:18.188305 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:18.188279 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7bz7w_7c6a0742-062f-4e70-97ca-7f2a1248b077/extractor/0.log" Apr 24 22:08:20.197779 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:20.197742 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-74fc8f6f96-sb79s_639e7e62-db4e-4982-927c-bce6d3c3cee3/manager/0.log" Apr 24 22:08:20.239551 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:20.239525 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_model-serving-api-86f7b4b499-nplhq_d7ce6ad8-5c35-44df-b56f-728edbc122a6/server/0.log" Apr 24 22:08:20.478215 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:20.478181 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-6svcd_66ea0554-ebe2-401f-bfcd-ef36fd9ee74d/manager/0.log" Apr 24 22:08:23.159992 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:23.159958 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-lfngz" Apr 24 22:08:24.520274 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:24.520243 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-2fhhh_d1b2baba-a138-4778-ad36-d2c72cf4b2d6/kube-storage-version-migrator-operator/1.log" Apr 24 22:08:24.521194 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:24.521175 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-2fhhh_d1b2baba-a138-4778-ad36-d2c72cf4b2d6/kube-storage-version-migrator-operator/0.log" Apr 24 22:08:26.094477 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:26.094404 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cph25_239caad5-0402-47f0-8e15-7f5d02343638/kube-multus-additional-cni-plugins/0.log" Apr 24 22:08:26.135435 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:26.135409 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cph25_239caad5-0402-47f0-8e15-7f5d02343638/egress-router-binary-copy/0.log" Apr 24 22:08:26.171440 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:26.171414 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cph25_239caad5-0402-47f0-8e15-7f5d02343638/cni-plugins/0.log" Apr 24 22:08:26.213262 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:26.213233 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cph25_239caad5-0402-47f0-8e15-7f5d02343638/bond-cni-plugin/0.log" Apr 24 22:08:26.250492 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:26.250461 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cph25_239caad5-0402-47f0-8e15-7f5d02343638/routeoverride-cni/0.log" Apr 24 22:08:26.291737 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:26.291709 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cph25_239caad5-0402-47f0-8e15-7f5d02343638/whereabouts-cni-bincopy/0.log" Apr 24 22:08:26.329397 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:26.329366 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cph25_239caad5-0402-47f0-8e15-7f5d02343638/whereabouts-cni/0.log" Apr 24 22:08:26.416894 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:26.416807 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mp2nj_804bc0fd-469c-45c8-8ece-8dbbfdb0705e/kube-multus/0.log" Apr 24 22:08:26.525203 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:26.525173 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h5m79_a6ad0fc1-fbd1-4133-8616-3b950995f8e4/network-metrics-daemon/0.log" Apr 24 22:08:26.558447 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:26.558418 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h5m79_a6ad0fc1-fbd1-4133-8616-3b950995f8e4/kube-rbac-proxy/0.log" Apr 24 22:08:28.060660 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:28.060628 2573 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rz2jk_abb55075-ce71-45a8-8ef8-400976104389/ovn-controller/0.log" Apr 24 22:08:28.102039 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:28.102005 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rz2jk_abb55075-ce71-45a8-8ef8-400976104389/ovn-acl-logging/0.log" Apr 24 22:08:28.121154 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:28.121121 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rz2jk_abb55075-ce71-45a8-8ef8-400976104389/kube-rbac-proxy-node/0.log" Apr 24 22:08:28.145822 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:28.145792 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rz2jk_abb55075-ce71-45a8-8ef8-400976104389/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 22:08:28.166735 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:28.166701 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rz2jk_abb55075-ce71-45a8-8ef8-400976104389/northd/0.log" Apr 24 22:08:28.189328 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:28.189288 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rz2jk_abb55075-ce71-45a8-8ef8-400976104389/nbdb/0.log" Apr 24 22:08:28.212338 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:28.212306 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rz2jk_abb55075-ce71-45a8-8ef8-400976104389/sbdb/0.log" Apr 24 22:08:28.309369 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:28.309335 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rz2jk_abb55075-ce71-45a8-8ef8-400976104389/ovnkube-controller/0.log" Apr 24 22:08:29.313804 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:29.313768 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-mctfb_e7d157a6-5982-4a38-b8d0-15d88309963a/check-endpoints/0.log" Apr 24 22:08:29.382238 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:29.382203 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-tzpnt_89ab8923-5f3a-4535-9d3f-e72f739904d4/network-check-target-container/0.log" Apr 24 22:08:30.265050 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:30.265025 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-m7gwx_880ca20b-7732-4709-9f0a-9013465ca003/iptables-alerter/0.log" Apr 24 22:08:30.938060 ip-10-0-128-21 kubenswrapper[2573]: I0424 22:08:30.938030 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-5pbst_e7c79dc8-944a-4f71-8545-a3c37de6cdc2/tuned/0.log"