Apr 22 17:52:59.809970 ip-10-0-128-219 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:53:00.311204 ip-10-0-128-219 kubenswrapper[2564]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:00.311204 ip-10-0-128-219 kubenswrapper[2564]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:53:00.311204 ip-10-0-128-219 kubenswrapper[2564]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:00.311204 ip-10-0-128-219 kubenswrapper[2564]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:53:00.311204 ip-10-0-128-219 kubenswrapper[2564]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
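The deprecation notices above say these flags belong in the config file passed via the kubelet's --config flag. A minimal sketch of the equivalent KubeletConfiguration fields (field names from the upstream kubelet config API, kubelet.config.k8s.io/v1beta1; the values shown are illustrative assumptions, not this node's actual settings):

```yaml
# Illustrative KubeletConfiguration fragment; values are assumptions, not read from this node.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: "100Mi"
```

There is no config-file field for --pod-infra-container-image; per the warning above, the image garbage collector obtains the sandbox image from the CRI, so that setting moves to the container runtime's own config.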
Apr 22 17:53:00.313031 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.312949 2564 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:53:00.318054 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318040 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:00.318054 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318054 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318058 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318061 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318065 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318069 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318071 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318075 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318077 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318081 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318084 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318087 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318089 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318092 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318095 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318097 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318100 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318103 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318106 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318108 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318111 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:00.318113 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318113 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318116 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318120 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318124 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318127 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318131 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318133 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318136 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318139 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318141 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318144 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318147 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318149 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318152 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318155 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318157 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318160 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318162 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318164 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318167 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:00.318585 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318170 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318172 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318175 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318177 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318180 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318182 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318185 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318188 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318191 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318193 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318196 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318198 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318201 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318204 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318207 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318210 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318212 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318215 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318217 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318220 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:00.319081 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318223 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318228 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318232 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318234 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318237 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318240 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318243 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318245 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318248 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318250 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318253 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318257 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318259 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318262 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318264 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318267 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318270 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318272 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318275 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318278 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:00.319539 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318281 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318283 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318286 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318289 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318291 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318680 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318685 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318688 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318691 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318693 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318696 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318700 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318704 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318707 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318710 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318714 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318716 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318719 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318722 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318725 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:00.320032 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318728 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318730 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318733 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318735 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318738 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318740 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318743 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318745 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318748 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318751 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318754 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318756 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318759 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318761 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318764 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318766 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318769 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318772 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318775 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318777 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:00.320488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318780 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318783 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318786 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318788 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318791 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318793 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318796 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318798 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318802 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318804 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318807 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318810 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318813 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318815 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318817 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318820 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318822 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318825 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318827 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318830 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:00.321092 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318851 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318855 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318859 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318862 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318865 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318868 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318870 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318873 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318875 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318878 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318881 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318884 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318886 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318889 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318892 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318894 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318897 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318899 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318901 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:00.321599 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318906 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318910 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318912 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318915 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318918 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318920 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318923 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318925 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318928 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318931 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318933 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.318935 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319004 2564 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319012 2564 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319017 2564 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319023 2564 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319027 2564 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319031 2564 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319035 2564 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319040 2564 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319044 2564 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:53:00.322061 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319047 2564 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319050 2564 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319054 2564 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319057 2564 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319060 2564 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319063 2564 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319066 2564 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319069 2564 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319071 2564 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319074 2564 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319078 2564 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319081 2564 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319084 2564 flags.go:64] FLAG: --config-dir=""
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319087 2564 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319091 2564 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319094 2564 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319098 2564 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319101 2564 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319104 2564 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319108 2564 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319110 2564 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319113 2564 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319116 2564 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319119 2564 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319124 2564 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:53:00.322569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319126 2564 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319130 2564 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319133 2564 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319136 2564 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319139 2564 flags.go:64] FLAG:
--enforce-node-allocatable="[pods]" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319143 2564 flags.go:64] FLAG: --event-burst="100" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319146 2564 flags.go:64] FLAG: --event-qps="50" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319149 2564 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319152 2564 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319155 2564 flags.go:64] FLAG: --eviction-hard="" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319159 2564 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319162 2564 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319175 2564 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319179 2564 flags.go:64] FLAG: --eviction-soft="" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319183 2564 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319186 2564 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319190 2564 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319193 2564 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319196 2564 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 17:53:00.323227 ip-10-0-128-219 
kubenswrapper[2564]: I0422 17:53:00.319200 2564 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319203 2564 flags.go:64] FLAG: --feature-gates="" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319207 2564 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319210 2564 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319213 2564 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319217 2564 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319220 2564 flags.go:64] FLAG: --healthz-port="10248" Apr 22 17:53:00.323227 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319223 2564 flags.go:64] FLAG: --help="false" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319226 2564 flags.go:64] FLAG: --hostname-override="ip-10-0-128-219.ec2.internal" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319229 2564 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319232 2564 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319235 2564 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319239 2564 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319242 2564 flags.go:64] FLAG: --image-gc-high-threshold="85" 
Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319245 2564 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319248 2564 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319251 2564 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319254 2564 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319257 2564 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319261 2564 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319264 2564 flags.go:64] FLAG: --kube-reserved="" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319267 2564 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319269 2564 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319272 2564 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319275 2564 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319278 2564 flags.go:64] FLAG: --lock-file="" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319281 2564 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319284 2564 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319288 2564 
flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319293 2564 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 17:53:00.323861 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319296 2564 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319299 2564 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319302 2564 flags.go:64] FLAG: --logging-format="text" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319305 2564 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319308 2564 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319311 2564 flags.go:64] FLAG: --manifest-url="" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319314 2564 flags.go:64] FLAG: --manifest-url-header="" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319318 2564 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319322 2564 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319326 2564 flags.go:64] FLAG: --max-pods="110" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319329 2564 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319332 2564 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319335 2564 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 
17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319338 2564 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319341 2564 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319344 2564 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319347 2564 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319354 2564 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319358 2564 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319361 2564 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319364 2564 flags.go:64] FLAG: --pod-cidr="" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319367 2564 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319373 2564 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319376 2564 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 17:53:00.324479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319379 2564 flags.go:64] FLAG: --pods-per-core="0" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319382 2564 flags.go:64] FLAG: --port="10250" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319385 2564 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319388 2564 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c49437c5856aa94e" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319391 2564 flags.go:64] FLAG: --qos-reserved="" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319394 2564 flags.go:64] FLAG: --read-only-port="10255" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319397 2564 flags.go:64] FLAG: --register-node="true" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319400 2564 flags.go:64] FLAG: --register-schedulable="true" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319403 2564 flags.go:64] FLAG: --register-with-taints="" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319407 2564 flags.go:64] FLAG: --registry-burst="10" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319409 2564 flags.go:64] FLAG: --registry-qps="5" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319412 2564 flags.go:64] FLAG: --reserved-cpus="" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319415 2564 flags.go:64] FLAG: --reserved-memory="" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319418 2564 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319421 2564 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319424 2564 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319427 2564 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319431 2564 
flags.go:64] FLAG: --runonce="false" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319434 2564 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319437 2564 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319440 2564 flags.go:64] FLAG: --seccomp-default="false" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319443 2564 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319445 2564 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319448 2564 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319451 2564 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319454 2564 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 17:53:00.325067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319457 2564 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319464 2564 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319467 2564 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319470 2564 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319473 2564 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319476 2564 flags.go:64] FLAG: --system-cgroups="" Apr 22 
17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319479 2564 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319484 2564 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319487 2564 flags.go:64] FLAG: --tls-cert-file="" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319490 2564 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319494 2564 flags.go:64] FLAG: --tls-min-version="" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319496 2564 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319499 2564 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319502 2564 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319505 2564 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319508 2564 flags.go:64] FLAG: --v="2" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319512 2564 flags.go:64] FLAG: --version="false" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319516 2564 flags.go:64] FLAG: --vmodule="" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319521 2564 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.319524 2564 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319614 2564 
feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319618 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319621 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319624 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:53:00.325682 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319628 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319631 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319634 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319636 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319639 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319642 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319645 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319648 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319651 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 
17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319656 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319659 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319661 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319679 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319685 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319689 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319693 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319698 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319701 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319704 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319707 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:53:00.326280 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319709 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319712 2564 feature_gate.go:328] unrecognized feature gate: 
NewOLM Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319714 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319717 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319720 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319722 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319725 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319727 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319730 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319733 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319735 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319738 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319741 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319744 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319747 2564 
feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319750 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319753 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319755 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319758 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319760 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:53:00.326791 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319763 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319767 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319770 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319773 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319775 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319778 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319781 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:53:00.327275 ip-10-0-128-219 
kubenswrapper[2564]: W0422 17:53:00.319783 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319786 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319789 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319791 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319794 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319796 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319799 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319801 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319804 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319806 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319810 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319814 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:53:00.327275 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319817 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319819 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319822 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319827 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319829 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319832 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319835 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319838 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319840 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319843 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319845 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319849 2564 feature_gate.go:349] 
Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319852 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319856 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319860 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319863 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319865 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319868 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319870 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:53:00.327783 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319873 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319876 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319878 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.319881 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.320901 2564 feature_gate.go:384] feature gates: 
{map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.327192 2564 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.327207 2564 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327251 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327257 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327262 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327265 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327268 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327271 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327274 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327278 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:53:00.328230 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327281 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327283 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327286 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327289 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327292 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327295 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327298 2564 
feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327301 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327304 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327306 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327309 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327312 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327314 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327317 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327320 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327322 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327325 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327327 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327330 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:53:00.328593 
ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327332 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:53:00.328593 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327335 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327337 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327340 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327343 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327346 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327349 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327352 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327354 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327357 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327360 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327362 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327364 2564 feature_gate.go:328] unrecognized feature 
gate: InsightsConfigAPI Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327367 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327369 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327372 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327374 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327377 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327381 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327385 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327388 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:53:00.329103 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327391 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327394 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327396 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327399 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327402 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327405 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327407 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327410 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327412 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327414 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327417 2564 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327420 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327422 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327425 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327427 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327431 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327434 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327436 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327439 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327442 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:53:00.329592 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327444 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327447 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327449 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: 
W0422 17:53:00.327452 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327454 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327456 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327459 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327461 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327464 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327466 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327469 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327472 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327474 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327477 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327479 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327482 2564 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327484 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:53:00.330086 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327487 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.327492 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327948 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327954 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327957 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327960 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327962 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327965 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327968 2564 
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327971 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327974 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327977 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327979 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327982 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327985 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:53:00.330488 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327987 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327990 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327993 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327995 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.327998 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328000 2564 feature_gate.go:328] unrecognized feature gate: 
MetricsCollectionProfiles Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328003 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328005 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328008 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328010 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328013 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328015 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328021 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328023 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328026 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328029 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328031 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328033 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 
17:53:00.328036 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:53:00.330861 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328038 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328041 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328043 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328046 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328048 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328051 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328053 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328056 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328059 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328061 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328064 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328067 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota 
Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328069 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328072 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328074 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328077 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328079 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328082 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328084 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328087 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:53:00.331308 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328090 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328092 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328095 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328097 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328100 2564 feature_gate.go:328] 
unrecognized feature gate: Example2 Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328102 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328105 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328108 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328110 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328112 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328115 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328117 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328120 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328122 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328126 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328129 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328132 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328135 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328137 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:53:00.331794 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328140 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328144 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328147 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328150 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328153 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328155 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328158 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328160 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: W0422 
17:53:00.328163 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328165 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328168 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328170 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328173 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328175 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:00.328178 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:53:00.332247 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.328183 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:53:00.332621 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.328968 2564 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 17:53:00.332877 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.332863 2564 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate 
dir"
Apr 22 17:53:00.333877 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.333866 2564 server.go:1019] "Starting client certificate rotation"
Apr 22 17:53:00.334009 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.333992 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:53:00.334847 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.334837 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:53:00.362645 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.362628 2564 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:53:00.367813 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.367788 2564 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:53:00.380507 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.380489 2564 log.go:25] "Validated CRI v1 runtime API"
Apr 22 17:53:00.387003 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.386991 2564 log.go:25] "Validated CRI v1 image API"
Apr 22 17:53:00.389271 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.389256 2564 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 17:53:00.393436 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.392655 2564 fs.go:135] Filesystem UUIDs: map[48be2b37-210f-4c41-a084-05b7195deee3:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 d9bb5849-9481-4a75-9b68-08c6a2666e53:/dev/nvme0n1p3]
Apr 22 17:53:00.393528 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.393434 2564 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 17:53:00.395232 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.395213 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:53:00.399079 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.398975 2564 manager.go:217] Machine: {Timestamp:2026-04-22 17:53:00.397045728 +0000 UTC m=+0.455948861 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100531 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec227bc35183c7772b7faf82219350f7 SystemUUID:ec227bc3-5183-c777-2b7f-af82219350f7 BootID:44c9862d-db53-44d3-aec7-f0108baeb745 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6a:06:53:25:4d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6a:06:53:25:4d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e6:ee:d0:a8:65:65 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 17:53:00.399079 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.399071 2564 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 17:53:00.399185 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.399143 2564 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 17:53:00.401198 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.401178 2564 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 17:53:00.401335 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.401199 2564 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-219.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 17:53:00.401376 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.401341 2564 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 17:53:00.401376 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.401349 2564 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 17:53:00.401376 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.401366 2564 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:53:00.402510 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.402499 2564 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:53:00.403991 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.403981 2564 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:53:00.404099 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.404090 2564 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 17:53:00.406916 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.406907 2564 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 17:53:00.406957 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.406924 2564 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 17:53:00.406957 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.406937 2564 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 17:53:00.406957 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.406946 2564 kubelet.go:397] "Adding apiserver pod source"
Apr 22 17:53:00.406957 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.406957 2564 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 17:53:00.408098 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.408087 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:53:00.408147 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.408108 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:53:00.411535 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.411521 2564 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 17:53:00.414914 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.414822 2564 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 17:53:00.417139 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.417121 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 17:53:00.417139 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.417140 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 17:53:00.417245 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.417147 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 17:53:00.417245 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.417153 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 17:53:00.417245 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.417162 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 17:53:00.417245 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.417171 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 17:53:00.417245 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.417180 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 17:53:00.417245 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.417188 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 17:53:00.417245 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.417197 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 17:53:00.417245 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.417206 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 17:53:00.417245 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.417219 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 17:53:00.417245 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.417230 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 17:53:00.417959 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.417942 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fctg9"
Apr 22 17:53:00.419281 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.419271 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 17:53:00.419281 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.419281 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 17:53:00.420579 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.420555 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-219.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 17:53:00.420743 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.420723 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 17:53:00.422872 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.422860 2564 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 17:53:00.422913 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.422896 2564 server.go:1295] "Started kubelet"
Apr 22 17:53:00.423011 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.422986 2564 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 17:53:00.423068 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.422999 2564 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 17:53:00.423117 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.423071 2564 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 17:53:00.423536 ip-10-0-128-219 systemd[1]: Started Kubernetes Kubelet.
Apr 22 17:53:00.424560 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.424543 2564 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 17:53:00.425094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.425081 2564 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 17:53:00.426272 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.426253 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fctg9"
Apr 22 17:53:00.427560 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.427545 2564 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-219.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 17:53:00.428793 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.427733 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-219.ec2.internal.18a8bf4aab775daf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-219.ec2.internal,UID:ip-10-0-128-219.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-219.ec2.internal,},FirstTimestamp:2026-04-22 17:53:00.422872495 +0000 UTC m=+0.481775627,LastTimestamp:2026-04-22 17:53:00.422872495 +0000 UTC m=+0.481775627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-219.ec2.internal,}"
Apr 22 17:53:00.432891 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.432875 2564 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 17:53:00.434018 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.433999 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 17:53:00.434619 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.434599 2564 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 17:53:00.435190 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.435175 2564 factory.go:55] Registering systemd factory
Apr 22 17:53:00.435190 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.435189 2564 factory.go:223] Registration of the systemd container factory successfully
Apr 22 17:53:00.435360 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.435350 2564 factory.go:153] Registering CRI-O factory
Apr 22 17:53:00.435360 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.435359 2564 factory.go:223] Registration of the crio container factory successfully
Apr 22 17:53:00.435450 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.435400 2564 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 17:53:00.435450 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.435414 2564 factory.go:103] Registering Raw factory
Apr 22 17:53:00.435450 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.435424 2564 manager.go:1196] Started watching for new ooms in manager
Apr 22 17:53:00.435588 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.435495 2564 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 17:53:00.435588 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.435500 2564 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 17:53:00.435588 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.435523 2564 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 17:53:00.435588 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.435580 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found"
Apr 22 17:53:00.435772 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.435618 2564 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 17:53:00.435772 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.435627 2564 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 17:53:00.435881 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.435867 2564 manager.go:319] Starting recovery of all containers
Apr 22 17:53:00.445625 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.445493 2564 manager.go:324] Recovery completed
Apr 22 17:53:00.446630 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.446612 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:00.448235 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.448218 2564 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 22 17:53:00.449893 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.449873 2564 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-219.ec2.internal\" not found" node="ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.451130 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.451118 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:53:00.453602 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.453580 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:53:00.453659 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.453606 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:53:00.453659 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.453616 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:53:00.454104 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.454091 2564 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 17:53:00.454104 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.454104 2564 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 17:53:00.454229 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.454123 2564 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:53:00.457975 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.457961 2564 policy_none.go:49] "None policy: Start"
Apr 22 17:53:00.458055 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.457979 2564 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 17:53:00.458055 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.457992 2564 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 17:53:00.496229 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.496213 2564 manager.go:341] "Starting Device Plugin manager"
Apr 22 17:53:00.520959 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.496253 2564 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 17:53:00.520959 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.496267 2564 server.go:85] "Starting device plugin registration server"
Apr 22 17:53:00.520959 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.496470 2564 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 17:53:00.520959 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.496482 2564 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 17:53:00.520959 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.496568 2564 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 17:53:00.520959 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.496637 2564 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 17:53:00.520959 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.496646 2564 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 17:53:00.520959 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.497203 2564 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 17:53:00.520959 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.497235 2564 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-219.ec2.internal\" not found"
Apr 22 17:53:00.547207 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.547175 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 17:53:00.548483 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.548464 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 17:53:00.548562 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.548488 2564 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 17:53:00.548562 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.548504 2564 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 17:53:00.548562 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.548510 2564 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 17:53:00.548562 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.548537 2564 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 17:53:00.551419 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.551404 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:00.597631 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.597581 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:53:00.599093 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.599079 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:53:00.599149 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.599107 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:53:00.599149 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.599116 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:53:00.599149 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.599140 2564 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.608528 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.608514 2564 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.608577 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.608534 2564 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-219.ec2.internal\": node \"ip-10-0-128-219.ec2.internal\" not found"
Apr 22 17:53:00.628037 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.628019 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found"
Apr 22 17:53:00.648961 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.648936 2564 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-219.ec2.internal"]
Apr 22 17:53:00.649009 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.648997 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:53:00.649740 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.649726 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:53:00.649818 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.649755 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:53:00.649818 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.649769 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:53:00.652080 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.652067 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:53:00.652191 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.652178 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.652236 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.652210 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:53:00.652729 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.652717 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:53:00.652784 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.652728 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:53:00.652784 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.652741 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:53:00.652784 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.652749 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:53:00.652784 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.652754 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:53:00.652784 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.652760 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:53:00.654932 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.654918 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.654999 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.654956 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:53:00.655735 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.655719 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:53:00.655807 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.655742 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:53:00.655807 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.655752 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:53:00.681190 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.681172 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-219.ec2.internal\" not found" node="ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.685466 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.685451 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-219.ec2.internal\" not found" node="ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.728471 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.728450 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found"
Apr 22 17:53:00.737811 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.737790 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e8d95ccda1ded644c4c87a2ac89e475b-config\") pod \"kube-apiserver-proxy-ip-10-0-128-219.ec2.internal\" (UID: \"e8d95ccda1ded644c4c87a2ac89e475b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.737887 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.737821 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5a8c72f19fe67933975dcb887d58f7db-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal\" (UID: \"5a8c72f19fe67933975dcb887d58f7db\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.737887 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.737847 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a8c72f19fe67933975dcb887d58f7db-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal\" (UID: \"5a8c72f19fe67933975dcb887d58f7db\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.828884 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.828852 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found"
Apr 22 17:53:00.838264 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.838244 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5a8c72f19fe67933975dcb887d58f7db-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal\" (UID: \"5a8c72f19fe67933975dcb887d58f7db\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.838350 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.838275 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5a8c72f19fe67933975dcb887d58f7db-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal\" (UID: \"5a8c72f19fe67933975dcb887d58f7db\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.838350 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.838300 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a8c72f19fe67933975dcb887d58f7db-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal\" (UID: \"5a8c72f19fe67933975dcb887d58f7db\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.838350 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.838279 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a8c72f19fe67933975dcb887d58f7db-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal\" (UID: \"5a8c72f19fe67933975dcb887d58f7db\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.838350 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.838337 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e8d95ccda1ded644c4c87a2ac89e475b-config\") pod \"kube-apiserver-proxy-ip-10-0-128-219.ec2.internal\" (UID: \"e8d95ccda1ded644c4c87a2ac89e475b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.838502 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.838399 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e8d95ccda1ded644c4c87a2ac89e475b-config\") pod \"kube-apiserver-proxy-ip-10-0-128-219.ec2.internal\" (UID: \"e8d95ccda1ded644c4c87a2ac89e475b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.929651 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:00.929620 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found"
Apr 22 17:53:00.983066 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.983044 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal"
Apr 22 17:53:00.987483 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:00.987467 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-219.ec2.internal"
Apr 22 17:53:01.030170 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:01.030151 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found"
Apr 22 17:53:01.130678 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:01.130648 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found"
Apr 22 17:53:01.231139 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:01.231080 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found"
Apr 22 17:53:01.273621 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:01.273601 2564 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:01.331790 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:01.331769 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found"
Apr 22 17:53:01.333927 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:01.333912 2564 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 17:53:01.334056 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:01.334036 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:53:01.334099 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:01.334057 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:53:01.334099 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:01.334074 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:53:01.429458 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:01.429424 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:48:00 +0000 UTC" deadline="2027-11-30 17:07:55.265849148 +0000 UTC"
Apr 22 17:53:01.429458 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:01.429451 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14087h14m53.836401023s"
Apr 22 17:53:01.432451 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:01.432430 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found"
Apr 22 17:53:01.434578 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:01.434562 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 17:53:01.446228 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:01.446204 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:53:01.472606 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:01.472583 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fjhtt"
Apr 22 17:53:01.480156 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:01.480136 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fjhtt"
Apr 22 17:53:01.532727 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:01.532650 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found"
Apr 22 17:53:01.569080 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:01.569052 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a8c72f19fe67933975dcb887d58f7db.slice/crio-4e972b65e60128cff7cdf47b9d7e0f0d0f9a0e2084fe6b05f1c3df5e46b842b3 WatchSource:0}: Error finding container 4e972b65e60128cff7cdf47b9d7e0f0d0f9a0e2084fe6b05f1c3df5e46b842b3: Status 404 returned error can't find the container with id 4e972b65e60128cff7cdf47b9d7e0f0d0f9a0e2084fe6b05f1c3df5e46b842b3
Apr 22 17:53:01.574155 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:01.574140 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:53:01.580770 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:01.580751 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8d95ccda1ded644c4c87a2ac89e475b.slice/crio-3b769b417f700a46f640882ca69faa36902cbc84aea0052ce54b76e673b42a47
WatchSource:0}: Error finding container 3b769b417f700a46f640882ca69faa36902cbc84aea0052ce54b76e673b42a47: Status 404 returned error can't find the container with id 3b769b417f700a46f640882ca69faa36902cbc84aea0052ce54b76e673b42a47 Apr 22 17:53:01.633338 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:01.633312 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found" Apr 22 17:53:01.733852 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:01.733829 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found" Apr 22 17:53:01.834352 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:01.834300 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found" Apr 22 17:53:01.935082 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:01.935053 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-219.ec2.internal\" not found" Apr 22 17:53:01.994576 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:01.994547 2564 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:02.035756 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.035726 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal" Apr 22 17:53:02.049781 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.049710 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 17:53:02.050779 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.050759 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-219.ec2.internal" Apr 22 
17:53:02.059729 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.059608 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 17:53:02.408314 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.408288 2564 apiserver.go:52] "Watching apiserver" Apr 22 17:53:02.420304 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.420279 2564 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 17:53:02.422198 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.422174 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-7wtdq","openshift-multus/multus-additional-cni-plugins-kz84r","openshift-multus/network-metrics-daemon-t6kpj","kube-system/kube-apiserver-proxy-ip-10-0-128-219.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm","openshift-cluster-node-tuning-operator/tuned-f69kk","openshift-dns/node-resolver-h4787","openshift-image-registry/node-ca-zs8rm","openshift-network-diagnostics/network-check-target-kbkmw","openshift-network-operator/iptables-alerter-8znkx","openshift-ovn-kubernetes/ovnkube-node-9vgh6","kube-system/konnectivity-agent-2gj6v","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal"] Apr 22 17:53:02.424777 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.424758 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-8znkx" Apr 22 17:53:02.428072 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.428051 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 17:53:02.428159 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.428134 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hqmcr\"" Apr 22 17:53:02.428343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.428323 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:53:02.428422 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.428343 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 17:53:02.429122 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.429088 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.431198 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.431179 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:02.431297 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:02.431262 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:02.431512 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.431492 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 17:53:02.431512 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.431526 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 17:53:02.431718 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.431588 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 17:53:02.431718 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.431528 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 17:53:02.431718 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.431632 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-556xx\"" Apr 22 17:53:02.431916 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.431890 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 17:53:02.433493 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.433468 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.433582 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.433551 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.435772 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.435754 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-h4787" Apr 22 17:53:02.436890 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.436871 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 17:53:02.436996 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.436906 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 17:53:02.436996 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.436906 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:53:02.436996 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.436969 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 17:53:02.437193 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.437068 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fb6nv\"" Apr 22 17:53:02.437396 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.437346 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 17:53:02.437624 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.437609 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-94bz7\"" Apr 22 17:53:02.437848 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.437808 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 17:53:02.437941 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.437889 2564 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 17:53:02.439113 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.439096 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zs8rm" Apr 22 17:53:02.441403 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.441382 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:02.441509 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:02.441441 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:02.441828 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.441799 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 17:53:02.442037 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.442012 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 17:53:02.442037 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.442035 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 17:53:02.442247 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.442230 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ghrmk\"" Apr 22 17:53:02.442708 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.442649 2564 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sg8sc\"" Apr 22 17:53:02.443734 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.443714 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.446172 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.446150 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.447621 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.447599 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bjlq\" (UniqueName: \"kubernetes.io/projected/a98e4a61-2b2f-4865-bb16-7be0e996db98-kube-api-access-6bjlq\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.447728 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.447637 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ca34e7e4-d295-4bc4-adff-31d08074df10-iptables-alerter-script\") pod \"iptables-alerter-8znkx\" (UID: \"ca34e7e4-d295-4bc4-adff-31d08074df10\") " pod="openshift-network-operator/iptables-alerter-8znkx" Apr 22 17:53:02.447728 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.447663 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-cnibin\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.447728 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.447709 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vncd5\" (UniqueName: \"kubernetes.io/projected/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-kube-api-access-vncd5\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.447855 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.447753 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-modprobe-d\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.447855 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.447793 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8432b695-5ba0-4b5f-bf6e-aea43e93c1a0-hosts-file\") pod \"node-resolver-h4787\" (UID: \"8432b695-5ba0-4b5f-bf6e-aea43e93c1a0\") " pod="openshift-dns/node-resolver-h4787" Apr 22 17:53:02.447855 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.447831 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-system-cni-dir\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.447983 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.447879 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-os-release\") pod \"multus-additional-cni-plugins-kz84r\" (UID: 
\"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.447983 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.447903 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.447983 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.447930 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-kubernetes\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.447983 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.447950 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-sysctl-conf\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.448166 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.447983 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-sys\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.448166 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448034 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fgmd7\" (UniqueName: \"kubernetes.io/projected/16541937-3840-42c5-8bc6-336bcf92d918-kube-api-access-fgmd7\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.448166 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448067 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8432b695-5ba0-4b5f-bf6e-aea43e93c1a0-tmp-dir\") pod \"node-resolver-h4787\" (UID: \"8432b695-5ba0-4b5f-bf6e-aea43e93c1a0\") " pod="openshift-dns/node-resolver-h4787" Apr 22 17:53:02.448166 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448092 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.448166 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448122 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.448166 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448148 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-systemd\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 
17:53:02.448409 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448172 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-lib-modules\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.448409 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448229 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-sys-fs\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.448409 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448265 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-sysctl-d\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.448409 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448302 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/16541937-3840-42c5-8bc6-336bcf92d918-etc-tuned\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.448409 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448329 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.448409 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448354 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs\") pod \"network-metrics-daemon-t6kpj\" (UID: \"1273b1fd-25f6-4315-a692-c599fb3e48b7\") " pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:02.448409 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448378 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-registration-dir\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.448409 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448402 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-sysconfig\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.448723 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448423 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-run\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.448723 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448445 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-host\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.448723 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448498 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pclkb\" (UniqueName: \"kubernetes.io/projected/8432b695-5ba0-4b5f-bf6e-aea43e93c1a0-kube-api-access-pclkb\") pod \"node-resolver-h4787\" (UID: \"8432b695-5ba0-4b5f-bf6e-aea43e93c1a0\") " pod="openshift-dns/node-resolver-h4787" Apr 22 17:53:02.448723 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448546 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-cni-binary-copy\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.448723 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448589 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-socket-dir\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.448723 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448616 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-device-dir\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.448723 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448639 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-var-lib-kubelet\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.448723 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448686 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/16541937-3840-42c5-8bc6-336bcf92d918-tmp\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.449038 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448735 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd6h7\" (UniqueName: \"kubernetes.io/projected/ca34e7e4-d295-4bc4-adff-31d08074df10-kube-api-access-cd6h7\") pod \"iptables-alerter-8znkx\" (UID: \"ca34e7e4-d295-4bc4-adff-31d08074df10\") " pod="openshift-network-operator/iptables-alerter-8znkx" Apr 22 17:53:02.449038 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448764 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ca34e7e4-d295-4bc4-adff-31d08074df10-host-slash\") pod \"iptables-alerter-8znkx\" (UID: \"ca34e7e4-d295-4bc4-adff-31d08074df10\") " pod="openshift-network-operator/iptables-alerter-8znkx" Apr 22 17:53:02.449038 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448784 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9hmt\" (UniqueName: 
\"kubernetes.io/projected/1273b1fd-25f6-4315-a692-c599fb3e48b7-kube-api-access-n9hmt\") pod \"network-metrics-daemon-t6kpj\" (UID: \"1273b1fd-25f6-4315-a692-c599fb3e48b7\") " pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:02.449038 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.448798 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-etc-selinux\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.449195 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.449099 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2gj6v" Apr 22 17:53:02.449335 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.449299 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 17:53:02.449507 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.449485 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-589vt\"" Apr 22 17:53:02.450402 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.450376 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 17:53:02.450611 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.450593 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 17:53:02.450782 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.450593 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-99lft\"" Apr 22 17:53:02.450952 ip-10-0-128-219 
kubenswrapper[2564]: I0422 17:53:02.450881 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 17:53:02.451108 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.451046 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 17:53:02.451692 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.451532 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 17:53:02.452459 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.452138 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 17:53:02.454325 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.452936 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 17:53:02.454325 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.453276 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fhsjw\"" Apr 22 17:53:02.454325 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.453405 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 17:53:02.481414 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.481394 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:48:01 +0000 UTC" deadline="2027-12-20 01:24:35.508164023 +0000 UTC" Apr 22 17:53:02.481414 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.481413 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14551h31m33.026752926s" Apr 22 17:53:02.537294 ip-10-0-128-219 
kubenswrapper[2564]: I0422 17:53:02.537270 2564 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 17:53:02.549964 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.549936 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-sysctl-d\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.550083 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.549975 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-var-lib-cni-bin\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.550083 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.549992 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-var-lib-cni-multus\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.550083 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550008 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-run-ovn-kubernetes\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.550083 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550027 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-ovn-node-metrics-cert\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.550083 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550052 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-sysconfig\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.550083 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550072 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-run\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.550332 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550087 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-device-dir\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.550332 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550112 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5b6cdf0f-46d3-4ba2-8a30-8314baac3007-konnectivity-ca\") pod \"konnectivity-agent-2gj6v\" (UID: \"5b6cdf0f-46d3-4ba2-8a30-8314baac3007\") " pod="kube-system/konnectivity-agent-2gj6v" Apr 22 17:53:02.550332 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550125 2564 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-sysconfig\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.550332 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550125 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-sysctl-d\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.550332 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550167 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-run\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.550332 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550156 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-kubelet\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.550332 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550178 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-device-dir\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.550332 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550200 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-env-overrides\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.550332 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550224 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-ovnkube-script-lib\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.550332 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550252 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ca34e7e4-d295-4bc4-adff-31d08074df10-host-slash\") pod \"iptables-alerter-8znkx\" (UID: \"ca34e7e4-d295-4bc4-adff-31d08074df10\") " pod="openshift-network-operator/iptables-alerter-8znkx" Apr 22 17:53:02.550332 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550306 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9hmt\" (UniqueName: \"kubernetes.io/projected/1273b1fd-25f6-4315-a692-c599fb3e48b7-kube-api-access-n9hmt\") pod \"network-metrics-daemon-t6kpj\" (UID: \"1273b1fd-25f6-4315-a692-c599fb3e48b7\") " pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:02.550332 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550322 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ca34e7e4-d295-4bc4-adff-31d08074df10-host-slash\") pod \"iptables-alerter-8znkx\" (UID: \"ca34e7e4-d295-4bc4-adff-31d08074df10\") " pod="openshift-network-operator/iptables-alerter-8znkx" Apr 22 17:53:02.550878 
ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550340 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/221a3ce4-df39-49c4-9142-6acb37f99613-multus-daemon-config\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550370 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-run-openvswitch\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550397 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twcff\" (UniqueName: \"kubernetes.io/projected/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-kube-api-access-twcff\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550447 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ca34e7e4-d295-4bc4-adff-31d08074df10-iptables-alerter-script\") pod \"iptables-alerter-8znkx\" (UID: \"ca34e7e4-d295-4bc4-adff-31d08074df10\") " pod="openshift-network-operator/iptables-alerter-8znkx" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550472 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vncd5\" (UniqueName: \"kubernetes.io/projected/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-kube-api-access-vncd5\") pod 
\"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550588 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkbqj\" (UniqueName: \"kubernetes.io/projected/dfd9fa90-7a02-4429-a3c2-c939fa96e48e-kube-api-access-qkbqj\") pod \"node-ca-zs8rm\" (UID: \"dfd9fa90-7a02-4429-a3c2-c939fa96e48e\") " pod="openshift-image-registry/node-ca-zs8rm" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550622 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5b6cdf0f-46d3-4ba2-8a30-8314baac3007-agent-certs\") pod \"konnectivity-agent-2gj6v\" (UID: \"5b6cdf0f-46d3-4ba2-8a30-8314baac3007\") " pod="kube-system/konnectivity-agent-2gj6v" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550647 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wwsb\" (UniqueName: \"kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb\") pod \"network-check-target-kbkmw\" (UID: \"3c596acd-7332-4aab-afbb-73b8773fb825\") " pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550685 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-cni-bin\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550714 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-modprobe-d\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550739 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8432b695-5ba0-4b5f-bf6e-aea43e93c1a0-hosts-file\") pod \"node-resolver-h4787\" (UID: \"8432b695-5ba0-4b5f-bf6e-aea43e93c1a0\") " pod="openshift-dns/node-resolver-h4787" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550764 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-system-cni-dir\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550810 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-system-cni-dir\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550827 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8432b695-5ba0-4b5f-bf6e-aea43e93c1a0-hosts-file\") pod \"node-resolver-h4787\" (UID: \"8432b695-5ba0-4b5f-bf6e-aea43e93c1a0\") " pod="openshift-dns/node-resolver-h4787" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550850 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-modprobe-d\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.550878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550864 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-os-release\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550894 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-cnibin\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550923 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-multus-conf-dir\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550957 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m8h4\" (UniqueName: \"kubernetes.io/projected/221a3ce4-df39-49c4-9142-6acb37f99613-kube-api-access-5m8h4\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: 
I0422 17:53:02.550960 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-os-release\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.550985 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-var-lib-openvswitch\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551020 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-kubernetes\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551051 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-sysctl-conf\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551081 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-sys\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.551591 
ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551087 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ca34e7e4-d295-4bc4-adff-31d08074df10-iptables-alerter-script\") pod \"iptables-alerter-8znkx\" (UID: \"ca34e7e4-d295-4bc4-adff-31d08074df10\") " pod="openshift-network-operator/iptables-alerter-8znkx" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551095 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-kubernetes\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551105 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-systemd\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551133 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-lib-modules\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551155 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-sys\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.551591 
ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551160 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/221a3ce4-df39-49c4-9142-6acb37f99613-cni-binary-copy\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551172 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-sysctl-conf\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551187 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-etc-kubernetes\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.551591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551172 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-etc-systemd\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551226 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-slash\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 
17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551254 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-ovnkube-config\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551267 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-lib-modules\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551283 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/16541937-3840-42c5-8bc6-336bcf92d918-etc-tuned\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551341 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551373 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs\") pod \"network-metrics-daemon-t6kpj\" (UID: \"1273b1fd-25f6-4315-a692-c599fb3e48b7\") " 
pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551397 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-registration-dir\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551422 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-run-k8s-cni-cncf-io\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551489 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-registration-dir\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:02.551515 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551620 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-hostroot\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.552343 ip-10-0-128-219 
kubenswrapper[2564]: E0422 17:53:02.551633 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs podName:1273b1fd-25f6-4315-a692-c599fb3e48b7 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:03.051608127 +0000 UTC m=+3.110511268 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs") pod "network-metrics-daemon-t6kpj" (UID: "1273b1fd-25f6-4315-a692-c599fb3e48b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551640 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551697 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-run-multus-certs\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551725 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-host\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551715 2564 swap_util.go:74] "error creating dir to 
test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:53:02.552343 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551754 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pclkb\" (UniqueName: \"kubernetes.io/projected/8432b695-5ba0-4b5f-bf6e-aea43e93c1a0-kube-api-access-pclkb\") pod \"node-resolver-h4787\" (UID: \"8432b695-5ba0-4b5f-bf6e-aea43e93c1a0\") " pod="openshift-dns/node-resolver-h4787" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551812 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-cni-binary-copy\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551841 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-socket-dir\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551861 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-host\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551867 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-etc-openvswitch\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551891 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-run-ovn\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551918 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-var-lib-kubelet\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551947 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/16541937-3840-42c5-8bc6-336bcf92d918-tmp\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551971 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cd6h7\" (UniqueName: \"kubernetes.io/projected/ca34e7e4-d295-4bc4-adff-31d08074df10-kube-api-access-cd6h7\") pod \"iptables-alerter-8znkx\" (UID: \"ca34e7e4-d295-4bc4-adff-31d08074df10\") " pod="openshift-network-operator/iptables-alerter-8znkx" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551994 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-socket-dir\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.551997 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dfd9fa90-7a02-4429-a3c2-c939fa96e48e-serviceca\") pod \"node-ca-zs8rm\" (UID: \"dfd9fa90-7a02-4429-a3c2-c939fa96e48e\") " pod="openshift-image-registry/node-ca-zs8rm" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552035 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-systemd-units\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552063 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-etc-selinux\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552069 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16541937-3840-42c5-8bc6-336bcf92d918-var-lib-kubelet\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552089 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bjlq\" (UniqueName: \"kubernetes.io/projected/a98e4a61-2b2f-4865-bb16-7be0e996db98-kube-api-access-6bjlq\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552115 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-system-cni-dir\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552143 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-multus-socket-dir-parent\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.553094 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552170 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-run-systemd\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552195 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-cni-netd\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552222 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-cnibin\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552245 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-os-release\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552265 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-etc-selinux\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552289 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-run-netns\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552315 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-var-lib-kubelet\") pod 
\"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552350 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-node-log\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552364 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-cnibin\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552382 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552381 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-cni-binary-copy\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552471 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-log-socket\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552507 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgmd7\" (UniqueName: \"kubernetes.io/projected/16541937-3840-42c5-8bc6-336bcf92d918-kube-api-access-fgmd7\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552532 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8432b695-5ba0-4b5f-bf6e-aea43e93c1a0-tmp-dir\") pod \"node-resolver-h4787\" (UID: \"8432b695-5ba0-4b5f-bf6e-aea43e93c1a0\") " pod="openshift-dns/node-resolver-h4787" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552556 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552580 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552605 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfd9fa90-7a02-4429-a3c2-c939fa96e48e-host\") pod \"node-ca-zs8rm\" (UID: \"dfd9fa90-7a02-4429-a3c2-c939fa96e48e\") " pod="openshift-image-registry/node-ca-zs8rm" Apr 22 17:53:02.553860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552628 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-multus-cni-dir\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.554590 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552682 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-sys-fs\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.554590 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552706 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-run-netns\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.554590 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552746 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 
17:53:02.554590 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552828 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.554590 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552851 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.554590 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.552890 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a98e4a61-2b2f-4865-bb16-7be0e996db98-sys-fs\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.554590 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.553036 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.554590 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.553336 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8432b695-5ba0-4b5f-bf6e-aea43e93c1a0-tmp-dir\") pod \"node-resolver-h4787\" (UID: 
\"8432b695-5ba0-4b5f-bf6e-aea43e93c1a0\") " pod="openshift-dns/node-resolver-h4787" Apr 22 17:53:02.554590 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.554159 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-219.ec2.internal" event={"ID":"e8d95ccda1ded644c4c87a2ac89e475b","Type":"ContainerStarted","Data":"3b769b417f700a46f640882ca69faa36902cbc84aea0052ce54b76e673b42a47"} Apr 22 17:53:02.555353 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.555321 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal" event={"ID":"5a8c72f19fe67933975dcb887d58f7db","Type":"ContainerStarted","Data":"4e972b65e60128cff7cdf47b9d7e0f0d0f9a0e2084fe6b05f1c3df5e46b842b3"} Apr 22 17:53:02.555536 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.555507 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/16541937-3840-42c5-8bc6-336bcf92d918-tmp\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.555612 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.555584 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/16541937-3840-42c5-8bc6-336bcf92d918-etc-tuned\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.575974 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.575952 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgmd7\" (UniqueName: \"kubernetes.io/projected/16541937-3840-42c5-8bc6-336bcf92d918-kube-api-access-fgmd7\") pod \"tuned-f69kk\" (UID: \"16541937-3840-42c5-8bc6-336bcf92d918\") " pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 
17:53:02.576288 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.576264 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vncd5\" (UniqueName: \"kubernetes.io/projected/a3fc6e2c-71c7-4da3-b348-d4a5b505f72a-kube-api-access-vncd5\") pod \"multus-additional-cni-plugins-kz84r\" (UID: \"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a\") " pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.576921 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.576882 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bjlq\" (UniqueName: \"kubernetes.io/projected/a98e4a61-2b2f-4865-bb16-7be0e996db98-kube-api-access-6bjlq\") pod \"aws-ebs-csi-driver-node-rf9dm\" (UID: \"a98e4a61-2b2f-4865-bb16-7be0e996db98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.577363 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.577290 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd6h7\" (UniqueName: \"kubernetes.io/projected/ca34e7e4-d295-4bc4-adff-31d08074df10-kube-api-access-cd6h7\") pod \"iptables-alerter-8znkx\" (UID: \"ca34e7e4-d295-4bc4-adff-31d08074df10\") " pod="openshift-network-operator/iptables-alerter-8znkx" Apr 22 17:53:02.578253 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.578209 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pclkb\" (UniqueName: \"kubernetes.io/projected/8432b695-5ba0-4b5f-bf6e-aea43e93c1a0-kube-api-access-pclkb\") pod \"node-resolver-h4787\" (UID: \"8432b695-5ba0-4b5f-bf6e-aea43e93c1a0\") " pod="openshift-dns/node-resolver-h4787" Apr 22 17:53:02.578329 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.578306 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9hmt\" (UniqueName: \"kubernetes.io/projected/1273b1fd-25f6-4315-a692-c599fb3e48b7-kube-api-access-n9hmt\") pod 
\"network-metrics-daemon-t6kpj\" (UID: \"1273b1fd-25f6-4315-a692-c599fb3e48b7\") " pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:02.653508 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653478 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/221a3ce4-df39-49c4-9142-6acb37f99613-multus-daemon-config\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.653648 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653518 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-run-openvswitch\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.653648 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653542 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twcff\" (UniqueName: \"kubernetes.io/projected/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-kube-api-access-twcff\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.653648 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653567 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkbqj\" (UniqueName: \"kubernetes.io/projected/dfd9fa90-7a02-4429-a3c2-c939fa96e48e-kube-api-access-qkbqj\") pod \"node-ca-zs8rm\" (UID: \"dfd9fa90-7a02-4429-a3c2-c939fa96e48e\") " pod="openshift-image-registry/node-ca-zs8rm" Apr 22 17:53:02.653648 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653589 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/5b6cdf0f-46d3-4ba2-8a30-8314baac3007-agent-certs\") pod \"konnectivity-agent-2gj6v\" (UID: \"5b6cdf0f-46d3-4ba2-8a30-8314baac3007\") " pod="kube-system/konnectivity-agent-2gj6v" Apr 22 17:53:02.653648 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653596 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-run-openvswitch\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.653648 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653612 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwsb\" (UniqueName: \"kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb\") pod \"network-check-target-kbkmw\" (UID: \"3c596acd-7332-4aab-afbb-73b8773fb825\") " pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:02.653874 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653651 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-cni-bin\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.653874 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653714 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-cni-bin\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.653874 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653831 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-cnibin\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.653874 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653869 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-multus-conf-dir\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.654042 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653903 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m8h4\" (UniqueName: \"kubernetes.io/projected/221a3ce4-df39-49c4-9142-6acb37f99613-kube-api-access-5m8h4\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.654042 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653946 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-cnibin\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.654042 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653946 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-multus-conf-dir\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.654042 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.653971 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-var-lib-openvswitch\") 
pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.654042 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654009 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/221a3ce4-df39-49c4-9142-6acb37f99613-cni-binary-copy\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.654042 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654034 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-etc-kubernetes\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654055 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-slash\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654056 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-var-lib-openvswitch\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654078 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-ovnkube-config\") pod \"ovnkube-node-9vgh6\" 
(UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654099 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-etc-kubernetes\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654108 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-slash\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654111 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/221a3ce4-df39-49c4-9142-6acb37f99613-multus-daemon-config\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654125 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-run-k8s-cni-cncf-io\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654154 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-hostroot\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " 
pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654162 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-run-k8s-cni-cncf-io\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654176 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-run-multus-certs\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654203 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-etc-openvswitch\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654207 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-hostroot\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654235 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-run-ovn\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 
17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654248 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-run-multus-certs\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654264 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dfd9fa90-7a02-4429-a3c2-c939fa96e48e-serviceca\") pod \"node-ca-zs8rm\" (UID: \"dfd9fa90-7a02-4429-a3c2-c939fa96e48e\") " pod="openshift-image-registry/node-ca-zs8rm" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654287 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-run-ovn\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.654320 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654285 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-etc-openvswitch\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654292 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-systemd-units\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655115 ip-10-0-128-219 
kubenswrapper[2564]: I0422 17:53:02.654402 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-system-cni-dir\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654417 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-systemd-units\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654427 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-multus-socket-dir-parent\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654453 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-run-systemd\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654462 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-system-cni-dir\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654474 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-cni-netd\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654495 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-multus-socket-dir-parent\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654499 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-os-release\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654510 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/221a3ce4-df39-49c4-9142-6acb37f99613-cni-binary-copy\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654525 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-run-netns\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654538 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-cni-netd\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654562 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-var-lib-kubelet\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654574 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-os-release\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654517 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-run-systemd\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654562 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-run-netns\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654590 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-node-log\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655115 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654597 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-var-lib-kubelet\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654588 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-ovnkube-config\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654637 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-node-log\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654690 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-log-socket\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654722 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/dfd9fa90-7a02-4429-a3c2-c939fa96e48e-host\") pod \"node-ca-zs8rm\" (UID: \"dfd9fa90-7a02-4429-a3c2-c939fa96e48e\") " pod="openshift-image-registry/node-ca-zs8rm" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654736 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-log-socket\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654691 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dfd9fa90-7a02-4429-a3c2-c939fa96e48e-serviceca\") pod \"node-ca-zs8rm\" (UID: \"dfd9fa90-7a02-4429-a3c2-c939fa96e48e\") " pod="openshift-image-registry/node-ca-zs8rm" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654747 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-multus-cni-dir\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654791 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-multus-cni-dir\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654796 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfd9fa90-7a02-4429-a3c2-c939fa96e48e-host\") pod \"node-ca-zs8rm\" (UID: 
\"dfd9fa90-7a02-4429-a3c2-c939fa96e48e\") " pod="openshift-image-registry/node-ca-zs8rm" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654811 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-run-netns\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654849 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654876 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-var-lib-cni-bin\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654897 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-run-netns\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654901 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-var-lib-cni-multus\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654925 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654934 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-run-ovn-kubernetes\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654938 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-var-lib-cni-multus\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.655697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654967 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-ovn-node-metrics-cert\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.656377 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.654975 2564 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-run-ovn-kubernetes\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.656377 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.655014 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/221a3ce4-df39-49c4-9142-6acb37f99613-host-var-lib-cni-bin\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.656377 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.655043 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5b6cdf0f-46d3-4ba2-8a30-8314baac3007-konnectivity-ca\") pod \"konnectivity-agent-2gj6v\" (UID: \"5b6cdf0f-46d3-4ba2-8a30-8314baac3007\") " pod="kube-system/konnectivity-agent-2gj6v" Apr 22 17:53:02.656377 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.655172 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-kubelet\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.656377 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.655201 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-env-overrides\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.656377 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.655225 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-ovnkube-script-lib\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.656377 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.655277 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-host-kubelet\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.656377 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.655583 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5b6cdf0f-46d3-4ba2-8a30-8314baac3007-konnectivity-ca\") pod \"konnectivity-agent-2gj6v\" (UID: \"5b6cdf0f-46d3-4ba2-8a30-8314baac3007\") " pod="kube-system/konnectivity-agent-2gj6v" Apr 22 17:53:02.656377 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.655655 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-env-overrides\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.656377 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.655748 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-ovnkube-script-lib\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.656717 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.656378 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5b6cdf0f-46d3-4ba2-8a30-8314baac3007-agent-certs\") pod \"konnectivity-agent-2gj6v\" (UID: \"5b6cdf0f-46d3-4ba2-8a30-8314baac3007\") " pod="kube-system/konnectivity-agent-2gj6v" Apr 22 17:53:02.657105 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.657084 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-ovn-node-metrics-cert\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.664400 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:02.664354 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:02.664400 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:02.664370 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:02.664400 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:02.664381 2564 projected.go:194] Error preparing data for projected volume kube-api-access-8wwsb for pod openshift-network-diagnostics/network-check-target-kbkmw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:02.664569 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:02.664442 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb podName:3c596acd-7332-4aab-afbb-73b8773fb825 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:53:03.164425691 +0000 UTC m=+3.223328823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8wwsb" (UniqueName: "kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb") pod "network-check-target-kbkmw" (UID: "3c596acd-7332-4aab-afbb-73b8773fb825") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:02.664791 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.664768 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:02.666681 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.666650 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twcff\" (UniqueName: \"kubernetes.io/projected/9d8dc6eb-7c99-4548-8b6d-fe9f31000478-kube-api-access-twcff\") pod \"ovnkube-node-9vgh6\" (UID: \"9d8dc6eb-7c99-4548-8b6d-fe9f31000478\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.666971 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.666930 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m8h4\" (UniqueName: \"kubernetes.io/projected/221a3ce4-df39-49c4-9142-6acb37f99613-kube-api-access-5m8h4\") pod \"multus-7wtdq\" (UID: \"221a3ce4-df39-49c4-9142-6acb37f99613\") " pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.667490 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.667474 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkbqj\" (UniqueName: \"kubernetes.io/projected/dfd9fa90-7a02-4429-a3c2-c939fa96e48e-kube-api-access-qkbqj\") pod \"node-ca-zs8rm\" (UID: \"dfd9fa90-7a02-4429-a3c2-c939fa96e48e\") " pod="openshift-image-registry/node-ca-zs8rm" Apr 22 17:53:02.741418 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.741389 2564 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8znkx" Apr 22 17:53:02.742567 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.742547 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:02.747889 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.747868 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kz84r" Apr 22 17:53:02.757525 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.757506 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" Apr 22 17:53:02.763092 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.763076 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-f69kk" Apr 22 17:53:02.770581 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.770564 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h4787" Apr 22 17:53:02.775098 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.775079 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zs8rm" Apr 22 17:53:02.781733 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.781710 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7wtdq" Apr 22 17:53:02.787207 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.787192 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:02.792833 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:02.792809 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-2gj6v" Apr 22 17:53:03.057311 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.057282 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs\") pod \"network-metrics-daemon-t6kpj\" (UID: \"1273b1fd-25f6-4315-a692-c599fb3e48b7\") " pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:03.057494 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:03.057401 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:03.057494 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:03.057463 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs podName:1273b1fd-25f6-4315-a692-c599fb3e48b7 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:04.057446194 +0000 UTC m=+4.116349316 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs") pod "network-metrics-daemon-t6kpj" (UID: "1273b1fd-25f6-4315-a692-c599fb3e48b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:03.233961 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:03.233936 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b6cdf0f_46d3_4ba2_8a30_8314baac3007.slice/crio-6704914f5489c1911cdead092d47007d990f9e4da34a9888db63effd0472f48d WatchSource:0}: Error finding container 6704914f5489c1911cdead092d47007d990f9e4da34a9888db63effd0472f48d: Status 404 returned error can't find the container with id 6704914f5489c1911cdead092d47007d990f9e4da34a9888db63effd0472f48d Apr 22 17:53:03.235690 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:03.235547 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d8dc6eb_7c99_4548_8b6d_fe9f31000478.slice/crio-31524d9f7be0d6035e38333ade9fad310eae9aa659285e67622999c9f26a0851 WatchSource:0}: Error finding container 31524d9f7be0d6035e38333ade9fad310eae9aa659285e67622999c9f26a0851: Status 404 returned error can't find the container with id 31524d9f7be0d6035e38333ade9fad310eae9aa659285e67622999c9f26a0851 Apr 22 17:53:03.238730 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:03.238709 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod221a3ce4_df39_49c4_9142_6acb37f99613.slice/crio-bd53ab9a1ee2258d119c4c122bdecb2d0dc21f233e88e65ddc0a0b0ba2acbb9f WatchSource:0}: Error finding container bd53ab9a1ee2258d119c4c122bdecb2d0dc21f233e88e65ddc0a0b0ba2acbb9f: Status 404 returned error can't find the container with id bd53ab9a1ee2258d119c4c122bdecb2d0dc21f233e88e65ddc0a0b0ba2acbb9f Apr 22 17:53:03.258239 
ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.258221 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwsb\" (UniqueName: \"kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb\") pod \"network-check-target-kbkmw\" (UID: \"3c596acd-7332-4aab-afbb-73b8773fb825\") " pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:03.258380 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:03.258359 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8432b695_5ba0_4b5f_bf6e_aea43e93c1a0.slice/crio-40044ede8dc0f1bf0025a2bb1d750891d4e6d8094fe4aa1d67806fd876d00810 WatchSource:0}: Error finding container 40044ede8dc0f1bf0025a2bb1d750891d4e6d8094fe4aa1d67806fd876d00810: Status 404 returned error can't find the container with id 40044ede8dc0f1bf0025a2bb1d750891d4e6d8094fe4aa1d67806fd876d00810 Apr 22 17:53:03.258380 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:03.258371 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:03.258527 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:03.258390 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:03.258527 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:03.258401 2564 projected.go:194] Error preparing data for projected volume kube-api-access-8wwsb for pod openshift-network-diagnostics/network-check-target-kbkmw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:03.258527 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:03.258464 2564 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb podName:3c596acd-7332-4aab-afbb-73b8773fb825 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:04.258443149 +0000 UTC m=+4.317346282 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8wwsb" (UniqueName: "kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb") pod "network-check-target-kbkmw" (UID: "3c596acd-7332-4aab-afbb-73b8773fb825") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:03.258937 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:03.258881 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca34e7e4_d295_4bc4_adff_31d08074df10.slice/crio-60973dee2627269cb8d07374f38aec090a847c4242401693e54929c034ab08dd WatchSource:0}: Error finding container 60973dee2627269cb8d07374f38aec090a847c4242401693e54929c034ab08dd: Status 404 returned error can't find the container with id 60973dee2627269cb8d07374f38aec090a847c4242401693e54929c034ab08dd Apr 22 17:53:03.259829 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:03.259794 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16541937_3840_42c5_8bc6_336bcf92d918.slice/crio-f76631432f4a21f09e165a351be5c7eb76539f18f66b8b64eeb2d3a1f9e0af6f WatchSource:0}: Error finding container f76631432f4a21f09e165a351be5c7eb76539f18f66b8b64eeb2d3a1f9e0af6f: Status 404 returned error can't find the container with id f76631432f4a21f09e165a351be5c7eb76539f18f66b8b64eeb2d3a1f9e0af6f Apr 22 17:53:03.260796 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:03.260743 2564 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3fc6e2c_71c7_4da3_b348_d4a5b505f72a.slice/crio-3e11952eab4d261bc9ae722a5b3fe009d31a408dbe8b1abb232edabb7299df24 WatchSource:0}: Error finding container 3e11952eab4d261bc9ae722a5b3fe009d31a408dbe8b1abb232edabb7299df24: Status 404 returned error can't find the container with id 3e11952eab4d261bc9ae722a5b3fe009d31a408dbe8b1abb232edabb7299df24 Apr 22 17:53:03.261887 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:03.261597 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda98e4a61_2b2f_4865_bb16_7be0e996db98.slice/crio-e46d2376427b99c3dea6e152c65c7f56c075cd7d94df9a8e1072351a5cb5f38b WatchSource:0}: Error finding container e46d2376427b99c3dea6e152c65c7f56c075cd7d94df9a8e1072351a5cb5f38b: Status 404 returned error can't find the container with id e46d2376427b99c3dea6e152c65c7f56c075cd7d94df9a8e1072351a5cb5f38b Apr 22 17:53:03.262438 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:03.262418 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfd9fa90_7a02_4429_a3c2_c939fa96e48e.slice/crio-4fa5c4fb9353bc5cefe48556984d469df8313553983ebef090160cf50a274ed2 WatchSource:0}: Error finding container 4fa5c4fb9353bc5cefe48556984d469df8313553983ebef090160cf50a274ed2: Status 404 returned error can't find the container with id 4fa5c4fb9353bc5cefe48556984d469df8313553983ebef090160cf50a274ed2 Apr 22 17:53:03.482279 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.482247 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:48:01 +0000 UTC" deadline="2027-11-09 09:37:56.093125632 +0000 UTC" Apr 22 17:53:03.482279 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.482274 2564 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13575h44m52.610854377s" Apr 22 17:53:03.552238 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.552067 2564 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:03.558659 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.558604 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kz84r" event={"ID":"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a","Type":"ContainerStarted","Data":"3e11952eab4d261bc9ae722a5b3fe009d31a408dbe8b1abb232edabb7299df24"} Apr 22 17:53:03.559686 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.559642 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-f69kk" event={"ID":"16541937-3840-42c5-8bc6-336bcf92d918","Type":"ContainerStarted","Data":"f76631432f4a21f09e165a351be5c7eb76539f18f66b8b64eeb2d3a1f9e0af6f"} Apr 22 17:53:03.560659 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.560634 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h4787" event={"ID":"8432b695-5ba0-4b5f-bf6e-aea43e93c1a0","Type":"ContainerStarted","Data":"40044ede8dc0f1bf0025a2bb1d750891d4e6d8094fe4aa1d67806fd876d00810"} Apr 22 17:53:03.561934 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.561910 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7wtdq" event={"ID":"221a3ce4-df39-49c4-9142-6acb37f99613","Type":"ContainerStarted","Data":"bd53ab9a1ee2258d119c4c122bdecb2d0dc21f233e88e65ddc0a0b0ba2acbb9f"} Apr 22 17:53:03.563020 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.562994 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2gj6v" event={"ID":"5b6cdf0f-46d3-4ba2-8a30-8314baac3007","Type":"ContainerStarted","Data":"6704914f5489c1911cdead092d47007d990f9e4da34a9888db63effd0472f48d"} Apr 22 17:53:03.564893 ip-10-0-128-219 
kubenswrapper[2564]: I0422 17:53:03.564648 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-219.ec2.internal" event={"ID":"e8d95ccda1ded644c4c87a2ac89e475b","Type":"ContainerStarted","Data":"c24ca0d20d77853c449ef031579b207628eb9396ee3406d56b8f1d5c8ea16c09"} Apr 22 17:53:03.568848 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.568767 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zs8rm" event={"ID":"dfd9fa90-7a02-4429-a3c2-c939fa96e48e","Type":"ContainerStarted","Data":"4fa5c4fb9353bc5cefe48556984d469df8313553983ebef090160cf50a274ed2"} Apr 22 17:53:03.569724 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.569696 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" event={"ID":"a98e4a61-2b2f-4865-bb16-7be0e996db98","Type":"ContainerStarted","Data":"e46d2376427b99c3dea6e152c65c7f56c075cd7d94df9a8e1072351a5cb5f38b"} Apr 22 17:53:03.570730 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.570709 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8znkx" event={"ID":"ca34e7e4-d295-4bc4-adff-31d08074df10","Type":"ContainerStarted","Data":"60973dee2627269cb8d07374f38aec090a847c4242401693e54929c034ab08dd"} Apr 22 17:53:03.571981 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.571960 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" event={"ID":"9d8dc6eb-7c99-4548-8b6d-fe9f31000478","Type":"ContainerStarted","Data":"31524d9f7be0d6035e38333ade9fad310eae9aa659285e67622999c9f26a0851"} Apr 22 17:53:03.577609 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:03.577523 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-219.ec2.internal" podStartSLOduration=1.577509245 podStartE2EDuration="1.577509245s" 
podCreationTimestamp="2026-04-22 17:53:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:03.576942133 +0000 UTC m=+3.635845279" watchObservedRunningTime="2026-04-22 17:53:03.577509245 +0000 UTC m=+3.636412389" Apr 22 17:53:04.064528 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:04.063964 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs\") pod \"network-metrics-daemon-t6kpj\" (UID: \"1273b1fd-25f6-4315-a692-c599fb3e48b7\") " pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:04.064528 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:04.064109 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:04.064528 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:04.064172 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs podName:1273b1fd-25f6-4315-a692-c599fb3e48b7 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:06.064153791 +0000 UTC m=+6.123056916 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs") pod "network-metrics-daemon-t6kpj" (UID: "1273b1fd-25f6-4315-a692-c599fb3e48b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:04.266549 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:04.266511 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwsb\" (UniqueName: \"kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb\") pod \"network-check-target-kbkmw\" (UID: \"3c596acd-7332-4aab-afbb-73b8773fb825\") " pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:04.266727 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:04.266688 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:04.266727 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:04.266711 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:04.266727 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:04.266724 2564 projected.go:194] Error preparing data for projected volume kube-api-access-8wwsb for pod openshift-network-diagnostics/network-check-target-kbkmw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:04.266895 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:04.266782 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb podName:3c596acd-7332-4aab-afbb-73b8773fb825 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:53:06.266762044 +0000 UTC m=+6.325665179 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8wwsb" (UniqueName: "kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb") pod "network-check-target-kbkmw" (UID: "3c596acd-7332-4aab-afbb-73b8773fb825") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:04.550166 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:04.549701 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:04.550166 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:04.549835 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:04.551095 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:04.550901 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:04.551095 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:04.551017 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:04.591219 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:04.590399 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal" event={"ID":"5a8c72f19fe67933975dcb887d58f7db","Type":"ContainerStarted","Data":"a561bd0dd5c4595138c3f63e15692c2389d74ccf8aff0f4f5f017e0c0ece3c89"} Apr 22 17:53:05.604337 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:05.603593 2564 generic.go:358] "Generic (PLEG): container finished" podID="5a8c72f19fe67933975dcb887d58f7db" containerID="a561bd0dd5c4595138c3f63e15692c2389d74ccf8aff0f4f5f017e0c0ece3c89" exitCode=0 Apr 22 17:53:05.604337 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:05.603643 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal" event={"ID":"5a8c72f19fe67933975dcb887d58f7db","Type":"ContainerDied","Data":"a561bd0dd5c4595138c3f63e15692c2389d74ccf8aff0f4f5f017e0c0ece3c89"} Apr 22 17:53:06.079193 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:06.079156 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs\") pod \"network-metrics-daemon-t6kpj\" (UID: \"1273b1fd-25f6-4315-a692-c599fb3e48b7\") " pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:06.079374 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:06.079319 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:06.079454 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:06.079383 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs 
podName:1273b1fd-25f6-4315-a692-c599fb3e48b7 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:10.079363351 +0000 UTC m=+10.138266477 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs") pod "network-metrics-daemon-t6kpj" (UID: "1273b1fd-25f6-4315-a692-c599fb3e48b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:06.280802 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:06.280759 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwsb\" (UniqueName: \"kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb\") pod \"network-check-target-kbkmw\" (UID: \"3c596acd-7332-4aab-afbb-73b8773fb825\") " pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:06.280966 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:06.280922 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:06.280966 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:06.280937 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:06.280966 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:06.280946 2564 projected.go:194] Error preparing data for projected volume kube-api-access-8wwsb for pod openshift-network-diagnostics/network-check-target-kbkmw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:06.281120 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:06.280991 2564 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb podName:3c596acd-7332-4aab-afbb-73b8773fb825 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:10.280978106 +0000 UTC m=+10.339881225 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8wwsb" (UniqueName: "kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb") pod "network-check-target-kbkmw" (UID: "3c596acd-7332-4aab-afbb-73b8773fb825") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:06.551660 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:06.550477 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:06.551660 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:06.550650 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:06.551660 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:06.550800 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:06.551660 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:06.550921 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:08.548860 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:08.548828 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:08.549301 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:08.548960 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:08.549301 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:08.549015 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:08.549301 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:08.549104 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:10.113707 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:10.113587 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs\") pod \"network-metrics-daemon-t6kpj\" (UID: \"1273b1fd-25f6-4315-a692-c599fb3e48b7\") " pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:10.114182 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:10.113791 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:10.114182 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:10.113891 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs podName:1273b1fd-25f6-4315-a692-c599fb3e48b7 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:18.113869944 +0000 UTC m=+18.172773071 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs") pod "network-metrics-daemon-t6kpj" (UID: "1273b1fd-25f6-4315-a692-c599fb3e48b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:10.315315 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:10.315202 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwsb\" (UniqueName: \"kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb\") pod \"network-check-target-kbkmw\" (UID: \"3c596acd-7332-4aab-afbb-73b8773fb825\") " pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:10.315521 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:10.315410 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:10.315521 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:10.315433 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:10.315521 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:10.315446 2564 projected.go:194] Error preparing data for projected volume kube-api-access-8wwsb for pod openshift-network-diagnostics/network-check-target-kbkmw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:10.315521 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:10.315514 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb podName:3c596acd-7332-4aab-afbb-73b8773fb825 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:53:18.315496603 +0000 UTC m=+18.374399728 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8wwsb" (UniqueName: "kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb") pod "network-check-target-kbkmw" (UID: "3c596acd-7332-4aab-afbb-73b8773fb825") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:10.550655 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:10.550054 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:10.550655 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:10.550161 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:10.550655 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:10.550511 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:10.550655 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:10.550617 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:12.549629 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:12.549595 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:12.550160 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:12.549734 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:12.550160 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:12.550044 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:12.550160 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:12.550141 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:14.548783 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:14.548750 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:14.549243 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:14.548795 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:14.549243 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:14.548888 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:14.549243 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:14.549001 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:16.549724 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:16.549688 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:16.550128 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:16.549806 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:16.550128 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:16.549861 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:16.550128 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:16.549958 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:18.170727 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:18.170694 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs\") pod \"network-metrics-daemon-t6kpj\" (UID: \"1273b1fd-25f6-4315-a692-c599fb3e48b7\") " pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:18.171231 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:18.170845 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:18.171231 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:18.170912 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs podName:1273b1fd-25f6-4315-a692-c599fb3e48b7 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:34.170897179 +0000 UTC m=+34.229800300 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs") pod "network-metrics-daemon-t6kpj" (UID: "1273b1fd-25f6-4315-a692-c599fb3e48b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:18.372353 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:18.372318 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwsb\" (UniqueName: \"kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb\") pod \"network-check-target-kbkmw\" (UID: \"3c596acd-7332-4aab-afbb-73b8773fb825\") " pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:18.372524 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:18.372505 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:18.372587 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:18.372531 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:18.372587 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:18.372544 2564 projected.go:194] Error preparing data for projected volume kube-api-access-8wwsb for pod openshift-network-diagnostics/network-check-target-kbkmw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:18.372657 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:18.372609 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb podName:3c596acd-7332-4aab-afbb-73b8773fb825 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:53:34.372589896 +0000 UTC m=+34.431493034 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8wwsb" (UniqueName: "kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb") pod "network-check-target-kbkmw" (UID: "3c596acd-7332-4aab-afbb-73b8773fb825") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:18.549617 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:18.549545 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:18.549763 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:18.549545 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:18.549763 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:18.549662 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:18.549763 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:18.549750 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:20.550039 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.549712 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:20.550624 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.549754 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:20.550624 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:20.550127 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:20.550624 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:20.550200 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:20.630188 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.630162 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal" event={"ID":"5a8c72f19fe67933975dcb887d58f7db","Type":"ContainerStarted","Data":"65b37df7628f9c03173174a47696741ac94c645a5ada6f7831e776e0c6dd334a"} Apr 22 17:53:20.631700 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.631589 2564 generic.go:358] "Generic (PLEG): container finished" podID="a3fc6e2c-71c7-4da3-b348-d4a5b505f72a" containerID="42187ed994c0962c79d7f509245234b799ccfd48f2767ff3cf705bf116013117" exitCode=0 Apr 22 17:53:20.631801 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.631698 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kz84r" event={"ID":"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a","Type":"ContainerDied","Data":"42187ed994c0962c79d7f509245234b799ccfd48f2767ff3cf705bf116013117"} Apr 22 17:53:20.633200 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.633156 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-f69kk" event={"ID":"16541937-3840-42c5-8bc6-336bcf92d918","Type":"ContainerStarted","Data":"4e8cc52e5ffa403af95a67702d2c2641f0e559b26a6a25b858bfd7479a742c64"} Apr 22 17:53:20.634474 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.634451 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h4787" event={"ID":"8432b695-5ba0-4b5f-bf6e-aea43e93c1a0","Type":"ContainerStarted","Data":"db632dfae3bf7887172059b669a6cef56ec77b46336bd54acc043bac374262d6"} Apr 22 17:53:20.635904 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.635872 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7wtdq" 
event={"ID":"221a3ce4-df39-49c4-9142-6acb37f99613","Type":"ContainerStarted","Data":"307e90fe749ecd9f46b34d056706ef3d71faceb2dc20210677267fa6f62355ab"} Apr 22 17:53:20.637095 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.637073 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2gj6v" event={"ID":"5b6cdf0f-46d3-4ba2-8a30-8314baac3007","Type":"ContainerStarted","Data":"3475bf455e9a9866fb418d89679079783e4397efc1a45dd6612c20a99b2c47ad"} Apr 22 17:53:20.639863 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.639825 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zs8rm" event={"ID":"dfd9fa90-7a02-4429-a3c2-c939fa96e48e","Type":"ContainerStarted","Data":"110169adbe7a7cb1badb405f4dc40e59dcc43d443ff3c0d49b1964783b9c356c"} Apr 22 17:53:20.641927 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.641908 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" event={"ID":"a98e4a61-2b2f-4865-bb16-7be0e996db98","Type":"ContainerStarted","Data":"080834cd0d83129719a1d41187e23b112ce2861a051c70a2a16551fbfa8a88bb"} Apr 22 17:53:20.644053 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.644031 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log" Apr 22 17:53:20.644355 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.644321 2564 generic.go:358] "Generic (PLEG): container finished" podID="9d8dc6eb-7c99-4548-8b6d-fe9f31000478" containerID="b161c77e992f9d108f20428a48f90d877a42b8f94175e5ad1701394b367aeefd" exitCode=1 Apr 22 17:53:20.644355 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.644353 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" 
event={"ID":"9d8dc6eb-7c99-4548-8b6d-fe9f31000478","Type":"ContainerStarted","Data":"a263fe280edeed0fe26a2c7ffeefaef8e2dbd5d161ffbf3fc05db9a138a7d9eb"} Apr 22 17:53:20.644505 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.644373 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" event={"ID":"9d8dc6eb-7c99-4548-8b6d-fe9f31000478","Type":"ContainerStarted","Data":"baa80bae2c5734e64aa3208b64bd67c116aa2dc0387db03fd4e270f357457cdc"} Apr 22 17:53:20.644505 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.644386 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" event={"ID":"9d8dc6eb-7c99-4548-8b6d-fe9f31000478","Type":"ContainerStarted","Data":"c7db7d248bd7fb9c16f17ce21dde87490c258e09b07d149906df91752445696a"} Apr 22 17:53:20.644505 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.644399 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" event={"ID":"9d8dc6eb-7c99-4548-8b6d-fe9f31000478","Type":"ContainerDied","Data":"b161c77e992f9d108f20428a48f90d877a42b8f94175e5ad1701394b367aeefd"} Apr 22 17:53:20.644505 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.644414 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" event={"ID":"9d8dc6eb-7c99-4548-8b6d-fe9f31000478","Type":"ContainerStarted","Data":"1123279a018c1f0f142f77c37f8a1c94bab2b844dacb7f7de97431938ed41bf3"} Apr 22 17:53:20.645878 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.645837 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-219.ec2.internal" podStartSLOduration=18.645823595 podStartE2EDuration="18.645823595s" podCreationTimestamp="2026-04-22 17:53:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 17:53:20.644838469 +0000 UTC m=+20.703741614" watchObservedRunningTime="2026-04-22 17:53:20.645823595 +0000 UTC m=+20.704726741" Apr 22 17:53:20.676973 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.676928 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-h4787" podStartSLOduration=4.113264693 podStartE2EDuration="20.676915569s" podCreationTimestamp="2026-04-22 17:53:00 +0000 UTC" firstStartedPulling="2026-04-22 17:53:03.260328622 +0000 UTC m=+3.319231757" lastFinishedPulling="2026-04-22 17:53:19.823979509 +0000 UTC m=+19.882882633" observedRunningTime="2026-04-22 17:53:20.676727393 +0000 UTC m=+20.735630545" watchObservedRunningTime="2026-04-22 17:53:20.676915569 +0000 UTC m=+20.735818705" Apr 22 17:53:20.692266 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.692212 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-f69kk" podStartSLOduration=4.131597031 podStartE2EDuration="20.692196974s" podCreationTimestamp="2026-04-22 17:53:00 +0000 UTC" firstStartedPulling="2026-04-22 17:53:03.265610659 +0000 UTC m=+3.324513781" lastFinishedPulling="2026-04-22 17:53:19.82621059 +0000 UTC m=+19.885113724" observedRunningTime="2026-04-22 17:53:20.692044845 +0000 UTC m=+20.750947987" watchObservedRunningTime="2026-04-22 17:53:20.692196974 +0000 UTC m=+20.751100117" Apr 22 17:53:20.705625 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.705584 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zs8rm" podStartSLOduration=4.156620071 podStartE2EDuration="20.7055724s" podCreationTimestamp="2026-04-22 17:53:00 +0000 UTC" firstStartedPulling="2026-04-22 17:53:03.265775266 +0000 UTC m=+3.324678389" lastFinishedPulling="2026-04-22 17:53:19.814727594 +0000 UTC m=+19.873630718" observedRunningTime="2026-04-22 17:53:20.705463649 +0000 UTC m=+20.764366789" 
watchObservedRunningTime="2026-04-22 17:53:20.7055724 +0000 UTC m=+20.764475543" Apr 22 17:53:20.722850 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.722806 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7wtdq" podStartSLOduration=4.152876642 podStartE2EDuration="20.722792396s" podCreationTimestamp="2026-04-22 17:53:00 +0000 UTC" firstStartedPulling="2026-04-22 17:53:03.257351296 +0000 UTC m=+3.316254416" lastFinishedPulling="2026-04-22 17:53:19.827267035 +0000 UTC m=+19.886170170" observedRunningTime="2026-04-22 17:53:20.722243065 +0000 UTC m=+20.781146209" watchObservedRunningTime="2026-04-22 17:53:20.722792396 +0000 UTC m=+20.781695539" Apr 22 17:53:20.737762 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.737725 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2gj6v" podStartSLOduration=11.965766874 podStartE2EDuration="20.737713667s" podCreationTimestamp="2026-04-22 17:53:00 +0000 UTC" firstStartedPulling="2026-04-22 17:53:03.236315178 +0000 UTC m=+3.295218297" lastFinishedPulling="2026-04-22 17:53:12.008261958 +0000 UTC m=+12.067165090" observedRunningTime="2026-04-22 17:53:20.7374878 +0000 UTC m=+20.796390944" watchObservedRunningTime="2026-04-22 17:53:20.737713667 +0000 UTC m=+20.796616809" Apr 22 17:53:20.898217 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:20.898182 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2gj6v" Apr 22 17:53:21.603283 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:21.603264 2564 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 17:53:21.647154 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:21.647119 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" 
event={"ID":"a98e4a61-2b2f-4865-bb16-7be0e996db98","Type":"ContainerStarted","Data":"a40e2ec9a91343442e91633c0fd198929b08c6fed73db495976c999357b08695"} Apr 22 17:53:21.648426 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:21.648402 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8znkx" event={"ID":"ca34e7e4-d295-4bc4-adff-31d08074df10","Type":"ContainerStarted","Data":"7a5077ee978bacb1c7c9bb8dd1429f6562b3991873cead8af24a9a86dd4f7be6"} Apr 22 17:53:21.651331 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:21.651310 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log" Apr 22 17:53:21.653081 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:21.652567 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" event={"ID":"9d8dc6eb-7c99-4548-8b6d-fe9f31000478","Type":"ContainerStarted","Data":"60b0d6ca6435aac4a426331d31d23593a9c337df7215d638a8cf4dca84922c4a"} Apr 22 17:53:21.666038 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:21.666003 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8znkx" podStartSLOduration=5.166486576 podStartE2EDuration="21.665993702s" podCreationTimestamp="2026-04-22 17:53:00 +0000 UTC" firstStartedPulling="2026-04-22 17:53:03.260439046 +0000 UTC m=+3.319342184" lastFinishedPulling="2026-04-22 17:53:19.759946175 +0000 UTC m=+19.818849310" observedRunningTime="2026-04-22 17:53:21.665752035 +0000 UTC m=+21.724655190" watchObservedRunningTime="2026-04-22 17:53:21.665993702 +0000 UTC m=+21.724896841" Apr 22 17:53:22.510788 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:22.510691 2564 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:53:21.603279151Z","UUID":"67f29665-f503-4a31-ac7a-88d9be213f6f","Handler":null,"Name":"","Endpoint":""} Apr 22 17:53:22.512396 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:22.512375 2564 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 17:53:22.512508 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:22.512402 2564 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 17:53:22.548736 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:22.548714 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:22.548859 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:22.548834 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:22.548937 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:22.548875 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:22.549030 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:22.549007 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:22.835450 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:22.835423 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2gj6v" Apr 22 17:53:22.835934 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:22.835913 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2gj6v" Apr 22 17:53:23.658690 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:23.658466 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" event={"ID":"a98e4a61-2b2f-4865-bb16-7be0e996db98","Type":"ContainerStarted","Data":"2288dd8a9b5d2877388c7ced142330accac3cf782b20b2ba3a22fd9d5fd23129"} Apr 22 17:53:23.661438 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:23.661417 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log" Apr 22 17:53:23.661808 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:23.661779 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" event={"ID":"9d8dc6eb-7c99-4548-8b6d-fe9f31000478","Type":"ContainerStarted","Data":"607b6ec06f78689586ae042f9163e5d6838b3e8a1b6d7392d4e4379da099a205"} Apr 22 17:53:23.662522 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:23.662458 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2gj6v" Apr 22 17:53:23.679299 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:23.679255 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rf9dm" podStartSLOduration=4.116599112 podStartE2EDuration="23.679242658s" podCreationTimestamp="2026-04-22 
17:53:00 +0000 UTC" firstStartedPulling="2026-04-22 17:53:03.265808214 +0000 UTC m=+3.324711348" lastFinishedPulling="2026-04-22 17:53:22.828451769 +0000 UTC m=+22.887354894" observedRunningTime="2026-04-22 17:53:23.678183034 +0000 UTC m=+23.737086178" watchObservedRunningTime="2026-04-22 17:53:23.679242658 +0000 UTC m=+23.738145827" Apr 22 17:53:24.552357 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:24.552328 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:24.552936 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:24.552401 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:24.552936 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:24.552523 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:24.552936 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:24.552663 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:25.668317 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:25.668162 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log" Apr 22 17:53:25.668961 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:25.668642 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" event={"ID":"9d8dc6eb-7c99-4548-8b6d-fe9f31000478","Type":"ContainerStarted","Data":"1be8d42705850fd6dcb3e826e555d0b0135ae48c515b920239e505ac71d60386"} Apr 22 17:53:25.668961 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:25.668957 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:25.669178 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:25.669151 2564 scope.go:117] "RemoveContainer" containerID="b161c77e992f9d108f20428a48f90d877a42b8f94175e5ad1701394b367aeefd" Apr 22 17:53:25.670510 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:25.670485 2564 generic.go:358] "Generic (PLEG): container finished" podID="a3fc6e2c-71c7-4da3-b348-d4a5b505f72a" containerID="5a008db40ad20369f9f046d8a27a154f3ffb071ac302ec13ce14b3a9a1e2a702" exitCode=0 Apr 22 17:53:25.670612 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:25.670564 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kz84r" event={"ID":"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a","Type":"ContainerDied","Data":"5a008db40ad20369f9f046d8a27a154f3ffb071ac302ec13ce14b3a9a1e2a702"} Apr 22 17:53:25.684136 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:25.684117 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:26.552414 ip-10-0-128-219 kubenswrapper[2564]: I0422 
17:53:26.552383 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:26.552643 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:26.552383 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:26.552643 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:26.552501 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:26.552643 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:26.552560 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:26.677865 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:26.677840 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log" Apr 22 17:53:26.679723 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:26.679693 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" event={"ID":"9d8dc6eb-7c99-4548-8b6d-fe9f31000478","Type":"ContainerStarted","Data":"1d28744633d725f5262b798da56057478c6b819950f17f4a1fe9abeccc1e881a"} Apr 22 17:53:26.680165 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:26.680142 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:26.680257 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:26.680177 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:26.703720 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:26.703696 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:26.713926 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:26.713874 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" podStartSLOduration=10.059496825 podStartE2EDuration="26.713859067s" podCreationTimestamp="2026-04-22 17:53:00 +0000 UTC" firstStartedPulling="2026-04-22 17:53:03.237829559 +0000 UTC m=+3.296732680" lastFinishedPulling="2026-04-22 17:53:19.892191788 +0000 UTC m=+19.951094922" observedRunningTime="2026-04-22 17:53:26.712235275 +0000 UTC m=+26.771138641" watchObservedRunningTime="2026-04-22 17:53:26.713859067 +0000 UTC m=+26.772762208" Apr 22 17:53:26.964475 
ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:26.964431 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-t6kpj"] Apr 22 17:53:26.964615 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:26.964596 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:26.964783 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:26.964748 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:26.965040 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:26.965018 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kbkmw"] Apr 22 17:53:26.965123 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:26.965110 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:26.965208 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:26.965191 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:27.683615 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:27.683586 2564 generic.go:358] "Generic (PLEG): container finished" podID="a3fc6e2c-71c7-4da3-b348-d4a5b505f72a" containerID="15afdcefe1e3b63ac963531d7fc1e86d423211224189e25e6a8823d769844570" exitCode=0 Apr 22 17:53:27.684095 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:27.683685 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kz84r" event={"ID":"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a","Type":"ContainerDied","Data":"15afdcefe1e3b63ac963531d7fc1e86d423211224189e25e6a8823d769844570"} Apr 22 17:53:28.549795 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:28.549572 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:28.549890 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:28.549634 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:28.549932 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:28.549903 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:28.549982 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:28.549949 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:28.687366 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:28.687339 2564 generic.go:358] "Generic (PLEG): container finished" podID="a3fc6e2c-71c7-4da3-b348-d4a5b505f72a" containerID="03a8f6c83b26cb1a67a5b787f6ade588dc268ebbc235dd55af5215351524b6ee" exitCode=0 Apr 22 17:53:28.687692 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:28.687421 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kz84r" event={"ID":"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a","Type":"ContainerDied","Data":"03a8f6c83b26cb1a67a5b787f6ade588dc268ebbc235dd55af5215351524b6ee"} Apr 22 17:53:30.550537 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:30.550509 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:30.551138 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:30.550615 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:30.551138 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:30.550712 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:30.551138 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:30.550814 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:32.549246 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.549212 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:53:32.549718 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.549217 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:53:32.549718 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:32.549310 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kbkmw" podUID="3c596acd-7332-4aab-afbb-73b8773fb825" Apr 22 17:53:32.549718 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:32.549406 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t6kpj" podUID="1273b1fd-25f6-4315-a692-c599fb3e48b7" Apr 22 17:53:32.790771 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.790739 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-219.ec2.internal" event="NodeReady" Apr 22 17:53:32.790973 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.790889 2564 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 17:53:32.828483 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.828408 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"] Apr 22 17:53:32.857996 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.857971 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vzgzr"] Apr 22 17:53:32.858156 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.858119 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz" Apr 22 17:53:32.861739 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.861712 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 17:53:32.861875 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.861775 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tlhcz\"" Apr 22 17:53:32.861989 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.861973 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 17:53:32.862746 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.862622 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 17:53:32.866380 ip-10-0-128-219 
kubenswrapper[2564]: I0422 17:53:32.866357 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 17:53:32.873433 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.873412 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"] Apr 22 17:53:32.873542 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.873441 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8825w"] Apr 22 17:53:32.873630 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.873609 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vzgzr" Apr 22 17:53:32.876208 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.876179 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 17:53:32.876299 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.876226 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 17:53:32.876299 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.876227 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 17:53:32.876299 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.876283 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dchxj\"" Apr 22 17:53:32.886518 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.886496 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vzgzr"] Apr 22 17:53:32.886596 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.886526 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8825w"] Apr 22 
17:53:32.886596 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.886592 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:32.888973 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.888954 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 17:53:32.889144 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.889117 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 17:53:32.889219 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.889155 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rhqnx\""
Apr 22 17:53:32.986461 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.986429 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/512ab457-9185-4f8b-a45f-516b6ec97f63-installation-pull-secrets\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:32.986641 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.986470 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-config-volume\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:32.986641 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.986491 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/512ab457-9185-4f8b-a45f-516b6ec97f63-ca-trust-extracted\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:32.986641 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.986586 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/512ab457-9185-4f8b-a45f-516b6ec97f63-image-registry-private-configuration\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:32.986641 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.986620 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/512ab457-9185-4f8b-a45f-516b6ec97f63-trusted-ca\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:32.986876 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.986643 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q6s6\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-kube-api-access-5q6s6\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:32.986876 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.986697 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-tmp-dir\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:32.986876 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.986729 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phnps\" (UniqueName: \"kubernetes.io/projected/3400960b-c044-44c8-b84c-550071e3f93e-kube-api-access-phnps\") pod \"ingress-canary-vzgzr\" (UID: \"3400960b-c044-44c8-b84c-550071e3f93e\") " pod="openshift-ingress-canary/ingress-canary-vzgzr"
Apr 22 17:53:32.986876 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.986756 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmcrn\" (UniqueName: \"kubernetes.io/projected/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-kube-api-access-kmcrn\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:32.986876 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.986815 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-bound-sa-token\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:32.986876 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.986857 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:32.987173 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.986908 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert\") pod \"ingress-canary-vzgzr\" (UID: \"3400960b-c044-44c8-b84c-550071e3f93e\") " pod="openshift-ingress-canary/ingress-canary-vzgzr"
Apr 22 17:53:32.987173 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.986934 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-certificates\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:32.987173 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:32.986988 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:33.088322 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088242 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/512ab457-9185-4f8b-a45f-516b6ec97f63-installation-pull-secrets\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.088322 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088293 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-config-volume\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:33.088322 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088320 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/512ab457-9185-4f8b-a45f-516b6ec97f63-ca-trust-extracted\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.088578 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088342 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/512ab457-9185-4f8b-a45f-516b6ec97f63-image-registry-private-configuration\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.088578 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088359 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/512ab457-9185-4f8b-a45f-516b6ec97f63-trusted-ca\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.088578 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088374 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q6s6\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-kube-api-access-5q6s6\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.088578 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088393 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-tmp-dir\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:33.088578 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088423 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phnps\" (UniqueName: \"kubernetes.io/projected/3400960b-c044-44c8-b84c-550071e3f93e-kube-api-access-phnps\") pod \"ingress-canary-vzgzr\" (UID: \"3400960b-c044-44c8-b84c-550071e3f93e\") " pod="openshift-ingress-canary/ingress-canary-vzgzr"
Apr 22 17:53:33.088578 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088447 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmcrn\" (UniqueName: \"kubernetes.io/projected/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-kube-api-access-kmcrn\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:33.088578 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088502 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-bound-sa-token\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.088578 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088530 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.088578 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088579 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert\") pod \"ingress-canary-vzgzr\" (UID: \"3400960b-c044-44c8-b84c-550071e3f93e\") " pod="openshift-ingress-canary/ingress-canary-vzgzr"
Apr 22 17:53:33.089067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088603 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-certificates\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.089067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088634 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:33.089067 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:33.088759 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:53:33.089067 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:33.088777 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:33.089067 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:33.088836 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls podName:3a00fffd-ba82-45c7-b379-68e21fd2f1f1 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:33.588818542 +0000 UTC m=+33.647721679 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls") pod "dns-default-8825w" (UID: "3a00fffd-ba82-45c7-b379-68e21fd2f1f1") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:33.089067 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.088970 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-config-volume\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:33.089373 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.089186 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-tmp-dir\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:33.089373 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:33.088781 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b55dbf8cb-pn2vz: secret "image-registry-tls" not found
Apr 22 17:53:33.089373 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:33.089253 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls podName:512ab457-9185-4f8b-a45f-516b6ec97f63 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:33.58923729 +0000 UTC m=+33.648140410 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls") pod "image-registry-7b55dbf8cb-pn2vz" (UID: "512ab457-9185-4f8b-a45f-516b6ec97f63") : secret "image-registry-tls" not found
Apr 22 17:53:33.089373 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.089292 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/512ab457-9185-4f8b-a45f-516b6ec97f63-ca-trust-extracted\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.089373 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:33.089307 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:33.089622 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:33.089384 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert podName:3400960b-c044-44c8-b84c-550071e3f93e nodeName:}" failed. No retries permitted until 2026-04-22 17:53:33.589366425 +0000 UTC m=+33.648269545 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert") pod "ingress-canary-vzgzr" (UID: "3400960b-c044-44c8-b84c-550071e3f93e") : secret "canary-serving-cert" not found
Apr 22 17:53:33.089622 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.089532 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-certificates\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.089622 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.089589 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/512ab457-9185-4f8b-a45f-516b6ec97f63-trusted-ca\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.092679 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.092644 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/512ab457-9185-4f8b-a45f-516b6ec97f63-installation-pull-secrets\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.092784 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.092644 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/512ab457-9185-4f8b-a45f-516b6ec97f63-image-registry-private-configuration\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.100170 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.100147 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phnps\" (UniqueName: \"kubernetes.io/projected/3400960b-c044-44c8-b84c-550071e3f93e-kube-api-access-phnps\") pod \"ingress-canary-vzgzr\" (UID: \"3400960b-c044-44c8-b84c-550071e3f93e\") " pod="openshift-ingress-canary/ingress-canary-vzgzr"
Apr 22 17:53:33.100528 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.100486 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmcrn\" (UniqueName: \"kubernetes.io/projected/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-kube-api-access-kmcrn\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:33.100628 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.100575 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-bound-sa-token\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.100819 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.100797 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q6s6\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-kube-api-access-5q6s6\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.591943 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.591900 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert\") pod \"ingress-canary-vzgzr\" (UID: \"3400960b-c044-44c8-b84c-550071e3f93e\") " pod="openshift-ingress-canary/ingress-canary-vzgzr"
Apr 22 17:53:33.592567 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.591954 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:33.592567 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:33.592037 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:33.592567 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:33.592061 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:33.592567 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:33.592127 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:33.592567 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:33.592138 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert podName:3400960b-c044-44c8-b84c-550071e3f93e nodeName:}" failed. No retries permitted until 2026-04-22 17:53:34.592121866 +0000 UTC m=+34.651024987 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert") pod "ingress-canary-vzgzr" (UID: "3400960b-c044-44c8-b84c-550071e3f93e") : secret "canary-serving-cert" not found
Apr 22 17:53:33.592567 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:33.592197 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls podName:3a00fffd-ba82-45c7-b379-68e21fd2f1f1 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:34.592179381 +0000 UTC m=+34.651082514 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls") pod "dns-default-8825w" (UID: "3a00fffd-ba82-45c7-b379-68e21fd2f1f1") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:33.592567 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:33.592134 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:53:33.592567 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:33.592225 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b55dbf8cb-pn2vz: secret "image-registry-tls" not found
Apr 22 17:53:33.592567 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:33.592268 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls podName:512ab457-9185-4f8b-a45f-516b6ec97f63 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:34.592256693 +0000 UTC m=+34.651159828 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls") pod "image-registry-7b55dbf8cb-pn2vz" (UID: "512ab457-9185-4f8b-a45f-516b6ec97f63") : secret "image-registry-tls" not found
Apr 22 17:53:34.196309 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:34.196275 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs\") pod \"network-metrics-daemon-t6kpj\" (UID: \"1273b1fd-25f6-4315-a692-c599fb3e48b7\") " pod="openshift-multus/network-metrics-daemon-t6kpj"
Apr 22 17:53:34.196543 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:34.196423 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:34.196543 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:34.196486 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs podName:1273b1fd-25f6-4315-a692-c599fb3e48b7 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:06.19646986 +0000 UTC m=+66.255372998 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs") pod "network-metrics-daemon-t6kpj" (UID: "1273b1fd-25f6-4315-a692-c599fb3e48b7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:34.398387 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:34.398358 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwsb\" (UniqueName: \"kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb\") pod \"network-check-target-kbkmw\" (UID: \"3c596acd-7332-4aab-afbb-73b8773fb825\") " pod="openshift-network-diagnostics/network-check-target-kbkmw"
Apr 22 17:53:34.398529 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:34.398511 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:34.398572 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:34.398534 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:34.398572 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:34.398544 2564 projected.go:194] Error preparing data for projected volume kube-api-access-8wwsb for pod openshift-network-diagnostics/network-check-target-kbkmw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:34.398632 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:34.398591 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb podName:3c596acd-7332-4aab-afbb-73b8773fb825 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:06.398577552 +0000 UTC m=+66.457480673 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-8wwsb" (UniqueName: "kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb") pod "network-check-target-kbkmw" (UID: "3c596acd-7332-4aab-afbb-73b8773fb825") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:34.548891 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:34.548831 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw"
Apr 22 17:53:34.548891 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:34.548849 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj"
Apr 22 17:53:34.551704 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:34.551686 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7sslk\""
Apr 22 17:53:34.552420 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:34.552405 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 17:53:34.552498 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:34.552429 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c9cn8\""
Apr 22 17:53:34.552498 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:34.552443 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 17:53:34.552498 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:34.552405 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 17:53:34.600600 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:34.600580 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert\") pod \"ingress-canary-vzgzr\" (UID: \"3400960b-c044-44c8-b84c-550071e3f93e\") " pod="openshift-ingress-canary/ingress-canary-vzgzr"
Apr 22 17:53:34.600942 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:34.600609 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:34.600942 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:34.600658 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:34.600942 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:34.600737 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:34.600942 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:34.600770 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:34.600942 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:34.600795 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert podName:3400960b-c044-44c8-b84c-550071e3f93e nodeName:}" failed. No retries permitted until 2026-04-22 17:53:36.600778046 +0000 UTC m=+36.659681587 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert") pod "ingress-canary-vzgzr" (UID: "3400960b-c044-44c8-b84c-550071e3f93e") : secret "canary-serving-cert" not found
Apr 22 17:53:34.600942 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:34.600827 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls podName:3a00fffd-ba82-45c7-b379-68e21fd2f1f1 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:36.600806432 +0000 UTC m=+36.659709557 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls") pod "dns-default-8825w" (UID: "3a00fffd-ba82-45c7-b379-68e21fd2f1f1") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:34.600942 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:34.600776 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:53:34.600942 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:34.600846 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b55dbf8cb-pn2vz: secret "image-registry-tls" not found
Apr 22 17:53:34.600942 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:34.600884 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls podName:512ab457-9185-4f8b-a45f-516b6ec97f63 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:36.600874333 +0000 UTC m=+36.659777452 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls") pod "image-registry-7b55dbf8cb-pn2vz" (UID: "512ab457-9185-4f8b-a45f-516b6ec97f63") : secret "image-registry-tls" not found
Apr 22 17:53:35.702741 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:35.702532 2564 generic.go:358] "Generic (PLEG): container finished" podID="a3fc6e2c-71c7-4da3-b348-d4a5b505f72a" containerID="d22455f83f5301e0989d5abeee50296c6f18c6a68a088e8680b96c7428319ca3" exitCode=0
Apr 22 17:53:35.703122 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:35.702568 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kz84r" event={"ID":"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a","Type":"ContainerDied","Data":"d22455f83f5301e0989d5abeee50296c6f18c6a68a088e8680b96c7428319ca3"}
Apr 22 17:53:36.613874 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:36.613841 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"
Apr 22 17:53:36.614019 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:36.613906 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert\") pod \"ingress-canary-vzgzr\" (UID: \"3400960b-c044-44c8-b84c-550071e3f93e\") " pod="openshift-ingress-canary/ingress-canary-vzgzr"
Apr 22 17:53:36.614019 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:36.613936 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w"
Apr 22 17:53:36.614019 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:36.613985 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:53:36.614019 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:36.614000 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b55dbf8cb-pn2vz: secret "image-registry-tls" not found
Apr 22 17:53:36.614176 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:36.614023 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:36.614176 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:36.614043 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:36.614176 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:36.614059 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls podName:512ab457-9185-4f8b-a45f-516b6ec97f63 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:40.614041693 +0000 UTC m=+40.672944830 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls") pod "image-registry-7b55dbf8cb-pn2vz" (UID: "512ab457-9185-4f8b-a45f-516b6ec97f63") : secret "image-registry-tls" not found
Apr 22 17:53:36.614176 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:36.614074 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls podName:3a00fffd-ba82-45c7-b379-68e21fd2f1f1 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:40.614068464 +0000 UTC m=+40.672971584 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls") pod "dns-default-8825w" (UID: "3a00fffd-ba82-45c7-b379-68e21fd2f1f1") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:36.614176 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:36.614090 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert podName:3400960b-c044-44c8-b84c-550071e3f93e nodeName:}" failed. No retries permitted until 2026-04-22 17:53:40.614077546 +0000 UTC m=+40.672980665 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert") pod "ingress-canary-vzgzr" (UID: "3400960b-c044-44c8-b84c-550071e3f93e") : secret "canary-serving-cert" not found
Apr 22 17:53:36.707189 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:36.707162 2564 generic.go:358] "Generic (PLEG): container finished" podID="a3fc6e2c-71c7-4da3-b348-d4a5b505f72a" containerID="cdf47c178dd795fc93e485f82fd92fce36bcf4b3962f7375512a76dab49d6756" exitCode=0
Apr 22 17:53:36.707496 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:36.707199 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kz84r" event={"ID":"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a","Type":"ContainerDied","Data":"cdf47c178dd795fc93e485f82fd92fce36bcf4b3962f7375512a76dab49d6756"}
Apr 22 17:53:37.711505 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:37.711476 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kz84r" event={"ID":"a3fc6e2c-71c7-4da3-b348-d4a5b505f72a","Type":"ContainerStarted","Data":"a7849e2dc95d045d50dce94de79c2475c068050c4b8bdef9d5e5330e8785534b"}
Apr 22 17:53:37.733759 ip-10-0-128-219
kubenswrapper[2564]: I0422 17:53:37.733715 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kz84r" podStartSLOduration=6.3726157820000005 podStartE2EDuration="37.733702745s" podCreationTimestamp="2026-04-22 17:53:00 +0000 UTC" firstStartedPulling="2026-04-22 17:53:03.265530402 +0000 UTC m=+3.324433522" lastFinishedPulling="2026-04-22 17:53:34.626617365 +0000 UTC m=+34.685520485" observedRunningTime="2026-04-22 17:53:37.732482221 +0000 UTC m=+37.791385373" watchObservedRunningTime="2026-04-22 17:53:37.733702745 +0000 UTC m=+37.792605886" Apr 22 17:53:39.059058 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.059022 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9w5wv"] Apr 22 17:53:39.061827 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.061807 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9w5wv" Apr 22 17:53:39.064792 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.064772 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 17:53:39.072620 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.072597 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9w5wv"] Apr 22 17:53:39.232682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.232625 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a3db8c3e-749a-4bb4-b86c-667f4524c8fa-dbus\") pod \"global-pull-secret-syncer-9w5wv\" (UID: \"a3db8c3e-749a-4bb4-b86c-667f4524c8fa\") " pod="kube-system/global-pull-secret-syncer-9w5wv" Apr 22 17:53:39.232682 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.232678 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3db8c3e-749a-4bb4-b86c-667f4524c8fa-original-pull-secret\") pod \"global-pull-secret-syncer-9w5wv\" (UID: \"a3db8c3e-749a-4bb4-b86c-667f4524c8fa\") " pod="kube-system/global-pull-secret-syncer-9w5wv" Apr 22 17:53:39.232854 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.232720 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a3db8c3e-749a-4bb4-b86c-667f4524c8fa-kubelet-config\") pod \"global-pull-secret-syncer-9w5wv\" (UID: \"a3db8c3e-749a-4bb4-b86c-667f4524c8fa\") " pod="kube-system/global-pull-secret-syncer-9w5wv" Apr 22 17:53:39.333844 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.333770 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a3db8c3e-749a-4bb4-b86c-667f4524c8fa-dbus\") pod \"global-pull-secret-syncer-9w5wv\" (UID: \"a3db8c3e-749a-4bb4-b86c-667f4524c8fa\") " pod="kube-system/global-pull-secret-syncer-9w5wv" Apr 22 17:53:39.333844 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.333806 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3db8c3e-749a-4bb4-b86c-667f4524c8fa-original-pull-secret\") pod \"global-pull-secret-syncer-9w5wv\" (UID: \"a3db8c3e-749a-4bb4-b86c-667f4524c8fa\") " pod="kube-system/global-pull-secret-syncer-9w5wv" Apr 22 17:53:39.333844 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.333833 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a3db8c3e-749a-4bb4-b86c-667f4524c8fa-kubelet-config\") pod \"global-pull-secret-syncer-9w5wv\" (UID: \"a3db8c3e-749a-4bb4-b86c-667f4524c8fa\") " pod="kube-system/global-pull-secret-syncer-9w5wv" Apr 22 17:53:39.334093 ip-10-0-128-219 kubenswrapper[2564]: I0422 
17:53:39.333987 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a3db8c3e-749a-4bb4-b86c-667f4524c8fa-kubelet-config\") pod \"global-pull-secret-syncer-9w5wv\" (UID: \"a3db8c3e-749a-4bb4-b86c-667f4524c8fa\") " pod="kube-system/global-pull-secret-syncer-9w5wv" Apr 22 17:53:39.334093 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.334004 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a3db8c3e-749a-4bb4-b86c-667f4524c8fa-dbus\") pod \"global-pull-secret-syncer-9w5wv\" (UID: \"a3db8c3e-749a-4bb4-b86c-667f4524c8fa\") " pod="kube-system/global-pull-secret-syncer-9w5wv" Apr 22 17:53:39.336873 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.336853 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3db8c3e-749a-4bb4-b86c-667f4524c8fa-original-pull-secret\") pod \"global-pull-secret-syncer-9w5wv\" (UID: \"a3db8c3e-749a-4bb4-b86c-667f4524c8fa\") " pod="kube-system/global-pull-secret-syncer-9w5wv" Apr 22 17:53:39.370178 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.370159 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9w5wv" Apr 22 17:53:39.491772 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.491744 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9w5wv"] Apr 22 17:53:39.495198 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:39.495166 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3db8c3e_749a_4bb4_b86c_667f4524c8fa.slice/crio-ec7a2d6deef43dbde5b82e0519b8b8a20ccf9768927693baf22fa91c289335e3 WatchSource:0}: Error finding container ec7a2d6deef43dbde5b82e0519b8b8a20ccf9768927693baf22fa91c289335e3: Status 404 returned error can't find the container with id ec7a2d6deef43dbde5b82e0519b8b8a20ccf9768927693baf22fa91c289335e3 Apr 22 17:53:39.716096 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.716069 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9w5wv" event={"ID":"a3db8c3e-749a-4bb4-b86c-667f4524c8fa","Type":"ContainerStarted","Data":"ec7a2d6deef43dbde5b82e0519b8b8a20ccf9768927693baf22fa91c289335e3"} Apr 22 17:53:39.980822 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.980732 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkbk9"] Apr 22 17:53:39.983968 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.983939 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkbk9" Apr 22 17:53:39.986509 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.986462 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 17:53:39.986623 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.986586 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-wvd5w\"" Apr 22 17:53:39.986706 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.986594 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 17:53:39.994328 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:39.994299 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkbk9"] Apr 22 17:53:40.141305 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.141270 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncnfp\" (UniqueName: \"kubernetes.io/projected/de69c135-af81-4f57-8071-29c9454db61d-kube-api-access-ncnfp\") pod \"migrator-74bb7799d9-hkbk9\" (UID: \"de69c135-af81-4f57-8071-29c9454db61d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkbk9" Apr 22 17:53:40.242510 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.242437 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncnfp\" (UniqueName: \"kubernetes.io/projected/de69c135-af81-4f57-8071-29c9454db61d-kube-api-access-ncnfp\") pod \"migrator-74bb7799d9-hkbk9\" (UID: \"de69c135-af81-4f57-8071-29c9454db61d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkbk9" Apr 22 17:53:40.254509 ip-10-0-128-219 
kubenswrapper[2564]: I0422 17:53:40.254482 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncnfp\" (UniqueName: \"kubernetes.io/projected/de69c135-af81-4f57-8071-29c9454db61d-kube-api-access-ncnfp\") pod \"migrator-74bb7799d9-hkbk9\" (UID: \"de69c135-af81-4f57-8071-29c9454db61d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkbk9" Apr 22 17:53:40.295215 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.295192 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkbk9" Apr 22 17:53:40.424429 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.424400 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkbk9"] Apr 22 17:53:40.428383 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:40.428356 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde69c135_af81_4f57_8071_29c9454db61d.slice/crio-9175d7f19646efd43face11d1bc4ec40fb3d1eca9f018fac987977f41473e852 WatchSource:0}: Error finding container 9175d7f19646efd43face11d1bc4ec40fb3d1eca9f018fac987977f41473e852: Status 404 returned error can't find the container with id 9175d7f19646efd43face11d1bc4ec40fb3d1eca9f018fac987977f41473e852 Apr 22 17:53:40.538089 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.538023 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj"] Apr 22 17:53:40.543972 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.543949 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" Apr 22 17:53:40.546932 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.546907 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 17:53:40.547060 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.546950 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 17:53:40.547479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.547316 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-7xtkr\"" Apr 22 17:53:40.548596 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.548574 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj"] Apr 22 17:53:40.645773 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.645745 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz" Apr 22 17:53:40.645972 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.645790 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert\") pod \"ingress-canary-vzgzr\" (UID: \"3400960b-c044-44c8-b84c-550071e3f93e\") " pod="openshift-ingress-canary/ingress-canary-vzgzr" Apr 22 17:53:40.645972 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.645821 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w" Apr 22 17:53:40.645972 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.645844 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/65900b80-a3e4-4f4d-bb90-dc5cac183f53-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-gbrpj\" (UID: \"65900b80-a3e4-4f4d-bb90-dc5cac183f53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" Apr 22 17:53:40.645972 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.645873 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-gbrpj\" (UID: \"65900b80-a3e4-4f4d-bb90-dc5cac183f53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" Apr 22 17:53:40.645972 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:40.645910 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:53:40.645972 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:40.645913 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:53:40.645972 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:40.645936 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b55dbf8cb-pn2vz: secret "image-registry-tls" not found Apr 22 17:53:40.645972 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:40.645970 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 22 17:53:40.646440 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:40.645983 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert podName:3400960b-c044-44c8-b84c-550071e3f93e nodeName:}" failed. No retries permitted until 2026-04-22 17:53:48.645963895 +0000 UTC m=+48.704867019 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert") pod "ingress-canary-vzgzr" (UID: "3400960b-c044-44c8-b84c-550071e3f93e") : secret "canary-serving-cert" not found Apr 22 17:53:40.646440 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:40.646061 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls podName:3a00fffd-ba82-45c7-b379-68e21fd2f1f1 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:48.646046151 +0000 UTC m=+48.704949274 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls") pod "dns-default-8825w" (UID: "3a00fffd-ba82-45c7-b379-68e21fd2f1f1") : secret "dns-default-metrics-tls" not found Apr 22 17:53:40.646440 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:40.646084 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls podName:512ab457-9185-4f8b-a45f-516b6ec97f63 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:48.646073825 +0000 UTC m=+48.704976950 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls") pod "image-registry-7b55dbf8cb-pn2vz" (UID: "512ab457-9185-4f8b-a45f-516b6ec97f63") : secret "image-registry-tls" not found Apr 22 17:53:40.718780 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.718746 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkbk9" event={"ID":"de69c135-af81-4f57-8071-29c9454db61d","Type":"ContainerStarted","Data":"9175d7f19646efd43face11d1bc4ec40fb3d1eca9f018fac987977f41473e852"} Apr 22 17:53:40.747117 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.747090 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/65900b80-a3e4-4f4d-bb90-dc5cac183f53-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-gbrpj\" (UID: \"65900b80-a3e4-4f4d-bb90-dc5cac183f53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" Apr 22 17:53:40.747255 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.747127 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-gbrpj\" (UID: \"65900b80-a3e4-4f4d-bb90-dc5cac183f53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" Apr 22 17:53:40.747323 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:40.747254 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:53:40.747323 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:40.747317 2564 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert podName:65900b80-a3e4-4f4d-bb90-dc5cac183f53 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:41.247298523 +0000 UTC m=+41.306201647 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-gbrpj" (UID: "65900b80-a3e4-4f4d-bb90-dc5cac183f53") : secret "networking-console-plugin-cert" not found Apr 22 17:53:40.747907 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:40.747886 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/65900b80-a3e4-4f4d-bb90-dc5cac183f53-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-gbrpj\" (UID: \"65900b80-a3e4-4f4d-bb90-dc5cac183f53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" Apr 22 17:53:41.217054 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.217028 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h4787_8432b695-5ba0-4b5f-bf6e-aea43e93c1a0/dns-node-resolver/0.log" Apr 22 17:53:41.251723 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.251691 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-gbrpj\" (UID: \"65900b80-a3e4-4f4d-bb90-dc5cac183f53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" Apr 22 17:53:41.254867 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:41.252166 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 
17:53:41.254867 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:41.252250 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert podName:65900b80-a3e4-4f4d-bb90-dc5cac183f53 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:42.252229195 +0000 UTC m=+42.311132336 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-gbrpj" (UID: "65900b80-a3e4-4f4d-bb90-dc5cac183f53") : secret "networking-console-plugin-cert" not found Apr 22 17:53:41.370499 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.370458 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-clh2q"] Apr 22 17:53:41.374702 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.374660 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.377394 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.377261 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 17:53:41.377394 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.377339 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dqcf6\"" Apr 22 17:53:41.377394 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.377359 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 17:53:41.377603 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.377368 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 17:53:41.377603 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.377478 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 17:53:41.383183 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.383145 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-clh2q"] Apr 22 17:53:41.554715 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.554692 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/61075974-e581-432e-8332-a5b8e03775a9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.554832 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.554728 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.554832 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.554768 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27fqr\" (UniqueName: \"kubernetes.io/projected/61075974-e581-432e-8332-a5b8e03775a9-kube-api-access-27fqr\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.554934 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.554840 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/61075974-e581-432e-8332-a5b8e03775a9-crio-socket\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.554934 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.554864 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/61075974-e581-432e-8332-a5b8e03775a9-data-volume\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.655624 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.655595 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/61075974-e581-432e-8332-a5b8e03775a9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " 
pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.655760 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.655645 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.655760 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.655720 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27fqr\" (UniqueName: \"kubernetes.io/projected/61075974-e581-432e-8332-a5b8e03775a9-kube-api-access-27fqr\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.655872 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:41.655775 2564 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 17:53:41.655872 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.655804 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/61075974-e581-432e-8332-a5b8e03775a9-crio-socket\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.655872 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:41.655843 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls podName:61075974-e581-432e-8332-a5b8e03775a9 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:53:42.155822546 +0000 UTC m=+42.214725669 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls") pod "insights-runtime-extractor-clh2q" (UID: "61075974-e581-432e-8332-a5b8e03775a9") : secret "insights-runtime-extractor-tls" not found Apr 22 17:53:41.656003 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.655869 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/61075974-e581-432e-8332-a5b8e03775a9-data-volume\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.656003 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.655975 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/61075974-e581-432e-8332-a5b8e03775a9-crio-socket\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.656230 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.656206 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/61075974-e581-432e-8332-a5b8e03775a9-data-volume\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.656230 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.656219 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/61075974-e581-432e-8332-a5b8e03775a9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-clh2q\" (UID: 
\"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.665518 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.665495 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27fqr\" (UniqueName: \"kubernetes.io/projected/61075974-e581-432e-8332-a5b8e03775a9-kube-api-access-27fqr\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:41.722708 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.722682 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkbk9" event={"ID":"de69c135-af81-4f57-8071-29c9454db61d","Type":"ContainerStarted","Data":"3e554650acc101ca16c0c58fa7bdb66d1f962d8660bce211c2e6bce1ed7985fb"} Apr 22 17:53:41.722803 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.722715 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkbk9" event={"ID":"de69c135-af81-4f57-8071-29c9454db61d","Type":"ContainerStarted","Data":"0ba77be3ac47272d46fce8b3963232bf448cc83657fca361c99873e39fa1dd0d"} Apr 22 17:53:41.740118 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:41.740077 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkbk9" podStartSLOduration=1.6447568289999999 podStartE2EDuration="2.740066132s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="2026-04-22 17:53:40.430661825 +0000 UTC m=+40.489564958" lastFinishedPulling="2026-04-22 17:53:41.525971138 +0000 UTC m=+41.584874261" observedRunningTime="2026-04-22 17:53:41.738900188 +0000 UTC m=+41.797803332" watchObservedRunningTime="2026-04-22 17:53:41.740066132 +0000 UTC m=+41.798969295" Apr 22 17:53:42.158910 ip-10-0-128-219 kubenswrapper[2564]: I0422 
17:53:42.158874 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:42.159090 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:42.159039 2564 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 17:53:42.159142 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:42.159104 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls podName:61075974-e581-432e-8332-a5b8e03775a9 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:43.159084608 +0000 UTC m=+43.217987732 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls") pod "insights-runtime-extractor-clh2q" (UID: "61075974-e581-432e-8332-a5b8e03775a9") : secret "insights-runtime-extractor-tls" not found Apr 22 17:53:42.259579 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.259542 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-gbrpj\" (UID: \"65900b80-a3e4-4f4d-bb90-dc5cac183f53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" Apr 22 17:53:42.260019 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:42.259643 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:53:42.260019 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:42.259727 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert podName:65900b80-a3e4-4f4d-bb90-dc5cac183f53 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:44.259708281 +0000 UTC m=+44.318611418 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-gbrpj" (UID: "65900b80-a3e4-4f4d-bb90-dc5cac183f53") : secret "networking-console-plugin-cert" not found Apr 22 17:53:42.620096 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.620072 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zs8rm_dfd9fa90-7a02-4429-a3c2-c939fa96e48e/node-ca/0.log" Apr 22 17:53:42.635725 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.635704 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xlll9"] Apr 22 17:53:42.638763 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.638748 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xlll9" Apr 22 17:53:42.641499 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.641472 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 17:53:42.641603 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.641536 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 17:53:42.641603 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.641553 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-gm9fc\"" Apr 22 17:53:42.642526 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.642507 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 17:53:42.642622 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.642529 2564 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 17:53:42.648163 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.648145 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xlll9"] Apr 22 17:53:42.764269 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.764244 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6nt4\" (UniqueName: \"kubernetes.io/projected/870f759a-c4ed-4cef-9781-15daf7724014-kube-api-access-n6nt4\") pod \"service-ca-865cb79987-xlll9\" (UID: \"870f759a-c4ed-4cef-9781-15daf7724014\") " pod="openshift-service-ca/service-ca-865cb79987-xlll9" Apr 22 17:53:42.764417 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.764382 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/870f759a-c4ed-4cef-9781-15daf7724014-signing-cabundle\") pod \"service-ca-865cb79987-xlll9\" (UID: \"870f759a-c4ed-4cef-9781-15daf7724014\") " pod="openshift-service-ca/service-ca-865cb79987-xlll9" Apr 22 17:53:42.764479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.764416 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/870f759a-c4ed-4cef-9781-15daf7724014-signing-key\") pod \"service-ca-865cb79987-xlll9\" (UID: \"870f759a-c4ed-4cef-9781-15daf7724014\") " pod="openshift-service-ca/service-ca-865cb79987-xlll9" Apr 22 17:53:42.864811 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.864779 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/870f759a-c4ed-4cef-9781-15daf7724014-signing-key\") pod \"service-ca-865cb79987-xlll9\" (UID: \"870f759a-c4ed-4cef-9781-15daf7724014\") " pod="openshift-service-ca/service-ca-865cb79987-xlll9" Apr 22 
17:53:42.865017 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.864859 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6nt4\" (UniqueName: \"kubernetes.io/projected/870f759a-c4ed-4cef-9781-15daf7724014-kube-api-access-n6nt4\") pod \"service-ca-865cb79987-xlll9\" (UID: \"870f759a-c4ed-4cef-9781-15daf7724014\") " pod="openshift-service-ca/service-ca-865cb79987-xlll9" Apr 22 17:53:42.865017 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.864999 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/870f759a-c4ed-4cef-9781-15daf7724014-signing-cabundle\") pod \"service-ca-865cb79987-xlll9\" (UID: \"870f759a-c4ed-4cef-9781-15daf7724014\") " pod="openshift-service-ca/service-ca-865cb79987-xlll9" Apr 22 17:53:42.865701 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.865659 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/870f759a-c4ed-4cef-9781-15daf7724014-signing-cabundle\") pod \"service-ca-865cb79987-xlll9\" (UID: \"870f759a-c4ed-4cef-9781-15daf7724014\") " pod="openshift-service-ca/service-ca-865cb79987-xlll9" Apr 22 17:53:42.867324 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.867305 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/870f759a-c4ed-4cef-9781-15daf7724014-signing-key\") pod \"service-ca-865cb79987-xlll9\" (UID: \"870f759a-c4ed-4cef-9781-15daf7724014\") " pod="openshift-service-ca/service-ca-865cb79987-xlll9" Apr 22 17:53:42.875538 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.875483 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6nt4\" (UniqueName: \"kubernetes.io/projected/870f759a-c4ed-4cef-9781-15daf7724014-kube-api-access-n6nt4\") pod \"service-ca-865cb79987-xlll9\" (UID: 
\"870f759a-c4ed-4cef-9781-15daf7724014\") " pod="openshift-service-ca/service-ca-865cb79987-xlll9" Apr 22 17:53:42.949559 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:42.949526 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xlll9" Apr 22 17:53:43.132581 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:43.132512 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xlll9"] Apr 22 17:53:43.136487 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:43.136459 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod870f759a_c4ed_4cef_9781_15daf7724014.slice/crio-f72c9716819027669d835aeca1f5f8d844421669b55de03f59d0b219c8343f96 WatchSource:0}: Error finding container f72c9716819027669d835aeca1f5f8d844421669b55de03f59d0b219c8343f96: Status 404 returned error can't find the container with id f72c9716819027669d835aeca1f5f8d844421669b55de03f59d0b219c8343f96 Apr 22 17:53:43.168563 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:43.168539 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:43.168712 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:43.168695 2564 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 17:53:43.168768 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:43.168757 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls 
podName:61075974-e581-432e-8332-a5b8e03775a9 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:45.168739495 +0000 UTC m=+45.227642631 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls") pod "insights-runtime-extractor-clh2q" (UID: "61075974-e581-432e-8332-a5b8e03775a9") : secret "insights-runtime-extractor-tls" not found Apr 22 17:53:43.728634 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:43.728600 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9w5wv" event={"ID":"a3db8c3e-749a-4bb4-b86c-667f4524c8fa","Type":"ContainerStarted","Data":"255999780ca3a5f66caad2551493d702a727f0a441ca992ef89f2220747abc1e"} Apr 22 17:53:43.729850 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:43.729826 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xlll9" event={"ID":"870f759a-c4ed-4cef-9781-15daf7724014","Type":"ContainerStarted","Data":"f72c9716819027669d835aeca1f5f8d844421669b55de03f59d0b219c8343f96"} Apr 22 17:53:43.749081 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:43.749037 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9w5wv" podStartSLOduration=1.174788427 podStartE2EDuration="4.74902546s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="2026-04-22 17:53:39.497315383 +0000 UTC m=+39.556218503" lastFinishedPulling="2026-04-22 17:53:43.071552416 +0000 UTC m=+43.130455536" observedRunningTime="2026-04-22 17:53:43.748735498 +0000 UTC m=+43.807638640" watchObservedRunningTime="2026-04-22 17:53:43.74902546 +0000 UTC m=+43.807928602" Apr 22 17:53:44.277263 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:44.277226 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-gbrpj\" (UID: \"65900b80-a3e4-4f4d-bb90-dc5cac183f53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" Apr 22 17:53:44.277432 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:44.277407 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:53:44.277513 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:44.277484 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert podName:65900b80-a3e4-4f4d-bb90-dc5cac183f53 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:48.277464254 +0000 UTC m=+48.336367373 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-gbrpj" (UID: "65900b80-a3e4-4f4d-bb90-dc5cac183f53") : secret "networking-console-plugin-cert" not found Apr 22 17:53:45.184967 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:45.184895 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:45.185332 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:45.185063 2564 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 17:53:45.185332 ip-10-0-128-219 
kubenswrapper[2564]: E0422 17:53:45.185132 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls podName:61075974-e581-432e-8332-a5b8e03775a9 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:49.18511262 +0000 UTC m=+49.244015740 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls") pod "insights-runtime-extractor-clh2q" (UID: "61075974-e581-432e-8332-a5b8e03775a9") : secret "insights-runtime-extractor-tls" not found Apr 22 17:53:45.735525 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:45.735494 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xlll9" event={"ID":"870f759a-c4ed-4cef-9781-15daf7724014","Type":"ContainerStarted","Data":"5a11430b57f0c6515cc7678d930e420bb7107142a0ac0f312c684ec1727b552e"} Apr 22 17:53:45.755301 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:45.755248 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-xlll9" podStartSLOduration=1.875276663 podStartE2EDuration="3.755233453s" podCreationTimestamp="2026-04-22 17:53:42 +0000 UTC" firstStartedPulling="2026-04-22 17:53:43.138473059 +0000 UTC m=+43.197376184" lastFinishedPulling="2026-04-22 17:53:45.018429838 +0000 UTC m=+45.077332974" observedRunningTime="2026-04-22 17:53:45.754565758 +0000 UTC m=+45.813468896" watchObservedRunningTime="2026-04-22 17:53:45.755233453 +0000 UTC m=+45.814136594" Apr 22 17:53:48.308106 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:48.308068 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-gbrpj\" (UID: \"65900b80-a3e4-4f4d-bb90-dc5cac183f53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" Apr 22 17:53:48.308543 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:48.308217 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:53:48.308543 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:48.308279 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert podName:65900b80-a3e4-4f4d-bb90-dc5cac183f53 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:56.308263319 +0000 UTC m=+56.367166438 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-gbrpj" (UID: "65900b80-a3e4-4f4d-bb90-dc5cac183f53") : secret "networking-console-plugin-cert" not found Apr 22 17:53:48.710542 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:48.710512 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls\") pod \"image-registry-7b55dbf8cb-pn2vz\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz" Apr 22 17:53:48.710702 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:48.710554 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert\") pod \"ingress-canary-vzgzr\" (UID: \"3400960b-c044-44c8-b84c-550071e3f93e\") " pod="openshift-ingress-canary/ingress-canary-vzgzr" Apr 22 17:53:48.710702 ip-10-0-128-219 
kubenswrapper[2564]: I0422 17:53:48.710574 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w" Apr 22 17:53:48.710702 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:48.710652 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:53:48.710702 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:48.710685 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b55dbf8cb-pn2vz: secret "image-registry-tls" not found Apr 22 17:53:48.710702 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:48.710686 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:53:48.710860 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:48.710705 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:53:48.710860 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:48.710745 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls podName:512ab457-9185-4f8b-a45f-516b6ec97f63 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:04.710727423 +0000 UTC m=+64.769630548 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls") pod "image-registry-7b55dbf8cb-pn2vz" (UID: "512ab457-9185-4f8b-a45f-516b6ec97f63") : secret "image-registry-tls" not found Apr 22 17:53:48.710860 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:48.710760 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls podName:3a00fffd-ba82-45c7-b379-68e21fd2f1f1 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:04.710752601 +0000 UTC m=+64.769655720 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls") pod "dns-default-8825w" (UID: "3a00fffd-ba82-45c7-b379-68e21fd2f1f1") : secret "dns-default-metrics-tls" not found Apr 22 17:53:48.710860 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:48.710775 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert podName:3400960b-c044-44c8-b84c-550071e3f93e nodeName:}" failed. No retries permitted until 2026-04-22 17:54:04.710767942 +0000 UTC m=+64.769671061 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert") pod "ingress-canary-vzgzr" (UID: "3400960b-c044-44c8-b84c-550071e3f93e") : secret "canary-serving-cert" not found Apr 22 17:53:49.214764 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:49.214731 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:49.214929 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:49.214901 2564 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 17:53:49.214990 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:53:49.214973 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls podName:61075974-e581-432e-8332-a5b8e03775a9 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:57.214954681 +0000 UTC m=+57.273857816 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls") pod "insights-runtime-extractor-clh2q" (UID: "61075974-e581-432e-8332-a5b8e03775a9") : secret "insights-runtime-extractor-tls" not found Apr 22 17:53:56.372258 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:56.372225 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-gbrpj\" (UID: \"65900b80-a3e4-4f4d-bb90-dc5cac183f53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" Apr 22 17:53:56.374414 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:56.374394 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/65900b80-a3e4-4f4d-bb90-dc5cac183f53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-gbrpj\" (UID: \"65900b80-a3e4-4f4d-bb90-dc5cac183f53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" Apr 22 17:53:56.455516 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:56.455492 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" Apr 22 17:53:56.567402 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:56.567373 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj"] Apr 22 17:53:56.570790 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:56.570760 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65900b80_a3e4_4f4d_bb90_dc5cac183f53.slice/crio-339950af4aba54fa97d0119aed3736b883bbcb9d122c2cb258436e1f2edfc83a WatchSource:0}: Error finding container 339950af4aba54fa97d0119aed3736b883bbcb9d122c2cb258436e1f2edfc83a: Status 404 returned error can't find the container with id 339950af4aba54fa97d0119aed3736b883bbcb9d122c2cb258436e1f2edfc83a Apr 22 17:53:56.756188 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:56.756148 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" event={"ID":"65900b80-a3e4-4f4d-bb90-dc5cac183f53","Type":"ContainerStarted","Data":"339950af4aba54fa97d0119aed3736b883bbcb9d122c2cb258436e1f2edfc83a"} Apr 22 17:53:57.277516 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:57.277478 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:57.280228 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:57.280202 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/61075974-e581-432e-8332-a5b8e03775a9-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-clh2q\" (UID: \"61075974-e581-432e-8332-a5b8e03775a9\") " pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:57.285117 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:57.285095 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-clh2q" Apr 22 17:53:57.616818 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:57.616796 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-clh2q"] Apr 22 17:53:57.620144 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:53:57.620121 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61075974_e581_432e_8332_a5b8e03775a9.slice/crio-25ef78eb048c948baa71f6f42a33ac7dbd7605fd678752721e3d990ad287dc76 WatchSource:0}: Error finding container 25ef78eb048c948baa71f6f42a33ac7dbd7605fd678752721e3d990ad287dc76: Status 404 returned error can't find the container with id 25ef78eb048c948baa71f6f42a33ac7dbd7605fd678752721e3d990ad287dc76 Apr 22 17:53:57.760425 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:57.760390 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-clh2q" event={"ID":"61075974-e581-432e-8332-a5b8e03775a9","Type":"ContainerStarted","Data":"3e6613b488e7354dfcae24cc0879964e7aafb49ed4659dec1ef6c3da11ebcd66"} Apr 22 17:53:57.760425 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:57.760428 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-clh2q" event={"ID":"61075974-e581-432e-8332-a5b8e03775a9","Type":"ContainerStarted","Data":"25ef78eb048c948baa71f6f42a33ac7dbd7605fd678752721e3d990ad287dc76"} Apr 22 17:53:57.761574 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:57.761550 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" event={"ID":"65900b80-a3e4-4f4d-bb90-dc5cac183f53","Type":"ContainerStarted","Data":"2752855c40aec04f14d7861716b7891b96b948e34c63ab4e3a120fe569c6de63"} Apr 22 17:53:57.777139 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:57.777097 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-gbrpj" podStartSLOduration=16.797069186999998 podStartE2EDuration="17.777085114s" podCreationTimestamp="2026-04-22 17:53:40 +0000 UTC" firstStartedPulling="2026-04-22 17:53:56.57257229 +0000 UTC m=+56.631475409" lastFinishedPulling="2026-04-22 17:53:57.5525882 +0000 UTC m=+57.611491336" observedRunningTime="2026-04-22 17:53:57.776149324 +0000 UTC m=+57.835052488" watchObservedRunningTime="2026-04-22 17:53:57.777085114 +0000 UTC m=+57.835988255" Apr 22 17:53:58.697701 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:58.697655 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vgh6" Apr 22 17:53:58.765680 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:53:58.765640 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-clh2q" event={"ID":"61075974-e581-432e-8332-a5b8e03775a9","Type":"ContainerStarted","Data":"2b37d1d047ed492d98a8f041f64c515382d8632353ff0eeff7eae8135752fd20"} Apr 22 17:54:00.773609 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:00.773575 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-clh2q" event={"ID":"61075974-e581-432e-8332-a5b8e03775a9","Type":"ContainerStarted","Data":"7be40217838c069b14e39c0c302d086a7684b3b806b36e08638c82ced43938aa"} Apr 22 17:54:00.792260 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:00.792218 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-insights/insights-runtime-extractor-clh2q" podStartSLOduration=17.143058035 podStartE2EDuration="19.792202749s" podCreationTimestamp="2026-04-22 17:53:41 +0000 UTC" firstStartedPulling="2026-04-22 17:53:57.673723618 +0000 UTC m=+57.732626745" lastFinishedPulling="2026-04-22 17:54:00.322868335 +0000 UTC m=+60.381771459" observedRunningTime="2026-04-22 17:54:00.791308883 +0000 UTC m=+60.850212024" watchObservedRunningTime="2026-04-22 17:54:00.792202749 +0000 UTC m=+60.851105915" Apr 22 17:54:01.856596 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:01.856565 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"] Apr 22 17:54:01.857076 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:54:01.856727 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz" podUID="512ab457-9185-4f8b-a45f-516b6ec97f63" Apr 22 17:54:01.922161 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:01.922101 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6fbdf466c4-c4xwm"] Apr 22 17:54:01.926336 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:01.926318 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:01.938414 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:01.938395 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6fbdf466c4-c4xwm"] Apr 22 17:54:02.014815 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.014787 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsn6p\" (UniqueName: \"kubernetes.io/projected/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-kube-api-access-hsn6p\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.014933 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.014824 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-image-registry-private-configuration\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.014933 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.014849 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-trusted-ca\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.014933 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.014920 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-registry-certificates\") pod 
\"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.015031 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.014979 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-registry-tls\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.015089 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.015073 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-ca-trust-extracted\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.015122 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.015097 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-installation-pull-secrets\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.015153 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.015123 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-bound-sa-token\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.116046 
ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.115991 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-image-registry-private-configuration\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.116046 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.116017 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-trusted-ca\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.116046 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.116043 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-registry-certificates\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.116242 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.116063 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-registry-tls\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.116242 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.116131 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-ca-trust-extracted\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.116242 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.116147 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-installation-pull-secrets\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.116242 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.116174 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-bound-sa-token\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.116242 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.116218 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsn6p\" (UniqueName: \"kubernetes.io/projected/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-kube-api-access-hsn6p\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.116591 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.116555 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-ca-trust-extracted\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 
17:54:02.116996 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.116973 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-registry-certificates\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.117267 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.117247 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-trusted-ca\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.118421 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.118389 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-image-registry-private-configuration\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.118523 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.118464 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-installation-pull-secrets\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.118620 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.118599 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-registry-tls\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.124569 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.124549 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsn6p\" (UniqueName: \"kubernetes.io/projected/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-kube-api-access-hsn6p\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.125857 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.125836 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31eed0e-7ff0-4e55-9e16-7fd5c607e632-bound-sa-token\") pod \"image-registry-6fbdf466c4-c4xwm\" (UID: \"a31eed0e-7ff0-4e55-9e16-7fd5c607e632\") " pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.237396 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.237371 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tlhcz\"" Apr 22 17:54:02.245829 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.245809 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:02.366009 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.365978 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6fbdf466c4-c4xwm"] Apr 22 17:54:02.370410 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:54:02.370379 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda31eed0e_7ff0_4e55_9e16_7fd5c607e632.slice/crio-d1fd01401afd5ee9f64b7cbac6352a7b382f6cf83071423b8f74a4d87103cac3 WatchSource:0}: Error finding container d1fd01401afd5ee9f64b7cbac6352a7b382f6cf83071423b8f74a4d87103cac3: Status 404 returned error can't find the container with id d1fd01401afd5ee9f64b7cbac6352a7b382f6cf83071423b8f74a4d87103cac3 Apr 22 17:54:02.782424 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.782385 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" event={"ID":"a31eed0e-7ff0-4e55-9e16-7fd5c607e632","Type":"ContainerStarted","Data":"e04152c83efd8b45f1f35e63e60c9fa0ad4fb1395d4d0e19ef549105cd10c0cc"} Apr 22 17:54:02.782424 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.782415 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz" Apr 22 17:54:02.782424 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.782424 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" event={"ID":"a31eed0e-7ff0-4e55-9e16-7fd5c607e632","Type":"ContainerStarted","Data":"d1fd01401afd5ee9f64b7cbac6352a7b382f6cf83071423b8f74a4d87103cac3"} Apr 22 17:54:02.786180 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.786162 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz" Apr 22 17:54:02.821762 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.821737 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/512ab457-9185-4f8b-a45f-516b6ec97f63-installation-pull-secrets\") pod \"512ab457-9185-4f8b-a45f-516b6ec97f63\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " Apr 22 17:54:02.821873 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.821777 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/512ab457-9185-4f8b-a45f-516b6ec97f63-ca-trust-extracted\") pod \"512ab457-9185-4f8b-a45f-516b6ec97f63\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " Apr 22 17:54:02.821873 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.821828 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/512ab457-9185-4f8b-a45f-516b6ec97f63-image-registry-private-configuration\") pod \"512ab457-9185-4f8b-a45f-516b6ec97f63\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " Apr 22 17:54:02.821873 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.821864 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-bound-sa-token\") pod \"512ab457-9185-4f8b-a45f-516b6ec97f63\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " Apr 22 17:54:02.822028 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.821896 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q6s6\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-kube-api-access-5q6s6\") pod \"512ab457-9185-4f8b-a45f-516b6ec97f63\" (UID: 
\"512ab457-9185-4f8b-a45f-516b6ec97f63\") " Apr 22 17:54:02.822028 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.821930 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/512ab457-9185-4f8b-a45f-516b6ec97f63-trusted-ca\") pod \"512ab457-9185-4f8b-a45f-516b6ec97f63\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " Apr 22 17:54:02.822028 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.821961 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-certificates\") pod \"512ab457-9185-4f8b-a45f-516b6ec97f63\" (UID: \"512ab457-9185-4f8b-a45f-516b6ec97f63\") " Apr 22 17:54:02.822175 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.822070 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/512ab457-9185-4f8b-a45f-516b6ec97f63-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "512ab457-9185-4f8b-a45f-516b6ec97f63" (UID: "512ab457-9185-4f8b-a45f-516b6ec97f63"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:54:02.823214 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.822326 2564 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/512ab457-9185-4f8b-a45f-516b6ec97f63-ca-trust-extracted\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:54:02.823214 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.822340 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/512ab457-9185-4f8b-a45f-516b6ec97f63-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "512ab457-9185-4f8b-a45f-516b6ec97f63" (UID: "512ab457-9185-4f8b-a45f-516b6ec97f63"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:54:02.823214 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.822948 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "512ab457-9185-4f8b-a45f-516b6ec97f63" (UID: "512ab457-9185-4f8b-a45f-516b6ec97f63"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:54:02.824117 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.824085 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-kube-api-access-5q6s6" (OuterVolumeSpecName: "kube-api-access-5q6s6") pod "512ab457-9185-4f8b-a45f-516b6ec97f63" (UID: "512ab457-9185-4f8b-a45f-516b6ec97f63"). InnerVolumeSpecName "kube-api-access-5q6s6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:54:02.824207 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.824132 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512ab457-9185-4f8b-a45f-516b6ec97f63-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "512ab457-9185-4f8b-a45f-516b6ec97f63" (UID: "512ab457-9185-4f8b-a45f-516b6ec97f63"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:54:02.824207 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.824175 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512ab457-9185-4f8b-a45f-516b6ec97f63-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "512ab457-9185-4f8b-a45f-516b6ec97f63" (UID: "512ab457-9185-4f8b-a45f-516b6ec97f63"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:54:02.824289 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.824204 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "512ab457-9185-4f8b-a45f-516b6ec97f63" (UID: "512ab457-9185-4f8b-a45f-516b6ec97f63"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:54:02.870390 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.870348 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" podStartSLOduration=1.870337213 podStartE2EDuration="1.870337213s" podCreationTimestamp="2026-04-22 17:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:54:02.869375255 +0000 UTC m=+62.928278397" watchObservedRunningTime="2026-04-22 17:54:02.870337213 +0000 UTC m=+62.929240355" Apr 22 17:54:02.923571 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.923549 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/512ab457-9185-4f8b-a45f-516b6ec97f63-trusted-ca\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:54:02.923571 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.923568 2564 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-certificates\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:54:02.923701 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.923578 2564 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/512ab457-9185-4f8b-a45f-516b6ec97f63-installation-pull-secrets\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:54:02.923701 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.923587 2564 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/512ab457-9185-4f8b-a45f-516b6ec97f63-image-registry-private-configuration\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:54:02.923701 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.923595 2564 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-bound-sa-token\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:54:02.923701 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:02.923604 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5q6s6\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-kube-api-access-5q6s6\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:54:03.057921 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.057857 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-rb77d"] Apr 22 17:54:03.062275 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.062256 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rb77d" Apr 22 17:54:03.065261 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.065243 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-4qkvv\"" Apr 22 17:54:03.065571 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.065556 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 17:54:03.079662 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.079640 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rb77d"] Apr 22 17:54:03.086901 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.086885 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 17:54:03.124574 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.124545 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p8mt\" (UniqueName: \"kubernetes.io/projected/61b29203-84c4-4fd8-a19e-9a8647316762-kube-api-access-5p8mt\") pod \"downloads-6bcc868b7-rb77d\" (UID: \"61b29203-84c4-4fd8-a19e-9a8647316762\") " pod="openshift-console/downloads-6bcc868b7-rb77d" Apr 22 17:54:03.204061 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.204039 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7nmdt"] Apr 22 17:54:03.206903 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.206890 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7nmdt" Apr 22 17:54:03.213237 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.213220 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 17:54:03.213433 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.213417 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-c6bgk\"" Apr 22 17:54:03.225031 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.225006 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p8mt\" (UniqueName: \"kubernetes.io/projected/61b29203-84c4-4fd8-a19e-9a8647316762-kube-api-access-5p8mt\") pod \"downloads-6bcc868b7-rb77d\" (UID: \"61b29203-84c4-4fd8-a19e-9a8647316762\") " pod="openshift-console/downloads-6bcc868b7-rb77d" Apr 22 17:54:03.233601 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.233577 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7nmdt"] Apr 22 17:54:03.250933 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.250907 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p8mt\" (UniqueName: \"kubernetes.io/projected/61b29203-84c4-4fd8-a19e-9a8647316762-kube-api-access-5p8mt\") pod \"downloads-6bcc868b7-rb77d\" (UID: \"61b29203-84c4-4fd8-a19e-9a8647316762\") " pod="openshift-console/downloads-6bcc868b7-rb77d" Apr 22 17:54:03.325422 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.325360 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4ed46985-5f70-4cda-a833-af43d0d97e60-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-57cf98b594-7nmdt\" (UID: \"4ed46985-5f70-4cda-a833-af43d0d97e60\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7nmdt" Apr 22 17:54:03.370618 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.370594 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rb77d" Apr 22 17:54:03.426763 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.426719 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4ed46985-5f70-4cda-a833-af43d0d97e60-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-7nmdt\" (UID: \"4ed46985-5f70-4cda-a833-af43d0d97e60\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7nmdt" Apr 22 17:54:03.428822 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.428798 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4ed46985-5f70-4cda-a833-af43d0d97e60-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-7nmdt\" (UID: \"4ed46985-5f70-4cda-a833-af43d0d97e60\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7nmdt" Apr 22 17:54:03.484797 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.484772 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rb77d"] Apr 22 17:54:03.487644 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:54:03.487623 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61b29203_84c4_4fd8_a19e_9a8647316762.slice/crio-3d93cc55080e955ea1a66207a9e9773a3520bf2f3569eb93ec1340ff17bd25d8 WatchSource:0}: Error finding container 3d93cc55080e955ea1a66207a9e9773a3520bf2f3569eb93ec1340ff17bd25d8: Status 404 returned error can't find the 
container with id 3d93cc55080e955ea1a66207a9e9773a3520bf2f3569eb93ec1340ff17bd25d8 Apr 22 17:54:03.515713 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.515692 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7nmdt" Apr 22 17:54:03.635325 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.635289 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7nmdt"] Apr 22 17:54:03.637921 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:54:03.637889 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ed46985_5f70_4cda_a833_af43d0d97e60.slice/crio-93720b2ee626bf1dcd3113a20957dbebe32ee276dfa8c8e94d0b35c2206516a3 WatchSource:0}: Error finding container 93720b2ee626bf1dcd3113a20957dbebe32ee276dfa8c8e94d0b35c2206516a3: Status 404 returned error can't find the container with id 93720b2ee626bf1dcd3113a20957dbebe32ee276dfa8c8e94d0b35c2206516a3 Apr 22 17:54:03.788484 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.788055 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7nmdt" event={"ID":"4ed46985-5f70-4cda-a833-af43d0d97e60","Type":"ContainerStarted","Data":"93720b2ee626bf1dcd3113a20957dbebe32ee276dfa8c8e94d0b35c2206516a3"} Apr 22 17:54:03.789098 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.789078 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rb77d" event={"ID":"61b29203-84c4-4fd8-a19e-9a8647316762","Type":"ContainerStarted","Data":"3d93cc55080e955ea1a66207a9e9773a3520bf2f3569eb93ec1340ff17bd25d8"} Apr 22 17:54:03.789215 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.789080 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7b55dbf8cb-pn2vz" Apr 22 17:54:03.789447 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.789430 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:03.840207 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.840150 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"] Apr 22 17:54:03.849772 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.849750 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7b55dbf8cb-pn2vz"] Apr 22 17:54:03.930411 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:03.930385 2564 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/512ab457-9185-4f8b-a45f-516b6ec97f63-registry-tls\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:54:04.553271 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:04.553237 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="512ab457-9185-4f8b-a45f-516b6ec97f63" path="/var/lib/kubelet/pods/512ab457-9185-4f8b-a45f-516b6ec97f63/volumes" Apr 22 17:54:04.736501 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:04.736427 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert\") pod \"ingress-canary-vzgzr\" (UID: \"3400960b-c044-44c8-b84c-550071e3f93e\") " pod="openshift-ingress-canary/ingress-canary-vzgzr" Apr 22 17:54:04.736501 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:04.736486 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls\") pod \"dns-default-8825w\" (UID: 
\"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w" Apr 22 17:54:04.739701 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:04.739653 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3400960b-c044-44c8-b84c-550071e3f93e-cert\") pod \"ingress-canary-vzgzr\" (UID: \"3400960b-c044-44c8-b84c-550071e3f93e\") " pod="openshift-ingress-canary/ingress-canary-vzgzr" Apr 22 17:54:04.740404 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:04.740364 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a00fffd-ba82-45c7-b379-68e21fd2f1f1-metrics-tls\") pod \"dns-default-8825w\" (UID: \"3a00fffd-ba82-45c7-b379-68e21fd2f1f1\") " pod="openshift-dns/dns-default-8825w" Apr 22 17:54:04.989409 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:04.989384 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dchxj\"" Apr 22 17:54:04.997797 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:04.997773 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vzgzr" Apr 22 17:54:04.998326 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:04.998138 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rhqnx\"" Apr 22 17:54:05.006361 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:05.006081 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8825w" Apr 22 17:54:05.140177 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:05.140025 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8825w"] Apr 22 17:54:05.143974 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:54:05.143944 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a00fffd_ba82_45c7_b379_68e21fd2f1f1.slice/crio-82e69fefbc9b035dba0dc3964ce47438495919cc35754264517062e6527b0ee3 WatchSource:0}: Error finding container 82e69fefbc9b035dba0dc3964ce47438495919cc35754264517062e6527b0ee3: Status 404 returned error can't find the container with id 82e69fefbc9b035dba0dc3964ce47438495919cc35754264517062e6527b0ee3 Apr 22 17:54:05.156068 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:05.156028 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vzgzr"] Apr 22 17:54:05.160118 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:54:05.160092 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3400960b_c044_44c8_b84c_550071e3f93e.slice/crio-a806ec12950bc262a4ec747bebd4fa0854ed6625436132fb6f9c5633c564a851 WatchSource:0}: Error finding container a806ec12950bc262a4ec747bebd4fa0854ed6625436132fb6f9c5633c564a851: Status 404 returned error can't find the container with id a806ec12950bc262a4ec747bebd4fa0854ed6625436132fb6f9c5633c564a851 Apr 22 17:54:05.800383 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:05.799312 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7nmdt" event={"ID":"4ed46985-5f70-4cda-a833-af43d0d97e60","Type":"ContainerStarted","Data":"e39d2d0b575448b8388cb84b0c3a7113b66768911e307ee34edadb3b660145c9"} Apr 22 17:54:05.800383 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:05.800266 
2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7nmdt" Apr 22 17:54:05.802101 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:05.801920 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8825w" event={"ID":"3a00fffd-ba82-45c7-b379-68e21fd2f1f1","Type":"ContainerStarted","Data":"82e69fefbc9b035dba0dc3964ce47438495919cc35754264517062e6527b0ee3"} Apr 22 17:54:05.805124 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:05.805083 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vzgzr" event={"ID":"3400960b-c044-44c8-b84c-550071e3f93e","Type":"ContainerStarted","Data":"a806ec12950bc262a4ec747bebd4fa0854ed6625436132fb6f9c5633c564a851"} Apr 22 17:54:05.807165 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:05.807135 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7nmdt" Apr 22 17:54:05.816984 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:05.816433 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7nmdt" podStartSLOduration=1.555960699 podStartE2EDuration="2.816417813s" podCreationTimestamp="2026-04-22 17:54:03 +0000 UTC" firstStartedPulling="2026-04-22 17:54:03.639688336 +0000 UTC m=+63.698591456" lastFinishedPulling="2026-04-22 17:54:04.900145437 +0000 UTC m=+64.959048570" observedRunningTime="2026-04-22 17:54:05.814723817 +0000 UTC m=+65.873626958" watchObservedRunningTime="2026-04-22 17:54:05.816417813 +0000 UTC m=+65.875320990" Apr 22 17:54:06.193207 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.192011 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gn6mt"] Apr 22 17:54:06.195493 ip-10-0-128-219 kubenswrapper[2564]: 
I0422 17:54:06.195466 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" Apr 22 17:54:06.198131 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.197961 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 17:54:06.198855 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.198288 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 17:54:06.198855 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.198498 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-nk92n\"" Apr 22 17:54:06.198855 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.198692 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 17:54:06.200619 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.199895 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 17:54:06.200619 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.200085 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 17:54:06.208002 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.207921 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gn6mt"] Apr 22 17:54:06.249482 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.249296 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/d1bc1717-c469-4566-bd31-fca0ae08a007-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gn6mt\" (UID: \"d1bc1717-c469-4566-bd31-fca0ae08a007\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" Apr 22 17:54:06.249482 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.249369 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1bc1717-c469-4566-bd31-fca0ae08a007-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gn6mt\" (UID: \"d1bc1717-c469-4566-bd31-fca0ae08a007\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" Apr 22 17:54:06.249482 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.249434 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs\") pod \"network-metrics-daemon-t6kpj\" (UID: \"1273b1fd-25f6-4315-a692-c599fb3e48b7\") " pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:54:06.249795 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.249486 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxvft\" (UniqueName: \"kubernetes.io/projected/d1bc1717-c469-4566-bd31-fca0ae08a007-kube-api-access-dxvft\") pod \"prometheus-operator-5676c8c784-gn6mt\" (UID: \"d1bc1717-c469-4566-bd31-fca0ae08a007\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" Apr 22 17:54:06.249795 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.249517 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d1bc1717-c469-4566-bd31-fca0ae08a007-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gn6mt\" (UID: 
\"d1bc1717-c469-4566-bd31-fca0ae08a007\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" Apr 22 17:54:06.252258 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.252032 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:54:06.262832 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.262798 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1273b1fd-25f6-4315-a692-c599fb3e48b7-metrics-certs\") pod \"network-metrics-daemon-t6kpj\" (UID: \"1273b1fd-25f6-4315-a692-c599fb3e48b7\") " pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:54:06.351354 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.350542 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxvft\" (UniqueName: \"kubernetes.io/projected/d1bc1717-c469-4566-bd31-fca0ae08a007-kube-api-access-dxvft\") pod \"prometheus-operator-5676c8c784-gn6mt\" (UID: \"d1bc1717-c469-4566-bd31-fca0ae08a007\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" Apr 22 17:54:06.351354 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.350590 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d1bc1717-c469-4566-bd31-fca0ae08a007-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gn6mt\" (UID: \"d1bc1717-c469-4566-bd31-fca0ae08a007\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" Apr 22 17:54:06.351354 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.350642 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d1bc1717-c469-4566-bd31-fca0ae08a007-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gn6mt\" (UID: 
\"d1bc1717-c469-4566-bd31-fca0ae08a007\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" Apr 22 17:54:06.351354 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.350695 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1bc1717-c469-4566-bd31-fca0ae08a007-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gn6mt\" (UID: \"d1bc1717-c469-4566-bd31-fca0ae08a007\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" Apr 22 17:54:06.351731 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.351426 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1bc1717-c469-4566-bd31-fca0ae08a007-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gn6mt\" (UID: \"d1bc1717-c469-4566-bd31-fca0ae08a007\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" Apr 22 17:54:06.354314 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.354289 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d1bc1717-c469-4566-bd31-fca0ae08a007-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gn6mt\" (UID: \"d1bc1717-c469-4566-bd31-fca0ae08a007\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" Apr 22 17:54:06.354439 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.354404 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d1bc1717-c469-4566-bd31-fca0ae08a007-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gn6mt\" (UID: \"d1bc1717-c469-4566-bd31-fca0ae08a007\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" Apr 22 17:54:06.359745 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.359724 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxvft\" (UniqueName: \"kubernetes.io/projected/d1bc1717-c469-4566-bd31-fca0ae08a007-kube-api-access-dxvft\") pod \"prometheus-operator-5676c8c784-gn6mt\" (UID: \"d1bc1717-c469-4566-bd31-fca0ae08a007\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" Apr 22 17:54:06.365936 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.365910 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7sslk\"" Apr 22 17:54:06.374235 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.374208 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6kpj" Apr 22 17:54:06.451418 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.451354 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwsb\" (UniqueName: \"kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb\") pod \"network-check-target-kbkmw\" (UID: \"3c596acd-7332-4aab-afbb-73b8773fb825\") " pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:54:06.454259 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.454234 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 17:54:06.465152 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.465131 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 17:54:06.475310 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.475268 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wwsb\" (UniqueName: \"kubernetes.io/projected/3c596acd-7332-4aab-afbb-73b8773fb825-kube-api-access-8wwsb\") pod \"network-check-target-kbkmw\" (UID: 
\"3c596acd-7332-4aab-afbb-73b8773fb825\") " pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:54:06.511146 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.511123 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" Apr 22 17:54:06.661120 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.661085 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c9cn8\"" Apr 22 17:54:06.669602 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:06.669397 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:54:07.920248 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:07.920221 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kbkmw"] Apr 22 17:54:07.923728 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:54:07.923654 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c596acd_7332_4aab_afbb_73b8773fb825.slice/crio-d9fa9fc38313e5b0c9a5eb10c208dbc10e9961e4786c05eba1e42371bf66267c WatchSource:0}: Error finding container d9fa9fc38313e5b0c9a5eb10c208dbc10e9961e4786c05eba1e42371bf66267c: Status 404 returned error can't find the container with id d9fa9fc38313e5b0c9a5eb10c208dbc10e9961e4786c05eba1e42371bf66267c Apr 22 17:54:08.141455 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.141403 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gn6mt"] Apr 22 17:54:08.144279 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.144219 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-t6kpj"] Apr 22 17:54:08.149100 ip-10-0-128-219 kubenswrapper[2564]: W0422 
17:54:08.149059 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1273b1fd_25f6_4315_a692_c599fb3e48b7.slice/crio-b16a192816530517a9649b3f1a8b5b5d89bd291b9b42f775304539bab36eb286 WatchSource:0}: Error finding container b16a192816530517a9649b3f1a8b5b5d89bd291b9b42f775304539bab36eb286: Status 404 returned error can't find the container with id b16a192816530517a9649b3f1a8b5b5d89bd291b9b42f775304539bab36eb286 Apr 22 17:54:08.149416 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:54:08.149394 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1bc1717_c469_4566_bd31_fca0ae08a007.slice/crio-0973515ca4e346cfdbc6218f2739bec0f909325fad26858cf655e0c7cf7f7089 WatchSource:0}: Error finding container 0973515ca4e346cfdbc6218f2739bec0f909325fad26858cf655e0c7cf7f7089: Status 404 returned error can't find the container with id 0973515ca4e346cfdbc6218f2739bec0f909325fad26858cf655e0c7cf7f7089 Apr 22 17:54:08.678925 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.678895 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c49d8574b-lccn2"] Apr 22 17:54:08.698240 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.697271 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c49d8574b-lccn2"] Apr 22 17:54:08.698240 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.697401 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.702160 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.702122 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 17:54:08.702390 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.702367 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 17:54:08.702657 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.702633 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jtrhn\"" Apr 22 17:54:08.702951 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.702936 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 17:54:08.703171 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.703153 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 17:54:08.703999 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.703461 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 17:54:08.768627 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.768596 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-serving-cert\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.768785 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.768642 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-config\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.768785 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.768719 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-service-ca\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.768785 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.768751 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-oauth-serving-cert\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.768926 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.768836 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htcsf\" (UniqueName: \"kubernetes.io/projected/62a92e11-84ca-495c-9e18-a2ab4073eb36-kube-api-access-htcsf\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.768926 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.768866 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-oauth-config\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.835087 ip-10-0-128-219 
kubenswrapper[2564]: I0422 17:54:08.835040 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8825w" event={"ID":"3a00fffd-ba82-45c7-b379-68e21fd2f1f1","Type":"ContainerStarted","Data":"755dc968e5c1d5c42c443e2281dd90f572976f6a917e7539292ba7bc89d4ce06"} Apr 22 17:54:08.835087 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.835080 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8825w" event={"ID":"3a00fffd-ba82-45c7-b379-68e21fd2f1f1","Type":"ContainerStarted","Data":"4ec5c335fb9892e9595b4d3354c2f7ca19f6c711d239add5276b5dd6c5c74193"} Apr 22 17:54:08.835540 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.835379 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8825w" Apr 22 17:54:08.837467 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.837409 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vzgzr" event={"ID":"3400960b-c044-44c8-b84c-550071e3f93e","Type":"ContainerStarted","Data":"18daed846f2b60ade6e3f5cf3037e72a28225b6aaea56329bc510adcf1f0b302"} Apr 22 17:54:08.840627 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.840600 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t6kpj" event={"ID":"1273b1fd-25f6-4315-a692-c599fb3e48b7","Type":"ContainerStarted","Data":"b16a192816530517a9649b3f1a8b5b5d89bd291b9b42f775304539bab36eb286"} Apr 22 17:54:08.842084 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.842062 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kbkmw" event={"ID":"3c596acd-7332-4aab-afbb-73b8773fb825","Type":"ContainerStarted","Data":"d9fa9fc38313e5b0c9a5eb10c208dbc10e9961e4786c05eba1e42371bf66267c"} Apr 22 17:54:08.844516 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.844481 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" event={"ID":"d1bc1717-c469-4566-bd31-fca0ae08a007","Type":"ContainerStarted","Data":"0973515ca4e346cfdbc6218f2739bec0f909325fad26858cf655e0c7cf7f7089"} Apr 22 17:54:08.858388 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.858342 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8825w" podStartSLOduration=34.275235865 podStartE2EDuration="36.858326628s" podCreationTimestamp="2026-04-22 17:53:32 +0000 UTC" firstStartedPulling="2026-04-22 17:54:05.146555227 +0000 UTC m=+65.205458350" lastFinishedPulling="2026-04-22 17:54:07.729645979 +0000 UTC m=+67.788549113" observedRunningTime="2026-04-22 17:54:08.856698565 +0000 UTC m=+68.915601701" watchObservedRunningTime="2026-04-22 17:54:08.858326628 +0000 UTC m=+68.917229777" Apr 22 17:54:08.870043 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.870018 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htcsf\" (UniqueName: \"kubernetes.io/projected/62a92e11-84ca-495c-9e18-a2ab4073eb36-kube-api-access-htcsf\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.870175 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.870060 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-oauth-config\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.870175 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.870092 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-serving-cert\") pod 
\"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.870175 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.870122 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-config\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.870328 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.870196 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-service-ca\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.870328 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.870224 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-oauth-serving-cert\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.871123 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.871099 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-config\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.871689 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.871654 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-service-ca\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.872190 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.872147 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-oauth-serving-cert\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.877233 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.877155 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-oauth-config\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.881556 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.881531 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htcsf\" (UniqueName: \"kubernetes.io/projected/62a92e11-84ca-495c-9e18-a2ab4073eb36-kube-api-access-htcsf\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:08.881946 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.881779 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vzgzr" podStartSLOduration=34.309876454 podStartE2EDuration="36.881766766s" podCreationTimestamp="2026-04-22 17:53:32 +0000 UTC" firstStartedPulling="2026-04-22 17:54:05.162432368 +0000 UTC m=+65.221335487" lastFinishedPulling="2026-04-22 17:54:07.734322663 +0000 UTC m=+67.793225799" observedRunningTime="2026-04-22 17:54:08.881754294 +0000 UTC 
m=+68.940657438" watchObservedRunningTime="2026-04-22 17:54:08.881766766 +0000 UTC m=+68.940669909" Apr 22 17:54:08.882806 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:08.882783 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-serving-cert\") pod \"console-5c49d8574b-lccn2\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:09.011790 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:09.011465 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:09.175327 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:09.175292 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c49d8574b-lccn2"] Apr 22 17:54:09.182169 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:54:09.182128 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62a92e11_84ca_495c_9e18_a2ab4073eb36.slice/crio-e71515bba644af0410a41ade2a1a1d623d920095882280eef1d7a024e61c1cb3 WatchSource:0}: Error finding container e71515bba644af0410a41ade2a1a1d623d920095882280eef1d7a024e61c1cb3: Status 404 returned error can't find the container with id e71515bba644af0410a41ade2a1a1d623d920095882280eef1d7a024e61c1cb3 Apr 22 17:54:09.850929 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:09.850890 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c49d8574b-lccn2" event={"ID":"62a92e11-84ca-495c-9e18-a2ab4073eb36","Type":"ContainerStarted","Data":"e71515bba644af0410a41ade2a1a1d623d920095882280eef1d7a024e61c1cb3"} Apr 22 17:54:10.857973 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:10.857885 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" 
event={"ID":"d1bc1717-c469-4566-bd31-fca0ae08a007","Type":"ContainerStarted","Data":"7bc48725aa9879420ae80ed6e712195fb7b78799d68dff89fe49d588943d7fd7"} Apr 22 17:54:10.857973 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:10.857933 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" event={"ID":"d1bc1717-c469-4566-bd31-fca0ae08a007","Type":"ContainerStarted","Data":"b01ac928eaa1bab4dc962e0244383ace202d11f276a8d06b87d5cb539f880725"} Apr 22 17:54:10.860995 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:10.860790 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t6kpj" event={"ID":"1273b1fd-25f6-4315-a692-c599fb3e48b7","Type":"ContainerStarted","Data":"c24a12ddcca1a8ece063fee7f863083d11a92e29b4fdd266ff1844a40664539e"} Apr 22 17:54:10.860995 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:10.860825 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t6kpj" event={"ID":"1273b1fd-25f6-4315-a692-c599fb3e48b7","Type":"ContainerStarted","Data":"dd2b671ea65cc06e2560d08e7df5805166a3c86d4d87dfb8c8e0c769eb1d3900"} Apr 22 17:54:10.878422 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:10.878364 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-gn6mt" podStartSLOduration=2.8541240930000003 podStartE2EDuration="4.878343075s" podCreationTimestamp="2026-04-22 17:54:06 +0000 UTC" firstStartedPulling="2026-04-22 17:54:08.152476328 +0000 UTC m=+68.211379448" lastFinishedPulling="2026-04-22 17:54:10.176695307 +0000 UTC m=+70.235598430" observedRunningTime="2026-04-22 17:54:10.877368907 +0000 UTC m=+70.936272052" watchObservedRunningTime="2026-04-22 17:54:10.878343075 +0000 UTC m=+70.937246241" Apr 22 17:54:12.635742 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.635203 2564 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-t6kpj" podStartSLOduration=70.611413161 podStartE2EDuration="1m12.635179096s" podCreationTimestamp="2026-04-22 17:53:00 +0000 UTC" firstStartedPulling="2026-04-22 17:54:08.151535916 +0000 UTC m=+68.210439038" lastFinishedPulling="2026-04-22 17:54:10.175301847 +0000 UTC m=+70.234204973" observedRunningTime="2026-04-22 17:54:10.896407852 +0000 UTC m=+70.955310994" watchObservedRunningTime="2026-04-22 17:54:12.635179096 +0000 UTC m=+72.694082239" Apr 22 17:54:12.635742 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.635649 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-g4nn6"] Apr 22 17:54:12.640130 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.640106 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.644047 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.643635 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-zb7n5\"" Apr 22 17:54:12.646088 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.645892 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 17:54:12.650893 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.650612 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 17:54:12.650958 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.650942 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 17:54:12.709204 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.709166 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" 
(UniqueName: \"kubernetes.io/empty-dir/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-textfile\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.709382 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.709214 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-wtmp\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.709382 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.709283 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfvkz\" (UniqueName: \"kubernetes.io/projected/886952ac-1b2d-4421-a069-8ec990d20254-kube-api-access-bfvkz\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.709382 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.709322 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/886952ac-1b2d-4421-a069-8ec990d20254-root\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.709382 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.709357 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-tls\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.709589 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.709399 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-accelerators-collector-config\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.709589 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.709461 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.709589 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.709500 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/886952ac-1b2d-4421-a069-8ec990d20254-metrics-client-ca\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.709589 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.709524 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/886952ac-1b2d-4421-a069-8ec990d20254-sys\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.810499 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.810456 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/886952ac-1b2d-4421-a069-8ec990d20254-metrics-client-ca\") pod \"node-exporter-g4nn6\" (UID: 
\"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.810692 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.810512 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/886952ac-1b2d-4421-a069-8ec990d20254-sys\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.810692 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.810575 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-textfile\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.810692 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.810605 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-wtmp\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.810692 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.810644 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfvkz\" (UniqueName: \"kubernetes.io/projected/886952ac-1b2d-4421-a069-8ec990d20254-kube-api-access-bfvkz\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.810692 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.810690 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/886952ac-1b2d-4421-a069-8ec990d20254-root\") pod \"node-exporter-g4nn6\" 
(UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.810955 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.810716 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-tls\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.810955 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.810751 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-accelerators-collector-config\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.810955 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.810817 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.811520 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.811172 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/886952ac-1b2d-4421-a069-8ec990d20254-metrics-client-ca\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.811520 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.811204 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/886952ac-1b2d-4421-a069-8ec990d20254-sys\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.811520 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.811331 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-wtmp\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.811520 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.811383 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/886952ac-1b2d-4421-a069-8ec990d20254-root\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.811520 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.811442 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-textfile\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.812338 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.812296 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-accelerators-collector-config\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.813901 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.813858 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.814999 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.814944 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/886952ac-1b2d-4421-a069-8ec990d20254-node-exporter-tls\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.820396 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.820347 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfvkz\" (UniqueName: \"kubernetes.io/projected/886952ac-1b2d-4421-a069-8ec990d20254-kube-api-access-bfvkz\") pod \"node-exporter-g4nn6\" (UID: \"886952ac-1b2d-4421-a069-8ec990d20254\") " pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:12.955230 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:12.955190 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-g4nn6" Apr 22 17:54:13.287854 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:54:13.287818 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod886952ac_1b2d_4421_a069_8ec990d20254.slice/crio-76a9cd0ce3fd2d67c2d7f7062ae8358895509bb9d970304d9831ce6057d9f4b5 WatchSource:0}: Error finding container 76a9cd0ce3fd2d67c2d7f7062ae8358895509bb9d970304d9831ce6057d9f4b5: Status 404 returned error can't find the container with id 76a9cd0ce3fd2d67c2d7f7062ae8358895509bb9d970304d9831ce6057d9f4b5 Apr 22 17:54:13.871816 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:13.871775 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kbkmw" event={"ID":"3c596acd-7332-4aab-afbb-73b8773fb825","Type":"ContainerStarted","Data":"ca5aa7cd23fb5fecc14f20a4f64b71c8a49aa701486ff84419685adfcff6c6c3"} Apr 22 17:54:13.872299 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:13.871849 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:54:13.873948 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:13.873915 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c49d8574b-lccn2" event={"ID":"62a92e11-84ca-495c-9e18-a2ab4073eb36","Type":"ContainerStarted","Data":"cbb2226868dbd4d6b4d9b86a3951fbe98bf5881b8bdbd065b3976aa8fc9a0b82"} Apr 22 17:54:13.875493 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:13.875470 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-g4nn6" event={"ID":"886952ac-1b2d-4421-a069-8ec990d20254","Type":"ContainerStarted","Data":"76a9cd0ce3fd2d67c2d7f7062ae8358895509bb9d970304d9831ce6057d9f4b5"} Apr 22 17:54:13.887969 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:13.887907 2564 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kbkmw" podStartSLOduration=68.523850169 podStartE2EDuration="1m13.887892305s" podCreationTimestamp="2026-04-22 17:53:00 +0000 UTC" firstStartedPulling="2026-04-22 17:54:07.926609201 +0000 UTC m=+67.985512336" lastFinishedPulling="2026-04-22 17:54:13.290651344 +0000 UTC m=+73.349554472" observedRunningTime="2026-04-22 17:54:13.886687195 +0000 UTC m=+73.945590338" watchObservedRunningTime="2026-04-22 17:54:13.887892305 +0000 UTC m=+73.946795449" Apr 22 17:54:13.905641 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:13.905092 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c49d8574b-lccn2" podStartSLOduration=1.756336015 podStartE2EDuration="5.905076349s" podCreationTimestamp="2026-04-22 17:54:08 +0000 UTC" firstStartedPulling="2026-04-22 17:54:09.184525684 +0000 UTC m=+69.243428809" lastFinishedPulling="2026-04-22 17:54:13.333266011 +0000 UTC m=+73.392169143" observedRunningTime="2026-04-22 17:54:13.904789069 +0000 UTC m=+73.963692210" watchObservedRunningTime="2026-04-22 17:54:13.905076349 +0000 UTC m=+73.963979493" Apr 22 17:54:18.854792 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:18.853877 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8825w" Apr 22 17:54:19.013543 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.013502 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:19.013751 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.013558 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:19.018507 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.018480 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:19.472618 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.472580 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76796c8dbd-jj5s9"] Apr 22 17:54:19.478055 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.478035 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.486998 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.486170 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76796c8dbd-jj5s9"] Apr 22 17:54:19.486998 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.486609 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 17:54:19.569977 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.569943 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-config\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.569977 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.569982 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-trusted-ca-bundle\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.570185 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.570003 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-oauth-config\") 
pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.570185 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.570018 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-service-ca\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.570185 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.570087 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-serving-cert\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.570185 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.570144 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wp5l\" (UniqueName: \"kubernetes.io/projected/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-kube-api-access-6wp5l\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.570321 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.570274 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-oauth-serving-cert\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.671316 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.671280 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-oauth-serving-cert\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.671494 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.671327 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-config\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.671494 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.671389 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-trusted-ca-bundle\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.671494 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.671432 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-oauth-config\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.671494 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.671448 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-service-ca\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.671494 ip-10-0-128-219 
kubenswrapper[2564]: I0422 17:54:19.671485 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-serving-cert\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.671784 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.671524 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wp5l\" (UniqueName: \"kubernetes.io/projected/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-kube-api-access-6wp5l\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.672159 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.672131 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-config\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.672360 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.672311 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-trusted-ca-bundle\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.672475 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.672404 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-service-ca\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" 
Apr 22 17:54:19.672538 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.672472 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-oauth-serving-cert\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.674498 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.674475 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-serving-cert\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.674578 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.674528 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-oauth-config\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.680006 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.679982 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wp5l\" (UniqueName: \"kubernetes.io/projected/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-kube-api-access-6wp5l\") pod \"console-76796c8dbd-jj5s9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.789689 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.789595 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:19.897850 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:19.897818 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:21.162134 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:21.162106 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76796c8dbd-jj5s9"] Apr 22 17:54:21.165045 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:54:21.165016 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32e6803c_9d24_4af6_ab54_1a1f7a9131f9.slice/crio-a26f03f3697b9783e5156f9f5ef2997c05380e050bc673398011a908e6c226f3 WatchSource:0}: Error finding container a26f03f3697b9783e5156f9f5ef2997c05380e050bc673398011a908e6c226f3: Status 404 returned error can't find the container with id a26f03f3697b9783e5156f9f5ef2997c05380e050bc673398011a908e6c226f3 Apr 22 17:54:21.902824 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:21.902784 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76796c8dbd-jj5s9" event={"ID":"32e6803c-9d24-4af6-ab54-1a1f7a9131f9","Type":"ContainerStarted","Data":"dd534710acfb5cc3fc3525cc4ec764bd25728d589e937c5090425796fc89a43d"} Apr 22 17:54:21.902824 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:21.902823 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76796c8dbd-jj5s9" event={"ID":"32e6803c-9d24-4af6-ab54-1a1f7a9131f9","Type":"ContainerStarted","Data":"a26f03f3697b9783e5156f9f5ef2997c05380e050bc673398011a908e6c226f3"} Apr 22 17:54:21.904421 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:21.904396 2564 generic.go:358] "Generic (PLEG): container finished" podID="886952ac-1b2d-4421-a069-8ec990d20254" containerID="e273290d71d942ce6bc7c54001741cd8e8eeca0165dd083691c837b1221cbd48" exitCode=0 Apr 22 17:54:21.904544 
ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:21.904431 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-g4nn6" event={"ID":"886952ac-1b2d-4421-a069-8ec990d20254","Type":"ContainerDied","Data":"e273290d71d942ce6bc7c54001741cd8e8eeca0165dd083691c837b1221cbd48"} Apr 22 17:54:21.905963 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:21.905939 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rb77d" event={"ID":"61b29203-84c4-4fd8-a19e-9a8647316762","Type":"ContainerStarted","Data":"93c1090dcb7c0f1c2a45faa00a3ce6982444f3ac86ca3e7b7b5c547edd053b57"} Apr 22 17:54:21.906170 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:21.906148 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-rb77d" Apr 22 17:54:21.917493 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:21.917475 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-rb77d" Apr 22 17:54:21.924116 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:21.924069 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76796c8dbd-jj5s9" podStartSLOduration=2.924052735 podStartE2EDuration="2.924052735s" podCreationTimestamp="2026-04-22 17:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:54:21.921784795 +0000 UTC m=+81.980687936" watchObservedRunningTime="2026-04-22 17:54:21.924052735 +0000 UTC m=+81.982955879" Apr 22 17:54:21.940646 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:21.940597 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-rb77d" podStartSLOduration=1.270272012 podStartE2EDuration="18.940582667s" podCreationTimestamp="2026-04-22 17:54:03 +0000 UTC" 
firstStartedPulling="2026-04-22 17:54:03.489555838 +0000 UTC m=+63.548458959" lastFinishedPulling="2026-04-22 17:54:21.159866481 +0000 UTC m=+81.218769614" observedRunningTime="2026-04-22 17:54:21.939507033 +0000 UTC m=+81.998410176" watchObservedRunningTime="2026-04-22 17:54:21.940582667 +0000 UTC m=+81.999485810" Apr 22 17:54:22.250433 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:22.250394 2564 patch_prober.go:28] interesting pod/image-registry-6fbdf466c4-c4xwm container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 17:54:22.250911 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:22.250453 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" podUID="a31eed0e-7ff0-4e55-9e16-7fd5c607e632" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 17:54:22.911388 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:22.911357 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-g4nn6" event={"ID":"886952ac-1b2d-4421-a069-8ec990d20254","Type":"ContainerStarted","Data":"336e600245060af63b1180c8d5cdcd7996db04105980b5393be243f52dfd1870"} Apr 22 17:54:22.911585 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:22.911558 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-g4nn6" event={"ID":"886952ac-1b2d-4421-a069-8ec990d20254","Type":"ContainerStarted","Data":"5c905acf3534a4d42b9a0477c82e024b5e7ca67fe1728e2f62daeb34e723cfdc"} Apr 22 17:54:22.932075 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:22.932032 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-g4nn6" podStartSLOduration=10.21070119 
podStartE2EDuration="10.932015623s" podCreationTimestamp="2026-04-22 17:54:12 +0000 UTC" firstStartedPulling="2026-04-22 17:54:13.289870858 +0000 UTC m=+73.348773980" lastFinishedPulling="2026-04-22 17:54:14.011185291 +0000 UTC m=+74.070088413" observedRunningTime="2026-04-22 17:54:22.93063274 +0000 UTC m=+82.989535877" watchObservedRunningTime="2026-04-22 17:54:22.932015623 +0000 UTC m=+82.990918768" Apr 22 17:54:24.796252 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:24.796220 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6fbdf466c4-c4xwm" Apr 22 17:54:29.790709 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:29.790649 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:29.790709 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:29.790703 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:29.796017 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:29.795992 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:29.936613 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:29.936583 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:54:29.982386 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:29.982351 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c49d8574b-lccn2"] Apr 22 17:54:44.882303 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:44.882267 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kbkmw" Apr 22 17:54:55.005810 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.005772 2564 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="openshift-console/console-5c49d8574b-lccn2" podUID="62a92e11-84ca-495c-9e18-a2ab4073eb36" containerName="console" containerID="cri-o://cbb2226868dbd4d6b4d9b86a3951fbe98bf5881b8bdbd065b3976aa8fc9a0b82" gracePeriod=15 Apr 22 17:54:55.259375 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.259314 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c49d8574b-lccn2_62a92e11-84ca-495c-9e18-a2ab4073eb36/console/0.log" Apr 22 17:54:55.259476 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.259378 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:55.342279 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.342245 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-oauth-serving-cert\") pod \"62a92e11-84ca-495c-9e18-a2ab4073eb36\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " Apr 22 17:54:55.342457 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.342293 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htcsf\" (UniqueName: \"kubernetes.io/projected/62a92e11-84ca-495c-9e18-a2ab4073eb36-kube-api-access-htcsf\") pod \"62a92e11-84ca-495c-9e18-a2ab4073eb36\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " Apr 22 17:54:55.342457 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.342312 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-oauth-config\") pod \"62a92e11-84ca-495c-9e18-a2ab4073eb36\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " Apr 22 17:54:55.342457 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.342336 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-config\") pod \"62a92e11-84ca-495c-9e18-a2ab4073eb36\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " Apr 22 17:54:55.342457 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.342384 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-serving-cert\") pod \"62a92e11-84ca-495c-9e18-a2ab4073eb36\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " Apr 22 17:54:55.342457 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.342416 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-service-ca\") pod \"62a92e11-84ca-495c-9e18-a2ab4073eb36\" (UID: \"62a92e11-84ca-495c-9e18-a2ab4073eb36\") " Apr 22 17:54:55.342816 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.342789 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "62a92e11-84ca-495c-9e18-a2ab4073eb36" (UID: "62a92e11-84ca-495c-9e18-a2ab4073eb36"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:54:55.342904 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.342869 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-config" (OuterVolumeSpecName: "console-config") pod "62a92e11-84ca-495c-9e18-a2ab4073eb36" (UID: "62a92e11-84ca-495c-9e18-a2ab4073eb36"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:54:55.342904 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.342874 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-service-ca" (OuterVolumeSpecName: "service-ca") pod "62a92e11-84ca-495c-9e18-a2ab4073eb36" (UID: "62a92e11-84ca-495c-9e18-a2ab4073eb36"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:54:55.344602 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.344576 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62a92e11-84ca-495c-9e18-a2ab4073eb36-kube-api-access-htcsf" (OuterVolumeSpecName: "kube-api-access-htcsf") pod "62a92e11-84ca-495c-9e18-a2ab4073eb36" (UID: "62a92e11-84ca-495c-9e18-a2ab4073eb36"). InnerVolumeSpecName "kube-api-access-htcsf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:54:55.344700 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.344603 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "62a92e11-84ca-495c-9e18-a2ab4073eb36" (UID: "62a92e11-84ca-495c-9e18-a2ab4073eb36"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:54:55.344700 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.344617 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "62a92e11-84ca-495c-9e18-a2ab4073eb36" (UID: "62a92e11-84ca-495c-9e18-a2ab4073eb36"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:54:55.443077 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.443040 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-serving-cert\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:54:55.443077 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.443071 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-service-ca\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:54:55.443077 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.443082 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-oauth-serving-cert\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:54:55.443304 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.443091 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-htcsf\" (UniqueName: \"kubernetes.io/projected/62a92e11-84ca-495c-9e18-a2ab4073eb36-kube-api-access-htcsf\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:54:55.443304 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.443101 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-oauth-config\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:54:55.443304 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:55.443112 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62a92e11-84ca-495c-9e18-a2ab4073eb36-console-config\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:54:56.000302 ip-10-0-128-219 
kubenswrapper[2564]: I0422 17:54:56.000277 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c49d8574b-lccn2_62a92e11-84ca-495c-9e18-a2ab4073eb36/console/0.log" Apr 22 17:54:56.000553 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:56.000314 2564 generic.go:358] "Generic (PLEG): container finished" podID="62a92e11-84ca-495c-9e18-a2ab4073eb36" containerID="cbb2226868dbd4d6b4d9b86a3951fbe98bf5881b8bdbd065b3976aa8fc9a0b82" exitCode=2 Apr 22 17:54:56.000553 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:56.000362 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c49d8574b-lccn2" event={"ID":"62a92e11-84ca-495c-9e18-a2ab4073eb36","Type":"ContainerDied","Data":"cbb2226868dbd4d6b4d9b86a3951fbe98bf5881b8bdbd065b3976aa8fc9a0b82"} Apr 22 17:54:56.000553 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:56.000379 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c49d8574b-lccn2" Apr 22 17:54:56.000553 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:56.000406 2564 scope.go:117] "RemoveContainer" containerID="cbb2226868dbd4d6b4d9b86a3951fbe98bf5881b8bdbd065b3976aa8fc9a0b82" Apr 22 17:54:56.000553 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:56.000385 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c49d8574b-lccn2" event={"ID":"62a92e11-84ca-495c-9e18-a2ab4073eb36","Type":"ContainerDied","Data":"e71515bba644af0410a41ade2a1a1d623d920095882280eef1d7a024e61c1cb3"} Apr 22 17:54:56.011957 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:56.011861 2564 scope.go:117] "RemoveContainer" containerID="cbb2226868dbd4d6b4d9b86a3951fbe98bf5881b8bdbd065b3976aa8fc9a0b82" Apr 22 17:54:56.012184 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:54:56.012085 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cbb2226868dbd4d6b4d9b86a3951fbe98bf5881b8bdbd065b3976aa8fc9a0b82\": container with ID starting with cbb2226868dbd4d6b4d9b86a3951fbe98bf5881b8bdbd065b3976aa8fc9a0b82 not found: ID does not exist" containerID="cbb2226868dbd4d6b4d9b86a3951fbe98bf5881b8bdbd065b3976aa8fc9a0b82" Apr 22 17:54:56.012184 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:56.012108 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb2226868dbd4d6b4d9b86a3951fbe98bf5881b8bdbd065b3976aa8fc9a0b82"} err="failed to get container status \"cbb2226868dbd4d6b4d9b86a3951fbe98bf5881b8bdbd065b3976aa8fc9a0b82\": rpc error: code = NotFound desc = could not find container \"cbb2226868dbd4d6b4d9b86a3951fbe98bf5881b8bdbd065b3976aa8fc9a0b82\": container with ID starting with cbb2226868dbd4d6b4d9b86a3951fbe98bf5881b8bdbd065b3976aa8fc9a0b82 not found: ID does not exist" Apr 22 17:54:56.022839 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:56.022817 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c49d8574b-lccn2"] Apr 22 17:54:56.027154 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:56.027133 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c49d8574b-lccn2"] Apr 22 17:54:56.553116 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:54:56.553086 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62a92e11-84ca-495c-9e18-a2ab4073eb36" path="/var/lib/kubelet/pods/62a92e11-84ca-495c-9e18-a2ab4073eb36/volumes" Apr 22 17:55:00.204919 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:00.204893 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vzgzr_3400960b-c044-44c8-b84c-550071e3f93e/serve-healthcheck-canary/0.log" Apr 22 17:55:19.864122 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:19.864087 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-546d7bd4c-db7vh"] Apr 22 17:55:19.865131 
ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:19.865098 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62a92e11-84ca-495c-9e18-a2ab4073eb36" containerName="console" Apr 22 17:55:19.865131 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:19.865125 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a92e11-84ca-495c-9e18-a2ab4073eb36" containerName="console" Apr 22 17:55:19.865277 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:19.865198 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="62a92e11-84ca-495c-9e18-a2ab4073eb36" containerName="console" Apr 22 17:55:19.869084 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:19.869068 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:19.875854 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:19.875820 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-546d7bd4c-db7vh"] Apr 22 17:55:19.919819 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:19.919779 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-oauth-serving-cert\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:19.919819 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:19.919821 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-config\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:19.920076 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:19.919838 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4jgp\" (UniqueName: \"kubernetes.io/projected/9e46bdae-2d72-4292-9cf1-aef182c7431e-kube-api-access-p4jgp\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:19.920076 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:19.919909 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-serving-cert\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:19.920076 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:19.919993 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-oauth-config\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:19.920076 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:19.920026 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-trusted-ca-bundle\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:19.920076 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:19.920053 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-service-ca\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " 
pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.020942 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.020904 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-oauth-serving-cert\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.020942 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.020941 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-config\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.021183 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.020959 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4jgp\" (UniqueName: \"kubernetes.io/projected/9e46bdae-2d72-4292-9cf1-aef182c7431e-kube-api-access-p4jgp\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.021183 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.020986 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-serving-cert\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.021183 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.021012 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-oauth-config\") pod 
\"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.021183 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.021031 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-trusted-ca-bundle\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.021183 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.021056 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-service-ca\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.021745 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.021720 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-oauth-serving-cert\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.021870 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.021833 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-config\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.021870 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.021849 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-service-ca\") 
pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.022106 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.022083 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-trusted-ca-bundle\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.023532 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.023511 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-oauth-config\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.023683 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.023650 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-serving-cert\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.033956 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.033933 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4jgp\" (UniqueName: \"kubernetes.io/projected/9e46bdae-2d72-4292-9cf1-aef182c7431e-kube-api-access-p4jgp\") pod \"console-546d7bd4c-db7vh\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") " pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.179160 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.179112 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:20.306358 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:20.306333 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-546d7bd4c-db7vh"] Apr 22 17:55:20.307960 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:55:20.307932 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e46bdae_2d72_4292_9cf1_aef182c7431e.slice/crio-7f62cee947512201ed001782520794ad98e43561aabf08d00fd21e4cf50655a8 WatchSource:0}: Error finding container 7f62cee947512201ed001782520794ad98e43561aabf08d00fd21e4cf50655a8: Status 404 returned error can't find the container with id 7f62cee947512201ed001782520794ad98e43561aabf08d00fd21e4cf50655a8 Apr 22 17:55:21.078338 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:21.078307 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546d7bd4c-db7vh" event={"ID":"9e46bdae-2d72-4292-9cf1-aef182c7431e","Type":"ContainerStarted","Data":"f5e0da52a1e31d54eafd9fbc6bcaa809b1844c64ddbcb1cd72695eb764f40cb3"} Apr 22 17:55:21.078338 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:21.078344 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546d7bd4c-db7vh" event={"ID":"9e46bdae-2d72-4292-9cf1-aef182c7431e","Type":"ContainerStarted","Data":"7f62cee947512201ed001782520794ad98e43561aabf08d00fd21e4cf50655a8"} Apr 22 17:55:21.094541 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:21.094499 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-546d7bd4c-db7vh" podStartSLOduration=2.094485809 podStartE2EDuration="2.094485809s" podCreationTimestamp="2026-04-22 17:55:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:55:21.094235169 +0000 UTC m=+141.153138311" 
watchObservedRunningTime="2026-04-22 17:55:21.094485809 +0000 UTC m=+141.153388950" Apr 22 17:55:30.180187 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:30.180156 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:30.180187 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:30.180192 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:30.184803 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:30.184781 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:31.109326 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:31.109298 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-546d7bd4c-db7vh" Apr 22 17:55:31.155341 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:31.155314 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76796c8dbd-jj5s9"] Apr 22 17:55:56.173400 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.173365 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76796c8dbd-jj5s9" podUID="32e6803c-9d24-4af6-ab54-1a1f7a9131f9" containerName="console" containerID="cri-o://dd534710acfb5cc3fc3525cc4ec764bd25728d589e937c5090425796fc89a43d" gracePeriod=15 Apr 22 17:55:56.409883 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.409863 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76796c8dbd-jj5s9_32e6803c-9d24-4af6-ab54-1a1f7a9131f9/console/0.log" Apr 22 17:55:56.409982 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.409926 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76796c8dbd-jj5s9" Apr 22 17:55:56.500823 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.500743 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-oauth-serving-cert\") pod \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " Apr 22 17:55:56.500823 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.500786 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-config\") pod \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " Apr 22 17:55:56.500823 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.500817 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-trusted-ca-bundle\") pod \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " Apr 22 17:55:56.501068 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.500841 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-service-ca\") pod \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " Apr 22 17:55:56.501068 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.500881 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-serving-cert\") pod \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " Apr 22 17:55:56.501068 ip-10-0-128-219 
kubenswrapper[2564]: I0422 17:55:56.500966 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wp5l\" (UniqueName: \"kubernetes.io/projected/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-kube-api-access-6wp5l\") pod \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " Apr 22 17:55:56.501068 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.500995 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-oauth-config\") pod \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\" (UID: \"32e6803c-9d24-4af6-ab54-1a1f7a9131f9\") " Apr 22 17:55:56.501265 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.501179 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "32e6803c-9d24-4af6-ab54-1a1f7a9131f9" (UID: "32e6803c-9d24-4af6-ab54-1a1f7a9131f9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:55:56.501265 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.501235 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-service-ca" (OuterVolumeSpecName: "service-ca") pod "32e6803c-9d24-4af6-ab54-1a1f7a9131f9" (UID: "32e6803c-9d24-4af6-ab54-1a1f7a9131f9"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:55:56.501344 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.501241 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-config" (OuterVolumeSpecName: "console-config") pod "32e6803c-9d24-4af6-ab54-1a1f7a9131f9" (UID: "32e6803c-9d24-4af6-ab54-1a1f7a9131f9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:55:56.501344 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.501261 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "32e6803c-9d24-4af6-ab54-1a1f7a9131f9" (UID: "32e6803c-9d24-4af6-ab54-1a1f7a9131f9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:55:56.504259 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.504223 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-kube-api-access-6wp5l" (OuterVolumeSpecName: "kube-api-access-6wp5l") pod "32e6803c-9d24-4af6-ab54-1a1f7a9131f9" (UID: "32e6803c-9d24-4af6-ab54-1a1f7a9131f9"). InnerVolumeSpecName "kube-api-access-6wp5l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:55:56.506975 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.504767 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "32e6803c-9d24-4af6-ab54-1a1f7a9131f9" (UID: "32e6803c-9d24-4af6-ab54-1a1f7a9131f9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:55:56.507102 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.507036 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "32e6803c-9d24-4af6-ab54-1a1f7a9131f9" (UID: "32e6803c-9d24-4af6-ab54-1a1f7a9131f9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:55:56.602310 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.602263 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-config\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:55:56.602310 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.602304 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-trusted-ca-bundle\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:55:56.602310 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.602315 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-service-ca\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:55:56.602310 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.602323 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-serving-cert\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:55:56.602551 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.602333 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6wp5l\" (UniqueName: \"kubernetes.io/projected/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-kube-api-access-6wp5l\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:55:56.602551 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.602343 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-console-oauth-config\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:55:56.602551 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:56.602352 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/32e6803c-9d24-4af6-ab54-1a1f7a9131f9-oauth-serving-cert\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:55:57.172931 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:57.172905 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76796c8dbd-jj5s9_32e6803c-9d24-4af6-ab54-1a1f7a9131f9/console/0.log"
Apr 22 17:55:57.173093 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:57.172945 2564 generic.go:358] "Generic (PLEG): container finished" podID="32e6803c-9d24-4af6-ab54-1a1f7a9131f9" containerID="dd534710acfb5cc3fc3525cc4ec764bd25728d589e937c5090425796fc89a43d" exitCode=2
Apr 22 17:55:57.173093 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:57.172983 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76796c8dbd-jj5s9" event={"ID":"32e6803c-9d24-4af6-ab54-1a1f7a9131f9","Type":"ContainerDied","Data":"dd534710acfb5cc3fc3525cc4ec764bd25728d589e937c5090425796fc89a43d"}
Apr 22 17:55:57.173093 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:57.173035 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76796c8dbd-jj5s9"
Apr 22 17:55:57.173093 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:57.173050 2564 scope.go:117] "RemoveContainer" containerID="dd534710acfb5cc3fc3525cc4ec764bd25728d589e937c5090425796fc89a43d"
Apr 22 17:55:57.173292 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:57.173037 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76796c8dbd-jj5s9" event={"ID":"32e6803c-9d24-4af6-ab54-1a1f7a9131f9","Type":"ContainerDied","Data":"a26f03f3697b9783e5156f9f5ef2997c05380e050bc673398011a908e6c226f3"}
Apr 22 17:55:57.180584 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:57.180546 2564 scope.go:117] "RemoveContainer" containerID="dd534710acfb5cc3fc3525cc4ec764bd25728d589e937c5090425796fc89a43d"
Apr 22 17:55:57.180989 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:55:57.180820 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd534710acfb5cc3fc3525cc4ec764bd25728d589e937c5090425796fc89a43d\": container with ID starting with dd534710acfb5cc3fc3525cc4ec764bd25728d589e937c5090425796fc89a43d not found: ID does not exist" containerID="dd534710acfb5cc3fc3525cc4ec764bd25728d589e937c5090425796fc89a43d"
Apr 22 17:55:57.180989 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:57.180852 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd534710acfb5cc3fc3525cc4ec764bd25728d589e937c5090425796fc89a43d"} err="failed to get container status \"dd534710acfb5cc3fc3525cc4ec764bd25728d589e937c5090425796fc89a43d\": rpc error: code = NotFound desc = could not find container \"dd534710acfb5cc3fc3525cc4ec764bd25728d589e937c5090425796fc89a43d\": container with ID starting with dd534710acfb5cc3fc3525cc4ec764bd25728d589e937c5090425796fc89a43d not found: ID does not exist"
Apr 22 17:55:57.189894 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:57.189873 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76796c8dbd-jj5s9"]
Apr 22 17:55:57.193318 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:57.193300 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76796c8dbd-jj5s9"]
Apr 22 17:55:58.553229 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:55:58.553193 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32e6803c-9d24-4af6-ab54-1a1f7a9131f9" path="/var/lib/kubelet/pods/32e6803c-9d24-4af6-ab54-1a1f7a9131f9/volumes"
Apr 22 17:56:41.879087 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:41.879056 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-798946cd94-gsdmx"]
Apr 22 17:56:41.879502 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:41.879293 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32e6803c-9d24-4af6-ab54-1a1f7a9131f9" containerName="console"
Apr 22 17:56:41.879502 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:41.879304 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="32e6803c-9d24-4af6-ab54-1a1f7a9131f9" containerName="console"
Apr 22 17:56:41.879502 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:41.879346 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="32e6803c-9d24-4af6-ab54-1a1f7a9131f9" containerName="console"
Apr 22 17:56:41.882016 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:41.881999 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:41.895049 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:41.895028 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-798946cd94-gsdmx"]
Apr 22 17:56:41.904080 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:41.904058 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-trusted-ca-bundle\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:41.904171 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:41.904086 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-config\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:41.904171 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:41.904104 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvh2j\" (UniqueName: \"kubernetes.io/projected/68e96de2-93ad-4e06-b521-b8fa82613ad1-kube-api-access-lvh2j\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:41.904171 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:41.904124 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-oauth-config\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:41.904282 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:41.904173 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-oauth-serving-cert\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:41.904282 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:41.904194 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-serving-cert\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:41.904282 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:41.904250 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-service-ca\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.004684 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.004641 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-service-ca\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.004797 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.004722 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-trusted-ca-bundle\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.004797 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.004741 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-config\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.004797 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.004758 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvh2j\" (UniqueName: \"kubernetes.io/projected/68e96de2-93ad-4e06-b521-b8fa82613ad1-kube-api-access-lvh2j\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.004797 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.004774 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-oauth-config\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.004797 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.004794 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-oauth-serving-cert\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.005058 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.004829 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-serving-cert\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.005401 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.005372 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-service-ca\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.005485 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.005429 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-oauth-serving-cert\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.005562 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.005537 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-config\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.005846 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.005825 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-trusted-ca-bundle\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.007301 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.007278 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-serving-cert\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.007396 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.007333 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-oauth-config\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.013220 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.013201 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvh2j\" (UniqueName: \"kubernetes.io/projected/68e96de2-93ad-4e06-b521-b8fa82613ad1-kube-api-access-lvh2j\") pod \"console-798946cd94-gsdmx\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.190757 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.190738 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:42.311204 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:42.311169 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-798946cd94-gsdmx"]
Apr 22 17:56:42.314063 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:56:42.314035 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68e96de2_93ad_4e06_b521_b8fa82613ad1.slice/crio-19141aabeab76a39ea2b5df96c1595ec4ba0d99550cb6a26916468fc2c97b785 WatchSource:0}: Error finding container 19141aabeab76a39ea2b5df96c1595ec4ba0d99550cb6a26916468fc2c97b785: Status 404 returned error can't find the container with id 19141aabeab76a39ea2b5df96c1595ec4ba0d99550cb6a26916468fc2c97b785
Apr 22 17:56:43.291721 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:43.291682 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-798946cd94-gsdmx" event={"ID":"68e96de2-93ad-4e06-b521-b8fa82613ad1","Type":"ContainerStarted","Data":"456f3a62f8a95bfbfed844926edf0a40511fb2b4a41c1d53702a9f3eb7d3dc67"}
Apr 22 17:56:43.291721 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:43.291726 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-798946cd94-gsdmx" event={"ID":"68e96de2-93ad-4e06-b521-b8fa82613ad1","Type":"ContainerStarted","Data":"19141aabeab76a39ea2b5df96c1595ec4ba0d99550cb6a26916468fc2c97b785"}
Apr 22 17:56:43.310044 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:43.309999 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-798946cd94-gsdmx" podStartSLOduration=2.3099859990000002 podStartE2EDuration="2.309985999s" podCreationTimestamp="2026-04-22 17:56:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:56:43.307915723 +0000 UTC m=+223.366818889" watchObservedRunningTime="2026-04-22 17:56:43.309985999 +0000 UTC m=+223.368889140"
Apr 22 17:56:52.191739 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:52.191696 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:52.191739 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:52.191744 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:52.196327 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:52.196304 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:52.318897 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:52.318869 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-798946cd94-gsdmx"
Apr 22 17:56:52.366720 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:56:52.366693 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-546d7bd4c-db7vh"]
Apr 22 17:57:17.385272 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.385198 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-546d7bd4c-db7vh" podUID="9e46bdae-2d72-4292-9cf1-aef182c7431e" containerName="console" containerID="cri-o://f5e0da52a1e31d54eafd9fbc6bcaa809b1844c64ddbcb1cd72695eb764f40cb3" gracePeriod=15
Apr 22 17:57:17.620053 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.620030 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-546d7bd4c-db7vh_9e46bdae-2d72-4292-9cf1-aef182c7431e/console/0.log"
Apr 22 17:57:17.620175 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.620099 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-546d7bd4c-db7vh"
Apr 22 17:57:17.757618 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.757594 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-oauth-config\") pod \"9e46bdae-2d72-4292-9cf1-aef182c7431e\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") "
Apr 22 17:57:17.757771 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.757633 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-trusted-ca-bundle\") pod \"9e46bdae-2d72-4292-9cf1-aef182c7431e\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") "
Apr 22 17:57:17.757771 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.757655 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-service-ca\") pod \"9e46bdae-2d72-4292-9cf1-aef182c7431e\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") "
Apr 22 17:57:17.757771 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.757708 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-serving-cert\") pod \"9e46bdae-2d72-4292-9cf1-aef182c7431e\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") "
Apr 22 17:57:17.757771 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.757746 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-oauth-serving-cert\") pod \"9e46bdae-2d72-4292-9cf1-aef182c7431e\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") "
Apr 22 17:57:17.757944 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.757849 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4jgp\" (UniqueName: \"kubernetes.io/projected/9e46bdae-2d72-4292-9cf1-aef182c7431e-kube-api-access-p4jgp\") pod \"9e46bdae-2d72-4292-9cf1-aef182c7431e\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") "
Apr 22 17:57:17.758424 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.758310 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9e46bdae-2d72-4292-9cf1-aef182c7431e" (UID: "9e46bdae-2d72-4292-9cf1-aef182c7431e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:57:17.758424 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.758338 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-config\") pod \"9e46bdae-2d72-4292-9cf1-aef182c7431e\" (UID: \"9e46bdae-2d72-4292-9cf1-aef182c7431e\") "
Apr 22 17:57:17.758424 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.758372 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9e46bdae-2d72-4292-9cf1-aef182c7431e" (UID: "9e46bdae-2d72-4292-9cf1-aef182c7431e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:57:17.758424 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.758403 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-service-ca" (OuterVolumeSpecName: "service-ca") pod "9e46bdae-2d72-4292-9cf1-aef182c7431e" (UID: "9e46bdae-2d72-4292-9cf1-aef182c7431e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:57:17.760271 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.758801 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-config" (OuterVolumeSpecName: "console-config") pod "9e46bdae-2d72-4292-9cf1-aef182c7431e" (UID: "9e46bdae-2d72-4292-9cf1-aef182c7431e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:57:17.760271 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.758864 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-service-ca\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:57:17.760271 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.758889 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-oauth-serving-cert\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:57:17.760271 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.758912 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-trusted-ca-bundle\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:57:17.764409 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.764381 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9e46bdae-2d72-4292-9cf1-aef182c7431e" (UID: "9e46bdae-2d72-4292-9cf1-aef182c7431e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:57:17.764409 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.764391 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e46bdae-2d72-4292-9cf1-aef182c7431e-kube-api-access-p4jgp" (OuterVolumeSpecName: "kube-api-access-p4jgp") pod "9e46bdae-2d72-4292-9cf1-aef182c7431e" (UID: "9e46bdae-2d72-4292-9cf1-aef182c7431e"). InnerVolumeSpecName "kube-api-access-p4jgp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:57:17.764532 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.764408 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9e46bdae-2d72-4292-9cf1-aef182c7431e" (UID: "9e46bdae-2d72-4292-9cf1-aef182c7431e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:57:17.859544 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.859523 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-oauth-config\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:57:17.859544 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.859544 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-serving-cert\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:57:17.859658 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.859554 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p4jgp\" (UniqueName: \"kubernetes.io/projected/9e46bdae-2d72-4292-9cf1-aef182c7431e-kube-api-access-p4jgp\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:57:17.859658 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:17.859564 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e46bdae-2d72-4292-9cf1-aef182c7431e-console-config\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:57:18.381770 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:18.381744 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-546d7bd4c-db7vh_9e46bdae-2d72-4292-9cf1-aef182c7431e/console/0.log"
Apr 22 17:57:18.381959 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:18.381783 2564 generic.go:358] "Generic (PLEG): container finished" podID="9e46bdae-2d72-4292-9cf1-aef182c7431e" containerID="f5e0da52a1e31d54eafd9fbc6bcaa809b1844c64ddbcb1cd72695eb764f40cb3" exitCode=2
Apr 22 17:57:18.381959 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:18.381821 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546d7bd4c-db7vh" event={"ID":"9e46bdae-2d72-4292-9cf1-aef182c7431e","Type":"ContainerDied","Data":"f5e0da52a1e31d54eafd9fbc6bcaa809b1844c64ddbcb1cd72695eb764f40cb3"}
Apr 22 17:57:18.381959 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:18.381869 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546d7bd4c-db7vh" event={"ID":"9e46bdae-2d72-4292-9cf1-aef182c7431e","Type":"ContainerDied","Data":"7f62cee947512201ed001782520794ad98e43561aabf08d00fd21e4cf50655a8"}
Apr 22 17:57:18.381959 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:18.381883 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-546d7bd4c-db7vh"
Apr 22 17:57:18.381959 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:18.381890 2564 scope.go:117] "RemoveContainer" containerID="f5e0da52a1e31d54eafd9fbc6bcaa809b1844c64ddbcb1cd72695eb764f40cb3"
Apr 22 17:57:18.389400 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:18.389028 2564 scope.go:117] "RemoveContainer" containerID="f5e0da52a1e31d54eafd9fbc6bcaa809b1844c64ddbcb1cd72695eb764f40cb3"
Apr 22 17:57:18.389400 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:57:18.389344 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5e0da52a1e31d54eafd9fbc6bcaa809b1844c64ddbcb1cd72695eb764f40cb3\": container with ID starting with f5e0da52a1e31d54eafd9fbc6bcaa809b1844c64ddbcb1cd72695eb764f40cb3 not found: ID does not exist" containerID="f5e0da52a1e31d54eafd9fbc6bcaa809b1844c64ddbcb1cd72695eb764f40cb3"
Apr 22 17:57:18.389400 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:18.389367 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5e0da52a1e31d54eafd9fbc6bcaa809b1844c64ddbcb1cd72695eb764f40cb3"} err="failed to get container status \"f5e0da52a1e31d54eafd9fbc6bcaa809b1844c64ddbcb1cd72695eb764f40cb3\": rpc error: code = NotFound desc = could not find container \"f5e0da52a1e31d54eafd9fbc6bcaa809b1844c64ddbcb1cd72695eb764f40cb3\": container with ID starting with f5e0da52a1e31d54eafd9fbc6bcaa809b1844c64ddbcb1cd72695eb764f40cb3 not found: ID does not exist"
Apr 22 17:57:18.401575 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:18.401552 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-546d7bd4c-db7vh"]
Apr 22 17:57:18.405168 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:18.405147 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-546d7bd4c-db7vh"]
Apr 22 17:57:18.552775 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:57:18.552751 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e46bdae-2d72-4292-9cf1-aef182c7431e" path="/var/lib/kubelet/pods/9e46bdae-2d72-4292-9cf1-aef182c7431e/volumes"
Apr 22 17:58:00.448944 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:00.448916 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log"
Apr 22 17:58:00.449479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:00.449266 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log"
Apr 22 17:58:00.451696 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:00.451661 2564 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 17:58:10.427730 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.427698 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x"]
Apr 22 17:58:10.430096 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.427945 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e46bdae-2d72-4292-9cf1-aef182c7431e" containerName="console"
Apr 22 17:58:10.430096 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.427955 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e46bdae-2d72-4292-9cf1-aef182c7431e" containerName="console"
Apr 22 17:58:10.430096 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.428005 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e46bdae-2d72-4292-9cf1-aef182c7431e" containerName="console"
Apr 22 17:58:10.431001 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.430987 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x"
Apr 22 17:58:10.433431 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.433411 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 17:58:10.433431 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.433411 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-fkw8k\""
Apr 22 17:58:10.434380 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.434364 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 17:58:10.438697 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.438654 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x"]
Apr 22 17:58:10.621949 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.621916 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c64j7\" (UniqueName: \"kubernetes.io/projected/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-kube-api-access-c64j7\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x\" (UID: \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x"
Apr 22 17:58:10.622100 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.621993 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x\" (UID: \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x"
Apr 22 17:58:10.622100 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.622020 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x\" (UID: \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x"
Apr 22 17:58:10.722512 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.722456 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c64j7\" (UniqueName: \"kubernetes.io/projected/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-kube-api-access-c64j7\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x\" (UID: \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x"
Apr 22 17:58:10.722512 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.722494 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x\" (UID: \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x"
Apr 22 17:58:10.722614 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.722513 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x\" (UID: \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x"
Apr 22 17:58:10.722869 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.722852 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x\" (UID: \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x"
Apr 22 17:58:10.722914 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.722886 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x\" (UID: \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x"
Apr 22 17:58:10.730691 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.730650 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c64j7\" (UniqueName: \"kubernetes.io/projected/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-kube-api-access-c64j7\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x\" (UID: \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x"
Apr 22
17:58:10.740447 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.740431 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x" Apr 22 17:58:10.853496 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.853450 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x"] Apr 22 17:58:10.856041 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:58:10.856016 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0ee96ae_4f15_4e20_b27a_3b108bd8d3dc.slice/crio-2c2216e36ea0834ca4f21d116b7eadc6c0cfaf95e931106021413e91875d974a WatchSource:0}: Error finding container 2c2216e36ea0834ca4f21d116b7eadc6c0cfaf95e931106021413e91875d974a: Status 404 returned error can't find the container with id 2c2216e36ea0834ca4f21d116b7eadc6c0cfaf95e931106021413e91875d974a Apr 22 17:58:10.857905 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:10.857890 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:58:11.516522 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:11.516485 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x" event={"ID":"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc","Type":"ContainerStarted","Data":"2c2216e36ea0834ca4f21d116b7eadc6c0cfaf95e931106021413e91875d974a"} Apr 22 17:58:16.535128 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:16.535098 2564 generic.go:358] "Generic (PLEG): container finished" podID="d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc" containerID="4bfb46bf6a38f57ececc13933eb37ed053b45d1b59a681f0fd43d925704ebf7a" exitCode=0 Apr 22 17:58:16.535587 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:16.535178 2564 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x" event={"ID":"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc","Type":"ContainerDied","Data":"4bfb46bf6a38f57ececc13933eb37ed053b45d1b59a681f0fd43d925704ebf7a"} Apr 22 17:58:18.542512 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:18.542438 2564 generic.go:358] "Generic (PLEG): container finished" podID="d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc" containerID="4c55ac2990aafead39a7d0d8b258aa6693e5ef5d01dd84c1843a4dbea16aca14" exitCode=0 Apr 22 17:58:18.542512 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:18.542496 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x" event={"ID":"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc","Type":"ContainerDied","Data":"4c55ac2990aafead39a7d0d8b258aa6693e5ef5d01dd84c1843a4dbea16aca14"} Apr 22 17:58:24.564575 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:24.564540 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x" event={"ID":"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc","Type":"ContainerStarted","Data":"429bbe43df771a0f9a95992e2050f196b4b1af5e6edbd69418c868b9d54c0912"} Apr 22 17:58:24.587475 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:24.587421 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x" podStartSLOduration=0.994212044 podStartE2EDuration="14.587405887s" podCreationTimestamp="2026-04-22 17:58:10 +0000 UTC" firstStartedPulling="2026-04-22 17:58:10.858009449 +0000 UTC m=+310.916912569" lastFinishedPulling="2026-04-22 17:58:24.451203284 +0000 UTC m=+324.510106412" observedRunningTime="2026-04-22 17:58:24.585184999 +0000 UTC m=+324.644088142" watchObservedRunningTime="2026-04-22 17:58:24.587405887 +0000 UTC m=+324.646309029" Apr 22 17:58:25.568775 
ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:25.568741 2564 generic.go:358] "Generic (PLEG): container finished" podID="d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc" containerID="429bbe43df771a0f9a95992e2050f196b4b1af5e6edbd69418c868b9d54c0912" exitCode=0 Apr 22 17:58:25.569144 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:25.568810 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x" event={"ID":"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc","Type":"ContainerDied","Data":"429bbe43df771a0f9a95992e2050f196b4b1af5e6edbd69418c868b9d54c0912"} Apr 22 17:58:26.686028 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:26.686007 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x" Apr 22 17:58:26.743300 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:26.743275 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-util\") pod \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\" (UID: \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\") " Apr 22 17:58:26.743426 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:26.743306 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c64j7\" (UniqueName: \"kubernetes.io/projected/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-kube-api-access-c64j7\") pod \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\" (UID: \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\") " Apr 22 17:58:26.743426 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:26.743338 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-bundle\") pod \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\" (UID: \"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc\") " Apr 22 
17:58:26.743918 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:26.743893 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-bundle" (OuterVolumeSpecName: "bundle") pod "d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc" (UID: "d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:58:26.745340 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:26.745317 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-kube-api-access-c64j7" (OuterVolumeSpecName: "kube-api-access-c64j7") pod "d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc" (UID: "d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc"). InnerVolumeSpecName "kube-api-access-c64j7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:58:26.748479 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:26.748429 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-util" (OuterVolumeSpecName: "util") pod "d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc" (UID: "d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:58:26.844561 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:26.844507 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-bundle\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:58:26.844561 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:26.844526 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-util\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:58:26.844561 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:26.844536 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c64j7\" (UniqueName: \"kubernetes.io/projected/d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc-kube-api-access-c64j7\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:58:27.575451 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:27.575420 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x" event={"ID":"d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc","Type":"ContainerDied","Data":"2c2216e36ea0834ca4f21d116b7eadc6c0cfaf95e931106021413e91875d974a"} Apr 22 17:58:27.575451 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:27.575451 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c2216e36ea0834ca4f21d116b7eadc6c0cfaf95e931106021413e91875d974a" Apr 22 17:58:27.575451 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:27.575452 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctxw2x" Apr 22 17:58:32.081414 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.081382 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x"] Apr 22 17:58:32.081809 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.081658 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc" containerName="extract" Apr 22 17:58:32.081809 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.081681 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc" containerName="extract" Apr 22 17:58:32.081809 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.081691 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc" containerName="pull" Apr 22 17:58:32.081809 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.081698 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc" containerName="pull" Apr 22 17:58:32.081809 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.081707 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc" containerName="util" Apr 22 17:58:32.081809 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.081712 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc" containerName="util" Apr 22 17:58:32.081809 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.081754 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0ee96ae-4f15-4e20-b27a-3b108bd8d3dc" containerName="extract" Apr 22 17:58:32.128994 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.128966 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x"] Apr 22 17:58:32.129134 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.129057 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x" Apr 22 17:58:32.131611 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.131589 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 17:58:32.131728 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.131716 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 17:58:32.131896 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.131880 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 17:58:32.131961 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.131914 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-hsttw\"" Apr 22 17:58:32.179931 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.179908 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bm7l\" (UniqueName: \"kubernetes.io/projected/5b456bcb-20b1-467e-98b2-2cc1b28ff0a9-kube-api-access-9bm7l\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x\" (UID: \"5b456bcb-20b1-467e-98b2-2cc1b28ff0a9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x" Apr 22 17:58:32.180018 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.179956 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/5b456bcb-20b1-467e-98b2-2cc1b28ff0a9-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x\" (UID: 
\"5b456bcb-20b1-467e-98b2-2cc1b28ff0a9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x" Apr 22 17:58:32.280821 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.280797 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bm7l\" (UniqueName: \"kubernetes.io/projected/5b456bcb-20b1-467e-98b2-2cc1b28ff0a9-kube-api-access-9bm7l\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x\" (UID: \"5b456bcb-20b1-467e-98b2-2cc1b28ff0a9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x" Apr 22 17:58:32.280988 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.280840 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/5b456bcb-20b1-467e-98b2-2cc1b28ff0a9-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x\" (UID: \"5b456bcb-20b1-467e-98b2-2cc1b28ff0a9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x" Apr 22 17:58:32.283078 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.283049 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/5b456bcb-20b1-467e-98b2-2cc1b28ff0a9-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x\" (UID: \"5b456bcb-20b1-467e-98b2-2cc1b28ff0a9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x" Apr 22 17:58:32.293969 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.293948 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bm7l\" (UniqueName: \"kubernetes.io/projected/5b456bcb-20b1-467e-98b2-2cc1b28ff0a9-kube-api-access-9bm7l\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x\" (UID: \"5b456bcb-20b1-467e-98b2-2cc1b28ff0a9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x" Apr 22 17:58:32.438607 ip-10-0-128-219 
kubenswrapper[2564]: I0422 17:58:32.438583 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x" Apr 22 17:58:32.558421 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.558394 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x"] Apr 22 17:58:32.560589 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:58:32.560562 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b456bcb_20b1_467e_98b2_2cc1b28ff0a9.slice/crio-ccc63fcd7720b637b9182eb2d5363c2788fa278ab5aa5a2d2e2f95fa2ea119e5 WatchSource:0}: Error finding container ccc63fcd7720b637b9182eb2d5363c2788fa278ab5aa5a2d2e2f95fa2ea119e5: Status 404 returned error can't find the container with id ccc63fcd7720b637b9182eb2d5363c2788fa278ab5aa5a2d2e2f95fa2ea119e5 Apr 22 17:58:32.587926 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:32.587901 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x" event={"ID":"5b456bcb-20b1-467e-98b2-2cc1b28ff0a9","Type":"ContainerStarted","Data":"ccc63fcd7720b637b9182eb2d5363c2788fa278ab5aa5a2d2e2f95fa2ea119e5"} Apr 22 17:58:37.336077 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.336050 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-mfnnc"] Apr 22 17:58:37.360103 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.360072 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-mfnnc"] Apr 22 17:58:37.360249 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.360116 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-mfnnc" Apr 22 17:58:37.362424 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.362391 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 22 17:58:37.362546 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.362391 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 17:58:37.362546 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.362489 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-pwq45\"" Apr 22 17:58:37.421097 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.421066 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b415c8f9-0d69-4159-8276-8499c2616ced-cabundle0\") pod \"keda-operator-ffbb595cb-mfnnc\" (UID: \"b415c8f9-0d69-4159-8276-8499c2616ced\") " pod="openshift-keda/keda-operator-ffbb595cb-mfnnc" Apr 22 17:58:37.421217 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.421118 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates\") pod \"keda-operator-ffbb595cb-mfnnc\" (UID: \"b415c8f9-0d69-4159-8276-8499c2616ced\") " pod="openshift-keda/keda-operator-ffbb595cb-mfnnc" Apr 22 17:58:37.421217 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.421191 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swlh9\" (UniqueName: \"kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-kube-api-access-swlh9\") pod \"keda-operator-ffbb595cb-mfnnc\" (UID: \"b415c8f9-0d69-4159-8276-8499c2616ced\") " 
pod="openshift-keda/keda-operator-ffbb595cb-mfnnc" Apr 22 17:58:37.521582 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.521545 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b415c8f9-0d69-4159-8276-8499c2616ced-cabundle0\") pod \"keda-operator-ffbb595cb-mfnnc\" (UID: \"b415c8f9-0d69-4159-8276-8499c2616ced\") " pod="openshift-keda/keda-operator-ffbb595cb-mfnnc" Apr 22 17:58:37.521758 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.521590 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates\") pod \"keda-operator-ffbb595cb-mfnnc\" (UID: \"b415c8f9-0d69-4159-8276-8499c2616ced\") " pod="openshift-keda/keda-operator-ffbb595cb-mfnnc" Apr 22 17:58:37.521758 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.521611 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swlh9\" (UniqueName: \"kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-kube-api-access-swlh9\") pod \"keda-operator-ffbb595cb-mfnnc\" (UID: \"b415c8f9-0d69-4159-8276-8499c2616ced\") " pod="openshift-keda/keda-operator-ffbb595cb-mfnnc" Apr 22 17:58:37.521758 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:37.521724 2564 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 22 17:58:37.521758 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:37.521747 2564 secret.go:281] references non-existent secret key: ca.crt Apr 22 17:58:37.521758 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:37.521757 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 17:58:37.521990 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:37.521790 2564 projected.go:194] Error preparing data for projected volume 
certificates for pod openshift-keda/keda-operator-ffbb595cb-mfnnc: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 22 17:58:37.521990 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:37.521854 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates podName:b415c8f9-0d69-4159-8276-8499c2616ced nodeName:}" failed. No retries permitted until 2026-04-22 17:58:38.021833432 +0000 UTC m=+338.080736556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates") pod "keda-operator-ffbb595cb-mfnnc" (UID: "b415c8f9-0d69-4159-8276-8499c2616ced") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 22 17:58:37.522245 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.522228 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b415c8f9-0d69-4159-8276-8499c2616ced-cabundle0\") pod \"keda-operator-ffbb595cb-mfnnc\" (UID: \"b415c8f9-0d69-4159-8276-8499c2616ced\") " pod="openshift-keda/keda-operator-ffbb595cb-mfnnc" Apr 22 17:58:37.532519 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.532494 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swlh9\" (UniqueName: \"kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-kube-api-access-swlh9\") pod \"keda-operator-ffbb595cb-mfnnc\" (UID: \"b415c8f9-0d69-4159-8276-8499c2616ced\") " pod="openshift-keda/keda-operator-ffbb595cb-mfnnc" Apr 22 17:58:37.600956 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.600885 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"] Apr 22 17:58:37.632975 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.632946 2564 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x" event={"ID":"5b456bcb-20b1-467e-98b2-2cc1b28ff0a9","Type":"ContainerStarted","Data":"2c247451c9842e3a02a0abdde31abf7ffc962ae9385abbaeb835b0c4b1824eae"} Apr 22 17:58:37.632975 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.632976 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"] Apr 22 17:58:37.633194 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.633058 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb" Apr 22 17:58:37.633194 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.633159 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x" Apr 22 17:58:37.635432 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.635410 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 17:58:37.653533 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.653475 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x" podStartSLOduration=1.447559267 podStartE2EDuration="5.653462779s" podCreationTimestamp="2026-04-22 17:58:32 +0000 UTC" firstStartedPulling="2026-04-22 17:58:32.56279048 +0000 UTC m=+332.621693600" lastFinishedPulling="2026-04-22 17:58:36.768693992 +0000 UTC m=+336.827597112" observedRunningTime="2026-04-22 17:58:37.652503633 +0000 UTC m=+337.711406778" watchObservedRunningTime="2026-04-22 17:58:37.653462779 +0000 UTC m=+337.712365924" Apr 22 17:58:37.724337 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.724312 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-certificates\") pod \"keda-metrics-apiserver-7c9f485588-56bkb\" (UID: \"f97b4557-9923-4ed3-a61c-419b48f0a237\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:37.724442 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.724366 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zvgz\" (UniqueName: \"kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-kube-api-access-8zvgz\") pod \"keda-metrics-apiserver-7c9f485588-56bkb\" (UID: \"f97b4557-9923-4ed3-a61c-419b48f0a237\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:37.724442 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.724411 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/f97b4557-9923-4ed3-a61c-419b48f0a237-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-56bkb\" (UID: \"f97b4557-9923-4ed3-a61c-419b48f0a237\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:37.802136 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.802107 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-sgx6l"]
Apr 22 17:58:37.825796 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.825775 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-sgx6l"]
Apr 22 17:58:37.825930 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.825874 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-certificates\") pod \"keda-metrics-apiserver-7c9f485588-56bkb\" (UID: \"f97b4557-9923-4ed3-a61c-419b48f0a237\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:37.825930 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.825909 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zvgz\" (UniqueName: \"kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-kube-api-access-8zvgz\") pod \"keda-metrics-apiserver-7c9f485588-56bkb\" (UID: \"f97b4557-9923-4ed3-a61c-419b48f0a237\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:37.826038 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.825935 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/f97b4557-9923-4ed3-a61c-419b48f0a237-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-56bkb\" (UID: \"f97b4557-9923-4ed3-a61c-419b48f0a237\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:37.826038 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:37.825951 2564 secret.go:281] references non-existent secret key: tls.crt
Apr 22 17:58:37.826038 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:37.825967 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 17:58:37.826038 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:37.825986 2564 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 22 17:58:37.826038 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:37.826007 2564 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 22 17:58:37.826276 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.825879 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-sgx6l"
Apr 22 17:58:37.826276 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:37.826060 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-certificates podName:f97b4557-9923-4ed3-a61c-419b48f0a237 nodeName:}" failed. No retries permitted until 2026-04-22 17:58:38.32604458 +0000 UTC m=+338.384947704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-certificates") pod "keda-metrics-apiserver-7c9f485588-56bkb" (UID: "f97b4557-9923-4ed3-a61c-419b48f0a237") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 22 17:58:37.826389 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.826276 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/f97b4557-9923-4ed3-a61c-419b48f0a237-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-56bkb\" (UID: \"f97b4557-9923-4ed3-a61c-419b48f0a237\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:37.828417 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.828399 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 22 17:58:37.834747 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.834725 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zvgz\" (UniqueName: \"kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-kube-api-access-8zvgz\") pod \"keda-metrics-apiserver-7c9f485588-56bkb\" (UID: \"f97b4557-9923-4ed3-a61c-419b48f0a237\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:37.926760 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.926740 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8542a673-38e8-4e19-98f6-2fa1fa63cbdb-certificates\") pod \"keda-admission-cf49989db-sgx6l\" (UID: \"8542a673-38e8-4e19-98f6-2fa1fa63cbdb\") " pod="openshift-keda/keda-admission-cf49989db-sgx6l"
Apr 22 17:58:37.926870 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:37.926805 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcpvr\" (UniqueName: \"kubernetes.io/projected/8542a673-38e8-4e19-98f6-2fa1fa63cbdb-kube-api-access-qcpvr\") pod \"keda-admission-cf49989db-sgx6l\" (UID: \"8542a673-38e8-4e19-98f6-2fa1fa63cbdb\") " pod="openshift-keda/keda-admission-cf49989db-sgx6l"
Apr 22 17:58:38.028266 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:38.027692 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8542a673-38e8-4e19-98f6-2fa1fa63cbdb-certificates\") pod \"keda-admission-cf49989db-sgx6l\" (UID: \"8542a673-38e8-4e19-98f6-2fa1fa63cbdb\") " pod="openshift-keda/keda-admission-cf49989db-sgx6l"
Apr 22 17:58:38.028266 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:38.027781 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates\") pod \"keda-operator-ffbb595cb-mfnnc\" (UID: \"b415c8f9-0d69-4159-8276-8499c2616ced\") " pod="openshift-keda/keda-operator-ffbb595cb-mfnnc"
Apr 22 17:58:38.028266 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:38.027834 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcpvr\" (UniqueName: \"kubernetes.io/projected/8542a673-38e8-4e19-98f6-2fa1fa63cbdb-kube-api-access-qcpvr\") pod \"keda-admission-cf49989db-sgx6l\" (UID: \"8542a673-38e8-4e19-98f6-2fa1fa63cbdb\") " pod="openshift-keda/keda-admission-cf49989db-sgx6l"
Apr 22 17:58:38.028563 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:38.028375 2564 secret.go:281] references non-existent secret key: ca.crt
Apr 22 17:58:38.028563 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:38.028398 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 17:58:38.028563 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:38.028409 2564 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-mfnnc: references non-existent secret key: ca.crt
Apr 22 17:58:38.028563 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:38.028470 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates podName:b415c8f9-0d69-4159-8276-8499c2616ced nodeName:}" failed. No retries permitted until 2026-04-22 17:58:39.028450655 +0000 UTC m=+339.087353780 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates") pod "keda-operator-ffbb595cb-mfnnc" (UID: "b415c8f9-0d69-4159-8276-8499c2616ced") : references non-existent secret key: ca.crt
Apr 22 17:58:38.031153 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:38.031132 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8542a673-38e8-4e19-98f6-2fa1fa63cbdb-certificates\") pod \"keda-admission-cf49989db-sgx6l\" (UID: \"8542a673-38e8-4e19-98f6-2fa1fa63cbdb\") " pod="openshift-keda/keda-admission-cf49989db-sgx6l"
Apr 22 17:58:38.036864 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:38.036837 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcpvr\" (UniqueName: \"kubernetes.io/projected/8542a673-38e8-4e19-98f6-2fa1fa63cbdb-kube-api-access-qcpvr\") pod \"keda-admission-cf49989db-sgx6l\" (UID: \"8542a673-38e8-4e19-98f6-2fa1fa63cbdb\") " pod="openshift-keda/keda-admission-cf49989db-sgx6l"
Apr 22 17:58:38.137924 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:38.137882 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-sgx6l"
Apr 22 17:58:38.277627 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:38.277593 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-sgx6l"]
Apr 22 17:58:38.281398 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:58:38.281374 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8542a673_38e8_4e19_98f6_2fa1fa63cbdb.slice/crio-c87f0f33e558bc9b4ae99da0d0640151746d576fa1f0cf6b22f9ad2ef914a61f WatchSource:0}: Error finding container c87f0f33e558bc9b4ae99da0d0640151746d576fa1f0cf6b22f9ad2ef914a61f: Status 404 returned error can't find the container with id c87f0f33e558bc9b4ae99da0d0640151746d576fa1f0cf6b22f9ad2ef914a61f
Apr 22 17:58:38.330720 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:38.330689 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-certificates\") pod \"keda-metrics-apiserver-7c9f485588-56bkb\" (UID: \"f97b4557-9923-4ed3-a61c-419b48f0a237\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:38.330864 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:38.330765 2564 secret.go:281] references non-existent secret key: tls.crt
Apr 22 17:58:38.330864 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:38.330783 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 17:58:38.330864 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:38.330799 2564 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb: references non-existent secret key: tls.crt
Apr 22 17:58:38.330864 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:38.330854 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-certificates podName:f97b4557-9923-4ed3-a61c-419b48f0a237 nodeName:}" failed. No retries permitted until 2026-04-22 17:58:39.330834373 +0000 UTC m=+339.389737495 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-certificates") pod "keda-metrics-apiserver-7c9f485588-56bkb" (UID: "f97b4557-9923-4ed3-a61c-419b48f0a237") : references non-existent secret key: tls.crt
Apr 22 17:58:38.606716 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:38.606638 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-sgx6l" event={"ID":"8542a673-38e8-4e19-98f6-2fa1fa63cbdb","Type":"ContainerStarted","Data":"c87f0f33e558bc9b4ae99da0d0640151746d576fa1f0cf6b22f9ad2ef914a61f"}
Apr 22 17:58:39.036101 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:39.036065 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates\") pod \"keda-operator-ffbb595cb-mfnnc\" (UID: \"b415c8f9-0d69-4159-8276-8499c2616ced\") " pod="openshift-keda/keda-operator-ffbb595cb-mfnnc"
Apr 22 17:58:39.036291 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:39.036196 2564 secret.go:281] references non-existent secret key: ca.crt
Apr 22 17:58:39.036291 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:39.036209 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 17:58:39.036291 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:39.036219 2564 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-mfnnc: references non-existent secret key: ca.crt
Apr 22 17:58:39.036291 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:39.036286 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates podName:b415c8f9-0d69-4159-8276-8499c2616ced nodeName:}" failed. No retries permitted until 2026-04-22 17:58:41.036258418 +0000 UTC m=+341.095161542 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates") pod "keda-operator-ffbb595cb-mfnnc" (UID: "b415c8f9-0d69-4159-8276-8499c2616ced") : references non-existent secret key: ca.crt
Apr 22 17:58:39.338890 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:39.338801 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-certificates\") pod \"keda-metrics-apiserver-7c9f485588-56bkb\" (UID: \"f97b4557-9923-4ed3-a61c-419b48f0a237\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:39.339053 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:39.338913 2564 secret.go:281] references non-existent secret key: tls.crt
Apr 22 17:58:39.339053 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:39.338924 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 17:58:39.339053 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:39.338941 2564 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb: references non-existent secret key: tls.crt
Apr 22 17:58:39.339053 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:39.338985 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-certificates podName:f97b4557-9923-4ed3-a61c-419b48f0a237 nodeName:}" failed. No retries permitted until 2026-04-22 17:58:41.338972096 +0000 UTC m=+341.397875217 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-certificates") pod "keda-metrics-apiserver-7c9f485588-56bkb" (UID: "f97b4557-9923-4ed3-a61c-419b48f0a237") : references non-existent secret key: tls.crt
Apr 22 17:58:40.614492 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:40.614414 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-sgx6l" event={"ID":"8542a673-38e8-4e19-98f6-2fa1fa63cbdb","Type":"ContainerStarted","Data":"fe4e319e5772a38c6a8bcd71b26c8b0424b02a1be23fc807da6446d8e64c8fc9"}
Apr 22 17:58:40.614845 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:40.614573 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-sgx6l"
Apr 22 17:58:40.630103 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:40.630063 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-sgx6l" podStartSLOduration=1.628664341 podStartE2EDuration="3.630049558s" podCreationTimestamp="2026-04-22 17:58:37 +0000 UTC" firstStartedPulling="2026-04-22 17:58:38.282749234 +0000 UTC m=+338.341652365" lastFinishedPulling="2026-04-22 17:58:40.284134447 +0000 UTC m=+340.343037582" observedRunningTime="2026-04-22 17:58:40.628909493 +0000 UTC m=+340.687812657" watchObservedRunningTime="2026-04-22 17:58:40.630049558 +0000 UTC m=+340.688952700"
Apr 22 17:58:41.053829 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:41.053796 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates\") pod \"keda-operator-ffbb595cb-mfnnc\" (UID: \"b415c8f9-0d69-4159-8276-8499c2616ced\") " pod="openshift-keda/keda-operator-ffbb595cb-mfnnc"
Apr 22 17:58:41.054021 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:41.053953 2564 secret.go:281] references non-existent secret key: ca.crt
Apr 22 17:58:41.054021 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:41.053974 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 17:58:41.054021 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:41.053985 2564 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-mfnnc: references non-existent secret key: ca.crt
Apr 22 17:58:41.054179 ip-10-0-128-219 kubenswrapper[2564]: E0422 17:58:41.054050 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates podName:b415c8f9-0d69-4159-8276-8499c2616ced nodeName:}" failed. No retries permitted until 2026-04-22 17:58:45.054031203 +0000 UTC m=+345.112934343 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates") pod "keda-operator-ffbb595cb-mfnnc" (UID: "b415c8f9-0d69-4159-8276-8499c2616ced") : references non-existent secret key: ca.crt
Apr 22 17:58:41.355849 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:41.355759 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-certificates\") pod \"keda-metrics-apiserver-7c9f485588-56bkb\" (UID: \"f97b4557-9923-4ed3-a61c-419b48f0a237\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:41.358300 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:41.358272 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f97b4557-9923-4ed3-a61c-419b48f0a237-certificates\") pod \"keda-metrics-apiserver-7c9f485588-56bkb\" (UID: \"f97b4557-9923-4ed3-a61c-419b48f0a237\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:41.546171 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:41.546137 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:41.664582 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:41.664555 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"]
Apr 22 17:58:41.668073 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:58:41.668049 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf97b4557_9923_4ed3_a61c_419b48f0a237.slice/crio-524702569c79e9e813bbfcf8d199747505e402c73caed40dfdb3fc28f734971f WatchSource:0}: Error finding container 524702569c79e9e813bbfcf8d199747505e402c73caed40dfdb3fc28f734971f: Status 404 returned error can't find the container with id 524702569c79e9e813bbfcf8d199747505e402c73caed40dfdb3fc28f734971f
Apr 22 17:58:42.622195 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:42.622159 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb" event={"ID":"f97b4557-9923-4ed3-a61c-419b48f0a237","Type":"ContainerStarted","Data":"524702569c79e9e813bbfcf8d199747505e402c73caed40dfdb3fc28f734971f"}
Apr 22 17:58:44.629262 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:44.629232 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb" event={"ID":"f97b4557-9923-4ed3-a61c-419b48f0a237","Type":"ContainerStarted","Data":"8b0ceeb5592027c58d4016ec6eeeb1fcdfbba67c7654a2def7a8633021bd4ab1"}
Apr 22 17:58:44.629652 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:44.629370 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:44.655464 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:44.655421 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb" podStartSLOduration=5.219586707 podStartE2EDuration="7.655406973s" podCreationTimestamp="2026-04-22 17:58:37 +0000 UTC" firstStartedPulling="2026-04-22 17:58:41.669808633 +0000 UTC m=+341.728711752" lastFinishedPulling="2026-04-22 17:58:44.105628884 +0000 UTC m=+344.164532018" observedRunningTime="2026-04-22 17:58:44.654333172 +0000 UTC m=+344.713236324" watchObservedRunningTime="2026-04-22 17:58:44.655406973 +0000 UTC m=+344.714310114"
Apr 22 17:58:45.083730 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:45.083696 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates\") pod \"keda-operator-ffbb595cb-mfnnc\" (UID: \"b415c8f9-0d69-4159-8276-8499c2616ced\") " pod="openshift-keda/keda-operator-ffbb595cb-mfnnc"
Apr 22 17:58:45.086006 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:45.085986 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b415c8f9-0d69-4159-8276-8499c2616ced-certificates\") pod \"keda-operator-ffbb595cb-mfnnc\" (UID: \"b415c8f9-0d69-4159-8276-8499c2616ced\") " pod="openshift-keda/keda-operator-ffbb595cb-mfnnc"
Apr 22 17:58:45.171932 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:45.171900 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-mfnnc"
Apr 22 17:58:45.289723 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:45.289692 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-mfnnc"]
Apr 22 17:58:45.291636 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:58:45.291606 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb415c8f9_0d69_4159_8276_8499c2616ced.slice/crio-1502109530def407a2d2515cb76f529450e4d28ea32caa9cf04e3f32e8ac5475 WatchSource:0}: Error finding container 1502109530def407a2d2515cb76f529450e4d28ea32caa9cf04e3f32e8ac5475: Status 404 returned error can't find the container with id 1502109530def407a2d2515cb76f529450e4d28ea32caa9cf04e3f32e8ac5475
Apr 22 17:58:45.632817 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:45.632785 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-mfnnc" event={"ID":"b415c8f9-0d69-4159-8276-8499c2616ced","Type":"ContainerStarted","Data":"1502109530def407a2d2515cb76f529450e4d28ea32caa9cf04e3f32e8ac5475"}
Apr 22 17:58:48.643960 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:48.643918 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-mfnnc" event={"ID":"b415c8f9-0d69-4159-8276-8499c2616ced","Type":"ContainerStarted","Data":"7812d35717bfc24b8744f64b833df9656865b397f70689840d749bdd8c9d66f7"}
Apr 22 17:58:48.644397 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:48.644041 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-mfnnc"
Apr 22 17:58:48.660678 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:48.660630 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-mfnnc" podStartSLOduration=8.494416096 podStartE2EDuration="11.660616515s" podCreationTimestamp="2026-04-22 17:58:37 +0000 UTC" firstStartedPulling="2026-04-22 17:58:45.292917541 +0000 UTC m=+345.351820660" lastFinishedPulling="2026-04-22 17:58:48.459117959 +0000 UTC m=+348.518021079" observedRunningTime="2026-04-22 17:58:48.659293932 +0000 UTC m=+348.718197073" watchObservedRunningTime="2026-04-22 17:58:48.660616515 +0000 UTC m=+348.719519657"
Apr 22 17:58:55.637492 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:55.637467 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-56bkb"
Apr 22 17:58:58.608764 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:58:58.608734 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4lk5x"
Apr 22 17:59:01.620074 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:01.620039 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-sgx6l"
Apr 22 17:59:09.649275 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:09.649247 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-mfnnc"
Apr 22 17:59:30.287210 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.287172 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"]
Apr 22 17:59:30.295593 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.295571 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"
Apr 22 17:59:30.298373 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.298348 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-fkw8k\""
Apr 22 17:59:30.298503 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.298361 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 17:59:30.298576 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.298530 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 17:59:30.299528 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.299504 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"]
Apr 22 17:59:30.321891 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.321864 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbzss\" (UniqueName: \"kubernetes.io/projected/d13fe8f3-a4ae-4374-9c7a-282fd598c025-kube-api-access-fbzss\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7\" (UID: \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"
Apr 22 17:59:30.322007 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.321919 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d13fe8f3-a4ae-4374-9c7a-282fd598c025-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7\" (UID: \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"
Apr 22 17:59:30.322007 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.321956 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d13fe8f3-a4ae-4374-9c7a-282fd598c025-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7\" (UID: \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"
Apr 22 17:59:30.422582 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.422545 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d13fe8f3-a4ae-4374-9c7a-282fd598c025-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7\" (UID: \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"
Apr 22 17:59:30.422582 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.422582 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d13fe8f3-a4ae-4374-9c7a-282fd598c025-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7\" (UID: \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"
Apr 22 17:59:30.422787 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.422633 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbzss\" (UniqueName: \"kubernetes.io/projected/d13fe8f3-a4ae-4374-9c7a-282fd598c025-kube-api-access-fbzss\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7\" (UID: \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"
Apr 22 17:59:30.422945 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.422925 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d13fe8f3-a4ae-4374-9c7a-282fd598c025-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7\" (UID: \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"
Apr 22 17:59:30.423011 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.422992 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d13fe8f3-a4ae-4374-9c7a-282fd598c025-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7\" (UID: \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"
Apr 22 17:59:30.432350 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.432332 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbzss\" (UniqueName: \"kubernetes.io/projected/d13fe8f3-a4ae-4374-9c7a-282fd598c025-kube-api-access-fbzss\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7\" (UID: \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"
Apr 22 17:59:30.605401 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.605320 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"
Apr 22 17:59:30.722601 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.722578 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"]
Apr 22 17:59:30.724869 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:59:30.724843 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd13fe8f3_a4ae_4374_9c7a_282fd598c025.slice/crio-3b7cef4a02643c402898dbb7019c99bf8832fe537c375d469692e30beb606984 WatchSource:0}: Error finding container 3b7cef4a02643c402898dbb7019c99bf8832fe537c375d469692e30beb606984: Status 404 returned error can't find the container with id 3b7cef4a02643c402898dbb7019c99bf8832fe537c375d469692e30beb606984
Apr 22 17:59:30.766239 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:30.766210 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7" event={"ID":"d13fe8f3-a4ae-4374-9c7a-282fd598c025","Type":"ContainerStarted","Data":"3b7cef4a02643c402898dbb7019c99bf8832fe537c375d469692e30beb606984"}
Apr 22 17:59:31.770131 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:31.770053 2564 generic.go:358] "Generic (PLEG): container finished" podID="d13fe8f3-a4ae-4374-9c7a-282fd598c025" containerID="a64cbc087098e9aebfa394bd9e9c291899c22f62be220bd74046f0fbb66e517b" exitCode=0
Apr 22 17:59:31.770131 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:31.770108 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7" event={"ID":"d13fe8f3-a4ae-4374-9c7a-282fd598c025","Type":"ContainerDied","Data":"a64cbc087098e9aebfa394bd9e9c291899c22f62be220bd74046f0fbb66e517b"}
Apr 22 17:59:36.788448 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:36.788364 2564 generic.go:358] "Generic (PLEG): container finished" podID="d13fe8f3-a4ae-4374-9c7a-282fd598c025" containerID="f9e99e94ba2e79bf53e90d38955b2df39613f22130e4a5d625fc1774db48ceeb" exitCode=0
Apr 22 17:59:36.788448 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:36.788417 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7" event={"ID":"d13fe8f3-a4ae-4374-9c7a-282fd598c025","Type":"ContainerDied","Data":"f9e99e94ba2e79bf53e90d38955b2df39613f22130e4a5d625fc1774db48ceeb"}
Apr 22 17:59:37.793527 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:37.793489 2564 generic.go:358] "Generic (PLEG): container finished" podID="d13fe8f3-a4ae-4374-9c7a-282fd598c025" containerID="8506efce89185a242442db8268e7f2b922950697e0d776972ea62fbafdbafde7" exitCode=0
Apr 22 17:59:37.793527 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:37.793528 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7" event={"ID":"d13fe8f3-a4ae-4374-9c7a-282fd598c025","Type":"ContainerDied","Data":"8506efce89185a242442db8268e7f2b922950697e0d776972ea62fbafdbafde7"}
Apr 22 17:59:38.917108 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:38.917083 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7"
Apr 22 17:59:38.988553 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:38.988512 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d13fe8f3-a4ae-4374-9c7a-282fd598c025-bundle\") pod \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\" (UID: \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\") "
Apr 22 17:59:38.988735 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:38.988565 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbzss\" (UniqueName: \"kubernetes.io/projected/d13fe8f3-a4ae-4374-9c7a-282fd598c025-kube-api-access-fbzss\") pod \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\" (UID: \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\") "
Apr 22 17:59:38.988735 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:38.988647 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d13fe8f3-a4ae-4374-9c7a-282fd598c025-util\") pod \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\" (UID: \"d13fe8f3-a4ae-4374-9c7a-282fd598c025\") "
Apr 22 17:59:38.989373 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:38.989343 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d13fe8f3-a4ae-4374-9c7a-282fd598c025-bundle" (OuterVolumeSpecName: "bundle") pod "d13fe8f3-a4ae-4374-9c7a-282fd598c025" (UID: "d13fe8f3-a4ae-4374-9c7a-282fd598c025"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:59:38.990724 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:38.990698 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13fe8f3-a4ae-4374-9c7a-282fd598c025-kube-api-access-fbzss" (OuterVolumeSpecName: "kube-api-access-fbzss") pod "d13fe8f3-a4ae-4374-9c7a-282fd598c025" (UID: "d13fe8f3-a4ae-4374-9c7a-282fd598c025"). InnerVolumeSpecName "kube-api-access-fbzss". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:59:38.994906 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:38.994876 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d13fe8f3-a4ae-4374-9c7a-282fd598c025-util" (OuterVolumeSpecName: "util") pod "d13fe8f3-a4ae-4374-9c7a-282fd598c025" (UID: "d13fe8f3-a4ae-4374-9c7a-282fd598c025"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:59:39.089774 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:39.089646 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d13fe8f3-a4ae-4374-9c7a-282fd598c025-util\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:59:39.089774 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:39.089714 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d13fe8f3-a4ae-4374-9c7a-282fd598c025-bundle\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:59:39.089774 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:39.089730 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fbzss\" (UniqueName: \"kubernetes.io/projected/d13fe8f3-a4ae-4374-9c7a-282fd598c025-kube-api-access-fbzss\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 17:59:39.800942 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:39.800899 2564 kubelet.go:2569] "SyncLoop
(PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7" event={"ID":"d13fe8f3-a4ae-4374-9c7a-282fd598c025","Type":"ContainerDied","Data":"3b7cef4a02643c402898dbb7019c99bf8832fe537c375d469692e30beb606984"} Apr 22 17:59:39.800942 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:39.800943 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b7cef4a02643c402898dbb7019c99bf8832fe537c375d469692e30beb606984" Apr 22 17:59:39.801152 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:39.800952 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcckw7" Apr 22 17:59:43.142995 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.142963 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-t6sdx"] Apr 22 17:59:43.143354 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.143224 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d13fe8f3-a4ae-4374-9c7a-282fd598c025" containerName="extract" Apr 22 17:59:43.143354 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.143233 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13fe8f3-a4ae-4374-9c7a-282fd598c025" containerName="extract" Apr 22 17:59:43.143354 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.143244 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d13fe8f3-a4ae-4374-9c7a-282fd598c025" containerName="pull" Apr 22 17:59:43.143354 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.143249 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13fe8f3-a4ae-4374-9c7a-282fd598c025" containerName="pull" Apr 22 17:59:43.143354 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.143266 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="d13fe8f3-a4ae-4374-9c7a-282fd598c025" containerName="util" Apr 22 17:59:43.143354 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.143271 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13fe8f3-a4ae-4374-9c7a-282fd598c025" containerName="util" Apr 22 17:59:43.143354 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.143316 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="d13fe8f3-a4ae-4374-9c7a-282fd598c025" containerName="extract" Apr 22 17:59:43.146132 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.146114 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-t6sdx" Apr 22 17:59:43.149035 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.148987 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 22 17:59:43.149137 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.148990 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:59:43.149137 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.149102 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-v42hs\"" Apr 22 17:59:43.158414 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.158393 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-t6sdx"] Apr 22 17:59:43.215864 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.215833 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5eae9a5-544f-402c-a72b-30cef1020d88-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-t6sdx\" (UID: 
\"e5eae9a5-544f-402c-a72b-30cef1020d88\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-t6sdx" Apr 22 17:59:43.216017 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.215907 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j74hz\" (UniqueName: \"kubernetes.io/projected/e5eae9a5-544f-402c-a72b-30cef1020d88-kube-api-access-j74hz\") pod \"cert-manager-operator-controller-manager-54b9655956-t6sdx\" (UID: \"e5eae9a5-544f-402c-a72b-30cef1020d88\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-t6sdx" Apr 22 17:59:43.316772 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.316740 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5eae9a5-544f-402c-a72b-30cef1020d88-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-t6sdx\" (UID: \"e5eae9a5-544f-402c-a72b-30cef1020d88\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-t6sdx" Apr 22 17:59:43.316904 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.316782 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j74hz\" (UniqueName: \"kubernetes.io/projected/e5eae9a5-544f-402c-a72b-30cef1020d88-kube-api-access-j74hz\") pod \"cert-manager-operator-controller-manager-54b9655956-t6sdx\" (UID: \"e5eae9a5-544f-402c-a72b-30cef1020d88\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-t6sdx" Apr 22 17:59:43.317108 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.317090 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5eae9a5-544f-402c-a72b-30cef1020d88-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-t6sdx\" (UID: \"e5eae9a5-544f-402c-a72b-30cef1020d88\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-t6sdx" Apr 22 17:59:43.332237 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.332211 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j74hz\" (UniqueName: \"kubernetes.io/projected/e5eae9a5-544f-402c-a72b-30cef1020d88-kube-api-access-j74hz\") pod \"cert-manager-operator-controller-manager-54b9655956-t6sdx\" (UID: \"e5eae9a5-544f-402c-a72b-30cef1020d88\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-t6sdx" Apr 22 17:59:43.454550 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.454529 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-t6sdx" Apr 22 17:59:43.582060 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.582037 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-t6sdx"] Apr 22 17:59:43.584743 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:59:43.584706 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5eae9a5_544f_402c_a72b_30cef1020d88.slice/crio-2e7609281ca6184819c31ede895a1776245ebb82c5a07c0ee5d3cdca45a38148 WatchSource:0}: Error finding container 2e7609281ca6184819c31ede895a1776245ebb82c5a07c0ee5d3cdca45a38148: Status 404 returned error can't find the container with id 2e7609281ca6184819c31ede895a1776245ebb82c5a07c0ee5d3cdca45a38148 Apr 22 17:59:43.814002 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:43.813932 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-t6sdx" event={"ID":"e5eae9a5-544f-402c-a72b-30cef1020d88","Type":"ContainerStarted","Data":"2e7609281ca6184819c31ede895a1776245ebb82c5a07c0ee5d3cdca45a38148"} Apr 22 17:59:45.821300 
ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:45.821265 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-t6sdx" event={"ID":"e5eae9a5-544f-402c-a72b-30cef1020d88","Type":"ContainerStarted","Data":"882d4c5f49ed9340f770a9bfa3743b862f3c7dee24ddfc805fd84bbab64005b7"} Apr 22 17:59:45.869199 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:45.869153 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-t6sdx" podStartSLOduration=1.5236070000000002 podStartE2EDuration="2.869139856s" podCreationTimestamp="2026-04-22 17:59:43 +0000 UTC" firstStartedPulling="2026-04-22 17:59:43.587165564 +0000 UTC m=+403.646068684" lastFinishedPulling="2026-04-22 17:59:44.932698409 +0000 UTC m=+404.991601540" observedRunningTime="2026-04-22 17:59:45.867957104 +0000 UTC m=+405.926860247" watchObservedRunningTime="2026-04-22 17:59:45.869139856 +0000 UTC m=+405.928042997" Apr 22 17:59:52.590598 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.590570 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24"] Apr 22 17:59:52.593947 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.593930 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" Apr 22 17:59:52.596325 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.596300 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 17:59:52.596462 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.596417 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 17:59:52.597228 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.597212 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-fkw8k\"" Apr 22 17:59:52.604394 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.604368 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24"] Apr 22 17:59:52.686295 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.686265 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v77w7\" (UniqueName: \"kubernetes.io/projected/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-kube-api-access-v77w7\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24\" (UID: \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" Apr 22 17:59:52.686434 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.686318 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24\" (UID: \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" Apr 22 17:59:52.686434 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.686366 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24\" (UID: \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" Apr 22 17:59:52.787516 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.787477 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v77w7\" (UniqueName: \"kubernetes.io/projected/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-kube-api-access-v77w7\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24\" (UID: \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" Apr 22 17:59:52.787687 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.787537 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24\" (UID: \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" Apr 22 17:59:52.787687 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.787572 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24\" (UID: \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" Apr 22 17:59:52.788020 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.787998 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24\" (UID: \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" Apr 22 17:59:52.788052 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.788000 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24\" (UID: \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" Apr 22 17:59:52.796035 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.796018 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v77w7\" (UniqueName: \"kubernetes.io/projected/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-kube-api-access-v77w7\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24\" (UID: \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" Apr 22 17:59:52.903720 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:52.903616 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" Apr 22 17:59:53.024068 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:53.024041 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24"] Apr 22 17:59:53.025942 ip-10-0-128-219 kubenswrapper[2564]: W0422 17:59:53.025915 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24e2f6ec_8068_40a6_8dd9_381ffbb78ac2.slice/crio-d788e8f4c82ee32091d6bfc922baae5bc96d48fbcc02bff31bde0052caaeaae0 WatchSource:0}: Error finding container d788e8f4c82ee32091d6bfc922baae5bc96d48fbcc02bff31bde0052caaeaae0: Status 404 returned error can't find the container with id d788e8f4c82ee32091d6bfc922baae5bc96d48fbcc02bff31bde0052caaeaae0 Apr 22 17:59:53.850577 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:53.850546 2564 generic.go:358] "Generic (PLEG): container finished" podID="24e2f6ec-8068-40a6-8dd9-381ffbb78ac2" containerID="941178e06cefe69e1a34f551685170a37569bddf0bdf0fd136fe9539c9479242" exitCode=0 Apr 22 17:59:53.850924 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:53.850587 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" event={"ID":"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2","Type":"ContainerDied","Data":"941178e06cefe69e1a34f551685170a37569bddf0bdf0fd136fe9539c9479242"} Apr 22 17:59:53.850924 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:53.850614 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" event={"ID":"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2","Type":"ContainerStarted","Data":"d788e8f4c82ee32091d6bfc922baae5bc96d48fbcc02bff31bde0052caaeaae0"} Apr 22 17:59:56.862344 ip-10-0-128-219 kubenswrapper[2564]: 
I0422 17:59:56.862311 2564 generic.go:358] "Generic (PLEG): container finished" podID="24e2f6ec-8068-40a6-8dd9-381ffbb78ac2" containerID="287ae6fbdd21ba0814ea9068c08b55cbd3492ce9880e3c8fc9f358045557835c" exitCode=0 Apr 22 17:59:56.862761 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:56.862368 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" event={"ID":"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2","Type":"ContainerDied","Data":"287ae6fbdd21ba0814ea9068c08b55cbd3492ce9880e3c8fc9f358045557835c"} Apr 22 17:59:57.867946 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:57.867914 2564 generic.go:358] "Generic (PLEG): container finished" podID="24e2f6ec-8068-40a6-8dd9-381ffbb78ac2" containerID="4a9fd9246a8e0b78c9deaea7c4948ad83b8f4896731c76182149c3d5267619fd" exitCode=0 Apr 22 17:59:57.868318 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:57.867994 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" event={"ID":"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2","Type":"ContainerDied","Data":"4a9fd9246a8e0b78c9deaea7c4948ad83b8f4896731c76182149c3d5267619fd"} Apr 22 17:59:58.990149 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:58.990125 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" Apr 22 17:59:59.139931 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:59.139857 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v77w7\" (UniqueName: \"kubernetes.io/projected/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-kube-api-access-v77w7\") pod \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\" (UID: \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\") " Apr 22 17:59:59.139931 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:59.139909 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-bundle\") pod \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\" (UID: \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\") " Apr 22 17:59:59.140132 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:59.139979 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-util\") pod \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\" (UID: \"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2\") " Apr 22 17:59:59.140323 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:59.140290 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-bundle" (OuterVolumeSpecName: "bundle") pod "24e2f6ec-8068-40a6-8dd9-381ffbb78ac2" (UID: "24e2f6ec-8068-40a6-8dd9-381ffbb78ac2"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:59:59.141868 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:59.141845 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-kube-api-access-v77w7" (OuterVolumeSpecName: "kube-api-access-v77w7") pod "24e2f6ec-8068-40a6-8dd9-381ffbb78ac2" (UID: "24e2f6ec-8068-40a6-8dd9-381ffbb78ac2"). InnerVolumeSpecName "kube-api-access-v77w7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:59:59.144238 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:59.144217 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-util" (OuterVolumeSpecName: "util") pod "24e2f6ec-8068-40a6-8dd9-381ffbb78ac2" (UID: "24e2f6ec-8068-40a6-8dd9-381ffbb78ac2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:59:59.240803 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:59.240775 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-util\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:59:59.240803 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:59.240801 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v77w7\" (UniqueName: \"kubernetes.io/projected/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-kube-api-access-v77w7\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:59:59.240962 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:59.240811 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24e2f6ec-8068-40a6-8dd9-381ffbb78ac2-bundle\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 17:59:59.876109 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:59.876070 2564 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" event={"ID":"24e2f6ec-8068-40a6-8dd9-381ffbb78ac2","Type":"ContainerDied","Data":"d788e8f4c82ee32091d6bfc922baae5bc96d48fbcc02bff31bde0052caaeaae0"} Apr 22 17:59:59.876109 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:59.876107 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d788e8f4c82ee32091d6bfc922baae5bc96d48fbcc02bff31bde0052caaeaae0" Apr 22 17:59:59.876306 ip-10-0-128-219 kubenswrapper[2564]: I0422 17:59:59.876117 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffsk24" Apr 22 18:00:21.258233 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.258157 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55b9d5cdc8-6nn7g"] Apr 22 18:00:21.258688 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.258481 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24e2f6ec-8068-40a6-8dd9-381ffbb78ac2" containerName="util" Apr 22 18:00:21.258688 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.258494 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e2f6ec-8068-40a6-8dd9-381ffbb78ac2" containerName="util" Apr 22 18:00:21.258688 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.258507 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24e2f6ec-8068-40a6-8dd9-381ffbb78ac2" containerName="pull" Apr 22 18:00:21.258688 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.258513 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e2f6ec-8068-40a6-8dd9-381ffbb78ac2" containerName="pull" Apr 22 18:00:21.258688 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.258526 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="24e2f6ec-8068-40a6-8dd9-381ffbb78ac2" containerName="extract" Apr 22 18:00:21.258688 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.258532 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e2f6ec-8068-40a6-8dd9-381ffbb78ac2" containerName="extract" Apr 22 18:00:21.258688 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.258575 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="24e2f6ec-8068-40a6-8dd9-381ffbb78ac2" containerName="extract" Apr 22 18:00:21.261344 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.261328 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.269500 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.269473 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55b9d5cdc8-6nn7g"] Apr 22 18:00:21.401845 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.401809 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-service-ca\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.402012 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.401854 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-oauth-serving-cert\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.402012 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.401919 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-console-serving-cert\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.402012 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.401945 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-console-config\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.402012 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.401964 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68fw9\" (UniqueName: \"kubernetes.io/projected/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-kube-api-access-68fw9\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.402012 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.402007 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-console-oauth-config\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.402191 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.402038 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-trusted-ca-bundle\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.502991 ip-10-0-128-219 
kubenswrapper[2564]: I0422 18:00:21.502951 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-console-oauth-config\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.502991 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.502995 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-trusted-ca-bundle\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.503251 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.503024 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-service-ca\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.503251 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.503048 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-oauth-serving-cert\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.503251 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.503077 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-console-serving-cert\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " 
pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.503251 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.503094 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-console-config\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.503251 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.503114 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68fw9\" (UniqueName: \"kubernetes.io/projected/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-kube-api-access-68fw9\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.504047 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.504019 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-service-ca\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.504164 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.504019 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-oauth-serving-cert\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.504164 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.504023 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-console-config\") pod \"console-55b9d5cdc8-6nn7g\" (UID: 
\"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.504286 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.504164 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-trusted-ca-bundle\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.505589 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.505560 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-console-serving-cert\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.505698 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.505591 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-console-oauth-config\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.511738 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.511680 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68fw9\" (UniqueName: \"kubernetes.io/projected/157343db-7d66-43bf-b8d8-3d8e74b5c1fc-kube-api-access-68fw9\") pod \"console-55b9d5cdc8-6nn7g\" (UID: \"157343db-7d66-43bf-b8d8-3d8e74b5c1fc\") " pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.571411 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.571363 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:21.694794 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.694754 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55b9d5cdc8-6nn7g"] Apr 22 18:00:21.699696 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:00:21.699650 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod157343db_7d66_43bf_b8d8_3d8e74b5c1fc.slice/crio-b6c0a5816ed9a5c1f5a8b24c3f7bf881b9ce7aa3216a6a98e1614f14e2510421 WatchSource:0}: Error finding container b6c0a5816ed9a5c1f5a8b24c3f7bf881b9ce7aa3216a6a98e1614f14e2510421: Status 404 returned error can't find the container with id b6c0a5816ed9a5c1f5a8b24c3f7bf881b9ce7aa3216a6a98e1614f14e2510421 Apr 22 18:00:21.945407 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.945377 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b9d5cdc8-6nn7g" event={"ID":"157343db-7d66-43bf-b8d8-3d8e74b5c1fc","Type":"ContainerStarted","Data":"65eb3564c4f836172a36cb517654cf4b028bf12e3f9d4c7765f6bd8294c4cf9a"} Apr 22 18:00:21.945573 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.945413 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b9d5cdc8-6nn7g" event={"ID":"157343db-7d66-43bf-b8d8-3d8e74b5c1fc","Type":"ContainerStarted","Data":"b6c0a5816ed9a5c1f5a8b24c3f7bf881b9ce7aa3216a6a98e1614f14e2510421"} Apr 22 18:00:21.962877 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:21.962819 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55b9d5cdc8-6nn7g" podStartSLOduration=0.962800841 podStartE2EDuration="962.800841ms" podCreationTimestamp="2026-04-22 18:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:00:21.961499761 +0000 UTC 
m=+442.020402918" watchObservedRunningTime="2026-04-22 18:00:21.962800841 +0000 UTC m=+442.021703984" Apr 22 18:00:31.572216 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:31.572175 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:31.572216 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:31.572218 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:31.577016 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:31.576992 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:31.983436 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:31.983408 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55b9d5cdc8-6nn7g" Apr 22 18:00:32.037685 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:32.037628 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-798946cd94-gsdmx"] Apr 22 18:00:57.061190 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.061150 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-798946cd94-gsdmx" podUID="68e96de2-93ad-4e06-b521-b8fa82613ad1" containerName="console" containerID="cri-o://456f3a62f8a95bfbfed844926edf0a40511fb2b4a41c1d53702a9f3eb7d3dc67" gracePeriod=15 Apr 22 18:00:57.288443 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.288423 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-798946cd94-gsdmx_68e96de2-93ad-4e06-b521-b8fa82613ad1/console/0.log" Apr 22 18:00:57.288546 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.288479 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-798946cd94-gsdmx" Apr 22 18:00:57.390133 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.390060 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-trusted-ca-bundle\") pod \"68e96de2-93ad-4e06-b521-b8fa82613ad1\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " Apr 22 18:00:57.390133 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.390091 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-service-ca\") pod \"68e96de2-93ad-4e06-b521-b8fa82613ad1\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " Apr 22 18:00:57.390133 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.390115 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-config\") pod \"68e96de2-93ad-4e06-b521-b8fa82613ad1\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " Apr 22 18:00:57.390133 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.390134 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-oauth-serving-cert\") pod \"68e96de2-93ad-4e06-b521-b8fa82613ad1\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " Apr 22 18:00:57.390447 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.390160 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-serving-cert\") pod \"68e96de2-93ad-4e06-b521-b8fa82613ad1\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " Apr 22 18:00:57.390447 ip-10-0-128-219 
kubenswrapper[2564]: I0422 18:00:57.390210 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvh2j\" (UniqueName: \"kubernetes.io/projected/68e96de2-93ad-4e06-b521-b8fa82613ad1-kube-api-access-lvh2j\") pod \"68e96de2-93ad-4e06-b521-b8fa82613ad1\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " Apr 22 18:00:57.390447 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.390237 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-oauth-config\") pod \"68e96de2-93ad-4e06-b521-b8fa82613ad1\" (UID: \"68e96de2-93ad-4e06-b521-b8fa82613ad1\") " Apr 22 18:00:57.390609 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.390520 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-service-ca" (OuterVolumeSpecName: "service-ca") pod "68e96de2-93ad-4e06-b521-b8fa82613ad1" (UID: "68e96de2-93ad-4e06-b521-b8fa82613ad1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:00:57.390609 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.390533 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "68e96de2-93ad-4e06-b521-b8fa82613ad1" (UID: "68e96de2-93ad-4e06-b521-b8fa82613ad1"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:00:57.390609 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.390541 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-config" (OuterVolumeSpecName: "console-config") pod "68e96de2-93ad-4e06-b521-b8fa82613ad1" (UID: "68e96de2-93ad-4e06-b521-b8fa82613ad1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:00:57.390609 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.390580 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "68e96de2-93ad-4e06-b521-b8fa82613ad1" (UID: "68e96de2-93ad-4e06-b521-b8fa82613ad1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:00:57.392355 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.392324 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68e96de2-93ad-4e06-b521-b8fa82613ad1-kube-api-access-lvh2j" (OuterVolumeSpecName: "kube-api-access-lvh2j") pod "68e96de2-93ad-4e06-b521-b8fa82613ad1" (UID: "68e96de2-93ad-4e06-b521-b8fa82613ad1"). InnerVolumeSpecName "kube-api-access-lvh2j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:00:57.392355 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.392340 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "68e96de2-93ad-4e06-b521-b8fa82613ad1" (UID: "68e96de2-93ad-4e06-b521-b8fa82613ad1"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:00:57.392536 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.392373 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "68e96de2-93ad-4e06-b521-b8fa82613ad1" (UID: "68e96de2-93ad-4e06-b521-b8fa82613ad1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:00:57.491707 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.491664 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lvh2j\" (UniqueName: \"kubernetes.io/projected/68e96de2-93ad-4e06-b521-b8fa82613ad1-kube-api-access-lvh2j\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:00:57.491707 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.491705 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-oauth-config\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:00:57.491849 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.491715 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-trusted-ca-bundle\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:00:57.491849 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.491725 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-service-ca\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:00:57.491849 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.491735 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-config\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:00:57.491849 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.491744 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68e96de2-93ad-4e06-b521-b8fa82613ad1-oauth-serving-cert\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:00:57.491849 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:57.491753 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68e96de2-93ad-4e06-b521-b8fa82613ad1-console-serving-cert\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:00:58.060408 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:58.060383 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-798946cd94-gsdmx_68e96de2-93ad-4e06-b521-b8fa82613ad1/console/0.log" Apr 22 18:00:58.060548 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:58.060419 2564 generic.go:358] "Generic (PLEG): container finished" podID="68e96de2-93ad-4e06-b521-b8fa82613ad1" containerID="456f3a62f8a95bfbfed844926edf0a40511fb2b4a41c1d53702a9f3eb7d3dc67" exitCode=2 Apr 22 18:00:58.060548 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:58.060451 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-798946cd94-gsdmx" event={"ID":"68e96de2-93ad-4e06-b521-b8fa82613ad1","Type":"ContainerDied","Data":"456f3a62f8a95bfbfed844926edf0a40511fb2b4a41c1d53702a9f3eb7d3dc67"} Apr 22 18:00:58.060548 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:58.060490 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-798946cd94-gsdmx" Apr 22 18:00:58.060548 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:58.060504 2564 scope.go:117] "RemoveContainer" containerID="456f3a62f8a95bfbfed844926edf0a40511fb2b4a41c1d53702a9f3eb7d3dc67" Apr 22 18:00:58.060752 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:58.060491 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-798946cd94-gsdmx" event={"ID":"68e96de2-93ad-4e06-b521-b8fa82613ad1","Type":"ContainerDied","Data":"19141aabeab76a39ea2b5df96c1595ec4ba0d99550cb6a26916468fc2c97b785"} Apr 22 18:00:58.070842 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:58.070633 2564 scope.go:117] "RemoveContainer" containerID="456f3a62f8a95bfbfed844926edf0a40511fb2b4a41c1d53702a9f3eb7d3dc67" Apr 22 18:00:58.071073 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:00:58.070927 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456f3a62f8a95bfbfed844926edf0a40511fb2b4a41c1d53702a9f3eb7d3dc67\": container with ID starting with 456f3a62f8a95bfbfed844926edf0a40511fb2b4a41c1d53702a9f3eb7d3dc67 not found: ID does not exist" containerID="456f3a62f8a95bfbfed844926edf0a40511fb2b4a41c1d53702a9f3eb7d3dc67" Apr 22 18:00:58.071073 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:58.070953 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456f3a62f8a95bfbfed844926edf0a40511fb2b4a41c1d53702a9f3eb7d3dc67"} err="failed to get container status \"456f3a62f8a95bfbfed844926edf0a40511fb2b4a41c1d53702a9f3eb7d3dc67\": rpc error: code = NotFound desc = could not find container \"456f3a62f8a95bfbfed844926edf0a40511fb2b4a41c1d53702a9f3eb7d3dc67\": container with ID starting with 456f3a62f8a95bfbfed844926edf0a40511fb2b4a41c1d53702a9f3eb7d3dc67 not found: ID does not exist" Apr 22 18:00:58.081980 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:58.081957 2564 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-798946cd94-gsdmx"] Apr 22 18:00:58.085597 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:58.085576 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-798946cd94-gsdmx"] Apr 22 18:00:58.553462 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:00:58.553427 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68e96de2-93ad-4e06-b521-b8fa82613ad1" path="/var/lib/kubelet/pods/68e96de2-93ad-4e06-b521-b8fa82613ad1/volumes" Apr 22 18:01:31.960182 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:31.960147 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4"] Apr 22 18:01:31.960628 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:31.960445 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68e96de2-93ad-4e06-b521-b8fa82613ad1" containerName="console" Apr 22 18:01:31.960628 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:31.960455 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e96de2-93ad-4e06-b521-b8fa82613ad1" containerName="console" Apr 22 18:01:31.960628 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:31.960514 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="68e96de2-93ad-4e06-b521-b8fa82613ad1" containerName="console" Apr 22 18:01:31.963571 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:31.963555 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" Apr 22 18:01:31.966041 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:31.966014 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:01:31.966172 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:31.966056 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-fkw8k\"" Apr 22 18:01:31.967089 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:31.967072 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:01:31.973800 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:31.973782 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4"] Apr 22 18:01:32.036051 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:32.036023 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t52l\" (UniqueName: \"kubernetes.io/projected/feccf819-1f73-4905-98a9-0ee9212a368c-kube-api-access-6t52l\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4\" (UID: \"feccf819-1f73-4905-98a9-0ee9212a368c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" Apr 22 18:01:32.036150 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:32.036097 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/feccf819-1f73-4905-98a9-0ee9212a368c-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4\" (UID: \"feccf819-1f73-4905-98a9-0ee9212a368c\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" Apr 22 18:01:32.036150 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:32.036120 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/feccf819-1f73-4905-98a9-0ee9212a368c-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4\" (UID: \"feccf819-1f73-4905-98a9-0ee9212a368c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" Apr 22 18:01:32.136401 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:32.136371 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/feccf819-1f73-4905-98a9-0ee9212a368c-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4\" (UID: \"feccf819-1f73-4905-98a9-0ee9212a368c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" Apr 22 18:01:32.136529 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:32.136404 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/feccf819-1f73-4905-98a9-0ee9212a368c-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4\" (UID: \"feccf819-1f73-4905-98a9-0ee9212a368c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" Apr 22 18:01:32.136529 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:32.136445 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6t52l\" (UniqueName: \"kubernetes.io/projected/feccf819-1f73-4905-98a9-0ee9212a368c-kube-api-access-6t52l\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4\" (UID: \"feccf819-1f73-4905-98a9-0ee9212a368c\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" Apr 22 18:01:32.136766 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:32.136746 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/feccf819-1f73-4905-98a9-0ee9212a368c-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4\" (UID: \"feccf819-1f73-4905-98a9-0ee9212a368c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" Apr 22 18:01:32.136838 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:32.136786 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/feccf819-1f73-4905-98a9-0ee9212a368c-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4\" (UID: \"feccf819-1f73-4905-98a9-0ee9212a368c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" Apr 22 18:01:32.144707 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:32.144689 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t52l\" (UniqueName: \"kubernetes.io/projected/feccf819-1f73-4905-98a9-0ee9212a368c-kube-api-access-6t52l\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4\" (UID: \"feccf819-1f73-4905-98a9-0ee9212a368c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" Apr 22 18:01:32.273260 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:32.273201 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" Apr 22 18:01:32.389539 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:32.389504 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4"] Apr 22 18:01:32.392367 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:01:32.392341 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeccf819_1f73_4905_98a9_0ee9212a368c.slice/crio-a1e3ca28127ea52e56c1b968c11e8522bce60b43177260ee8112dbbf94263d16 WatchSource:0}: Error finding container a1e3ca28127ea52e56c1b968c11e8522bce60b43177260ee8112dbbf94263d16: Status 404 returned error can't find the container with id a1e3ca28127ea52e56c1b968c11e8522bce60b43177260ee8112dbbf94263d16 Apr 22 18:01:33.170000 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:33.169961 2564 generic.go:358] "Generic (PLEG): container finished" podID="feccf819-1f73-4905-98a9-0ee9212a368c" containerID="0794307e837c31fb6ca71d757ffa95d893fc2c91d8b557fb7386bc905cf4a651" exitCode=0 Apr 22 18:01:33.170315 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:33.170024 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" event={"ID":"feccf819-1f73-4905-98a9-0ee9212a368c","Type":"ContainerDied","Data":"0794307e837c31fb6ca71d757ffa95d893fc2c91d8b557fb7386bc905cf4a651"} Apr 22 18:01:33.170315 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:33.170061 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" event={"ID":"feccf819-1f73-4905-98a9-0ee9212a368c","Type":"ContainerStarted","Data":"a1e3ca28127ea52e56c1b968c11e8522bce60b43177260ee8112dbbf94263d16"} Apr 22 18:01:34.173769 ip-10-0-128-219 kubenswrapper[2564]: 
I0422 18:01:34.173738 2564 generic.go:358] "Generic (PLEG): container finished" podID="feccf819-1f73-4905-98a9-0ee9212a368c" containerID="d14fbecdefbbd33ec57c2cb1a1b3c10f8983c794df70b3a2b54559696b115d48" exitCode=0 Apr 22 18:01:34.174109 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:34.173831 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" event={"ID":"feccf819-1f73-4905-98a9-0ee9212a368c","Type":"ContainerDied","Data":"d14fbecdefbbd33ec57c2cb1a1b3c10f8983c794df70b3a2b54559696b115d48"} Apr 22 18:01:35.178329 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:35.178296 2564 generic.go:358] "Generic (PLEG): container finished" podID="feccf819-1f73-4905-98a9-0ee9212a368c" containerID="1d7a0e11cde92521acb18a7c3a27f380c6a374b6840f67f1900f4115e649020a" exitCode=0 Apr 22 18:01:35.178783 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:35.178379 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" event={"ID":"feccf819-1f73-4905-98a9-0ee9212a368c","Type":"ContainerDied","Data":"1d7a0e11cde92521acb18a7c3a27f380c6a374b6840f67f1900f4115e649020a"} Apr 22 18:01:36.298054 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:36.298033 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" Apr 22 18:01:36.369470 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:36.369447 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/feccf819-1f73-4905-98a9-0ee9212a368c-bundle\") pod \"feccf819-1f73-4905-98a9-0ee9212a368c\" (UID: \"feccf819-1f73-4905-98a9-0ee9212a368c\") " Apr 22 18:01:36.369599 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:36.369492 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t52l\" (UniqueName: \"kubernetes.io/projected/feccf819-1f73-4905-98a9-0ee9212a368c-kube-api-access-6t52l\") pod \"feccf819-1f73-4905-98a9-0ee9212a368c\" (UID: \"feccf819-1f73-4905-98a9-0ee9212a368c\") " Apr 22 18:01:36.369599 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:36.369579 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/feccf819-1f73-4905-98a9-0ee9212a368c-util\") pod \"feccf819-1f73-4905-98a9-0ee9212a368c\" (UID: \"feccf819-1f73-4905-98a9-0ee9212a368c\") " Apr 22 18:01:36.370405 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:36.370381 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feccf819-1f73-4905-98a9-0ee9212a368c-bundle" (OuterVolumeSpecName: "bundle") pod "feccf819-1f73-4905-98a9-0ee9212a368c" (UID: "feccf819-1f73-4905-98a9-0ee9212a368c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:36.371438 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:36.371412 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feccf819-1f73-4905-98a9-0ee9212a368c-kube-api-access-6t52l" (OuterVolumeSpecName: "kube-api-access-6t52l") pod "feccf819-1f73-4905-98a9-0ee9212a368c" (UID: "feccf819-1f73-4905-98a9-0ee9212a368c"). InnerVolumeSpecName "kube-api-access-6t52l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:01:36.374945 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:36.374923 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feccf819-1f73-4905-98a9-0ee9212a368c-util" (OuterVolumeSpecName: "util") pod "feccf819-1f73-4905-98a9-0ee9212a368c" (UID: "feccf819-1f73-4905-98a9-0ee9212a368c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:36.470423 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:36.470368 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/feccf819-1f73-4905-98a9-0ee9212a368c-util\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:01:36.470423 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:36.470391 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/feccf819-1f73-4905-98a9-0ee9212a368c-bundle\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:01:36.470423 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:36.470400 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6t52l\" (UniqueName: \"kubernetes.io/projected/feccf819-1f73-4905-98a9-0ee9212a368c-kube-api-access-6t52l\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:01:37.186354 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:37.186323 2564 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" event={"ID":"feccf819-1f73-4905-98a9-0ee9212a368c","Type":"ContainerDied","Data":"a1e3ca28127ea52e56c1b968c11e8522bce60b43177260ee8112dbbf94263d16"} Apr 22 18:01:37.186354 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:37.186356 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1e3ca28127ea52e56c1b968c11e8522bce60b43177260ee8112dbbf94263d16" Apr 22 18:01:37.186562 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:37.186331 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lhzk4" Apr 22 18:01:50.564909 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.564831 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-z29q8"] Apr 22 18:01:50.565314 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.565096 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="feccf819-1f73-4905-98a9-0ee9212a368c" containerName="extract" Apr 22 18:01:50.565314 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.565107 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="feccf819-1f73-4905-98a9-0ee9212a368c" containerName="extract" Apr 22 18:01:50.565314 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.565115 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="feccf819-1f73-4905-98a9-0ee9212a368c" containerName="pull" Apr 22 18:01:50.565314 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.565121 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="feccf819-1f73-4905-98a9-0ee9212a368c" containerName="pull" Apr 22 18:01:50.565314 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.565133 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="feccf819-1f73-4905-98a9-0ee9212a368c" containerName="util" Apr 22 18:01:50.565314 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.565138 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="feccf819-1f73-4905-98a9-0ee9212a368c" containerName="util" Apr 22 18:01:50.565314 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.565184 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="feccf819-1f73-4905-98a9-0ee9212a368c" containerName="extract" Apr 22 18:01:50.567886 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.567871 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-z29q8" Apr 22 18:01:50.570481 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.570456 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 22 18:01:50.570975 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.570957 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-fsl49\"" Apr 22 18:01:50.571071 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.571017 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 22 18:01:50.584864 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.584839 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-z29q8"] Apr 22 18:01:50.672147 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.672122 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qft7m\" (UniqueName: \"kubernetes.io/projected/37dc0afd-96cc-42a9-a28f-5407b558efeb-kube-api-access-qft7m\") pod \"servicemesh-operator3-55f49c5f94-z29q8\" (UID: \"37dc0afd-96cc-42a9-a28f-5407b558efeb\") " 
pod="openshift-operators/servicemesh-operator3-55f49c5f94-z29q8" Apr 22 18:01:50.672274 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.672156 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/37dc0afd-96cc-42a9-a28f-5407b558efeb-operator-config\") pod \"servicemesh-operator3-55f49c5f94-z29q8\" (UID: \"37dc0afd-96cc-42a9-a28f-5407b558efeb\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-z29q8" Apr 22 18:01:50.773407 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.773382 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qft7m\" (UniqueName: \"kubernetes.io/projected/37dc0afd-96cc-42a9-a28f-5407b558efeb-kube-api-access-qft7m\") pod \"servicemesh-operator3-55f49c5f94-z29q8\" (UID: \"37dc0afd-96cc-42a9-a28f-5407b558efeb\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-z29q8" Apr 22 18:01:50.773502 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.773421 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/37dc0afd-96cc-42a9-a28f-5407b558efeb-operator-config\") pod \"servicemesh-operator3-55f49c5f94-z29q8\" (UID: \"37dc0afd-96cc-42a9-a28f-5407b558efeb\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-z29q8" Apr 22 18:01:50.776030 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.776008 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/37dc0afd-96cc-42a9-a28f-5407b558efeb-operator-config\") pod \"servicemesh-operator3-55f49c5f94-z29q8\" (UID: \"37dc0afd-96cc-42a9-a28f-5407b558efeb\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-z29q8" Apr 22 18:01:50.783381 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.783355 2564 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qft7m\" (UniqueName: \"kubernetes.io/projected/37dc0afd-96cc-42a9-a28f-5407b558efeb-kube-api-access-qft7m\") pod \"servicemesh-operator3-55f49c5f94-z29q8\" (UID: \"37dc0afd-96cc-42a9-a28f-5407b558efeb\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-z29q8" Apr 22 18:01:50.876589 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:50.876528 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-z29q8" Apr 22 18:01:51.006960 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.006936 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-z29q8"] Apr 22 18:01:51.009962 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:01:51.009933 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37dc0afd_96cc_42a9_a28f_5407b558efeb.slice/crio-93d73db7cf5936ad3b8865479442ca00ff70e16207b2cbe57cfd75358933bb10 WatchSource:0}: Error finding container 93d73db7cf5936ad3b8865479442ca00ff70e16207b2cbe57cfd75358933bb10: Status 404 returned error can't find the container with id 93d73db7cf5936ad3b8865479442ca00ff70e16207b2cbe57cfd75358933bb10 Apr 22 18:01:51.231489 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.231454 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-z29q8" event={"ID":"37dc0afd-96cc-42a9-a28f-5407b558efeb","Type":"ContainerStarted","Data":"93d73db7cf5936ad3b8865479442ca00ff70e16207b2cbe57cfd75358933bb10"} Apr 22 18:01:51.684144 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.684107 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv"] Apr 22 18:01:51.687599 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.687577 2564 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" Apr 22 18:01:51.690853 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.690827 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:01:51.691286 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.691190 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-fkw8k\"" Apr 22 18:01:51.691993 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.691926 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:01:51.702609 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.702587 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv"] Apr 22 18:01:51.781206 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.781175 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4de37504-ad8c-48fd-b7b1-a2fb61610043-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv\" (UID: \"4de37504-ad8c-48fd-b7b1-a2fb61610043\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" Apr 22 18:01:51.781392 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.781237 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hscdp\" (UniqueName: \"kubernetes.io/projected/4de37504-ad8c-48fd-b7b1-a2fb61610043-kube-api-access-hscdp\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv\" (UID: \"4de37504-ad8c-48fd-b7b1-a2fb61610043\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" Apr 22 18:01:51.781392 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.781334 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4de37504-ad8c-48fd-b7b1-a2fb61610043-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv\" (UID: \"4de37504-ad8c-48fd-b7b1-a2fb61610043\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" Apr 22 18:01:51.882254 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.882218 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4de37504-ad8c-48fd-b7b1-a2fb61610043-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv\" (UID: \"4de37504-ad8c-48fd-b7b1-a2fb61610043\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" Apr 22 18:01:51.882426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.882289 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hscdp\" (UniqueName: \"kubernetes.io/projected/4de37504-ad8c-48fd-b7b1-a2fb61610043-kube-api-access-hscdp\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv\" (UID: \"4de37504-ad8c-48fd-b7b1-a2fb61610043\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" Apr 22 18:01:51.882426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.882337 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4de37504-ad8c-48fd-b7b1-a2fb61610043-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv\" (UID: \"4de37504-ad8c-48fd-b7b1-a2fb61610043\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" Apr 22 18:01:51.882730 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.882661 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4de37504-ad8c-48fd-b7b1-a2fb61610043-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv\" (UID: \"4de37504-ad8c-48fd-b7b1-a2fb61610043\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" Apr 22 18:01:51.882730 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.882713 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4de37504-ad8c-48fd-b7b1-a2fb61610043-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv\" (UID: \"4de37504-ad8c-48fd-b7b1-a2fb61610043\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" Apr 22 18:01:51.891743 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:51.891714 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hscdp\" (UniqueName: \"kubernetes.io/projected/4de37504-ad8c-48fd-b7b1-a2fb61610043-kube-api-access-hscdp\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv\" (UID: \"4de37504-ad8c-48fd-b7b1-a2fb61610043\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" Apr 22 18:01:52.001224 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:52.001144 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" Apr 22 18:01:52.183914 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:52.183860 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv"] Apr 22 18:01:52.186078 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:01:52.186051 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4de37504_ad8c_48fd_b7b1_a2fb61610043.slice/crio-284be0f141671521aaaa0114f673b61f5ea7b68e860c976f52f90a95f2921396 WatchSource:0}: Error finding container 284be0f141671521aaaa0114f673b61f5ea7b68e860c976f52f90a95f2921396: Status 404 returned error can't find the container with id 284be0f141671521aaaa0114f673b61f5ea7b68e860c976f52f90a95f2921396 Apr 22 18:01:52.237325 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:52.237296 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" event={"ID":"4de37504-ad8c-48fd-b7b1-a2fb61610043","Type":"ContainerStarted","Data":"284be0f141671521aaaa0114f673b61f5ea7b68e860c976f52f90a95f2921396"} Apr 22 18:01:53.242376 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:53.242343 2564 generic.go:358] "Generic (PLEG): container finished" podID="4de37504-ad8c-48fd-b7b1-a2fb61610043" containerID="b536abeea21da63dccb70001cd5d660f68a3a58470616cb332905983d03bde47" exitCode=0 Apr 22 18:01:53.242840 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:53.242427 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" event={"ID":"4de37504-ad8c-48fd-b7b1-a2fb61610043","Type":"ContainerDied","Data":"b536abeea21da63dccb70001cd5d660f68a3a58470616cb332905983d03bde47"} Apr 22 18:01:54.248625 ip-10-0-128-219 kubenswrapper[2564]: 
I0422 18:01:54.247868 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-z29q8" event={"ID":"37dc0afd-96cc-42a9-a28f-5407b558efeb","Type":"ContainerStarted","Data":"143ca51a1a8d23b6c6c3a46d071e829e063af074358357a52b6673e87c0c1d13"} Apr 22 18:01:54.248625 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.248577 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-z29q8" Apr 22 18:01:54.268913 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.268850 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-z29q8" podStartSLOduration=1.84804743 podStartE2EDuration="4.268833147s" podCreationTimestamp="2026-04-22 18:01:50 +0000 UTC" firstStartedPulling="2026-04-22 18:01:51.01235766 +0000 UTC m=+531.071260780" lastFinishedPulling="2026-04-22 18:01:53.43314337 +0000 UTC m=+533.492046497" observedRunningTime="2026-04-22 18:01:54.266396719 +0000 UTC m=+534.325299862" watchObservedRunningTime="2026-04-22 18:01:54.268833147 +0000 UTC m=+534.327736293" Apr 22 18:01:54.473494 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.473462 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf"] Apr 22 18:01:54.476608 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.476586 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.479443 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.479283 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 18:01:54.479443 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.479297 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 18:01:54.479443 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.479289 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 18:01:54.479443 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.479336 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 22 18:01:54.479443 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.479350 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-tttxl\"" Apr 22 18:01:54.479775 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.479581 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 22 18:01:54.479775 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.479604 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 22 18:01:54.487287 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.487265 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf"] Apr 22 18:01:54.605789 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.605766 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: 
\"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.605898 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.605808 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.605898 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.605831 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfgb5\" (UniqueName: \"kubernetes.io/projected/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-kube-api-access-nfgb5\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.605973 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.605905 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.605973 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.605940 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: 
\"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.606043 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.605997 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.606043 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.606018 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.706738 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.706706 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.706738 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.706748 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.707000 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.706808 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.707000 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.706830 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.707000 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.706870 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.707165 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.707012 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.707165 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.707052 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfgb5\" (UniqueName: \"kubernetes.io/projected/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-kube-api-access-nfgb5\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.707696 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.707652 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.709139 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.709119 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.709321 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.709298 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.709464 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.709447 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.709522 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.709487 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.714785 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.714760 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.715280 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.715262 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfgb5\" (UniqueName: \"kubernetes.io/projected/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-kube-api-access-nfgb5\") pod \"istiod-openshift-gateway-7cd77c7ffd-wggmf\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:54.786468 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:54.786393 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:55.122923 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:55.122843 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf"] Apr 22 18:01:55.126727 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:01:55.126701 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1022e95_8f5a_4a1c_a90b_03dd2937eb2d.slice/crio-b61be683d5d8581e13e644df5ebae9763ebc0bef9f62f012606580e6b469974b WatchSource:0}: Error finding container b61be683d5d8581e13e644df5ebae9763ebc0bef9f62f012606580e6b469974b: Status 404 returned error can't find the container with id b61be683d5d8581e13e644df5ebae9763ebc0bef9f62f012606580e6b469974b Apr 22 18:01:55.252550 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:55.252512 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" event={"ID":"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d","Type":"ContainerStarted","Data":"b61be683d5d8581e13e644df5ebae9763ebc0bef9f62f012606580e6b469974b"} Apr 22 18:01:55.254178 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:55.254151 2564 generic.go:358] "Generic (PLEG): container finished" podID="4de37504-ad8c-48fd-b7b1-a2fb61610043" containerID="82164886805c27a3b7f470e36cd523aa70e28a005a59aafda4db42e340be865a" exitCode=0 Apr 22 18:01:55.254315 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:55.254237 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" event={"ID":"4de37504-ad8c-48fd-b7b1-a2fb61610043","Type":"ContainerDied","Data":"82164886805c27a3b7f470e36cd523aa70e28a005a59aafda4db42e340be865a"} Apr 22 18:01:56.261163 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:56.261121 2564 generic.go:358] "Generic (PLEG): container 
finished" podID="4de37504-ad8c-48fd-b7b1-a2fb61610043" containerID="798d0abd01ee1a88542db5fef57b043f532afbbca29a981d5614f4d22856fe61" exitCode=0 Apr 22 18:01:56.261610 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:56.261205 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" event={"ID":"4de37504-ad8c-48fd-b7b1-a2fb61610043","Type":"ContainerDied","Data":"798d0abd01ee1a88542db5fef57b043f532afbbca29a981d5614f4d22856fe61"} Apr 22 18:01:57.953330 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:57.953301 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" Apr 22 18:01:58.003684 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.003625 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 22 18:01:58.003813 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.003728 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 22 18:01:58.136331 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.136292 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4de37504-ad8c-48fd-b7b1-a2fb61610043-bundle\") pod \"4de37504-ad8c-48fd-b7b1-a2fb61610043\" (UID: \"4de37504-ad8c-48fd-b7b1-a2fb61610043\") " Apr 22 18:01:58.136470 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.136342 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4de37504-ad8c-48fd-b7b1-a2fb61610043-util\") pod \"4de37504-ad8c-48fd-b7b1-a2fb61610043\" (UID: 
\"4de37504-ad8c-48fd-b7b1-a2fb61610043\") " Apr 22 18:01:58.136470 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.136460 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hscdp\" (UniqueName: \"kubernetes.io/projected/4de37504-ad8c-48fd-b7b1-a2fb61610043-kube-api-access-hscdp\") pod \"4de37504-ad8c-48fd-b7b1-a2fb61610043\" (UID: \"4de37504-ad8c-48fd-b7b1-a2fb61610043\") " Apr 22 18:01:58.137208 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.137183 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4de37504-ad8c-48fd-b7b1-a2fb61610043-bundle" (OuterVolumeSpecName: "bundle") pod "4de37504-ad8c-48fd-b7b1-a2fb61610043" (UID: "4de37504-ad8c-48fd-b7b1-a2fb61610043"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:58.138466 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.138437 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de37504-ad8c-48fd-b7b1-a2fb61610043-kube-api-access-hscdp" (OuterVolumeSpecName: "kube-api-access-hscdp") pod "4de37504-ad8c-48fd-b7b1-a2fb61610043" (UID: "4de37504-ad8c-48fd-b7b1-a2fb61610043"). InnerVolumeSpecName "kube-api-access-hscdp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:01:58.140753 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.140725 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4de37504-ad8c-48fd-b7b1-a2fb61610043-util" (OuterVolumeSpecName: "util") pod "4de37504-ad8c-48fd-b7b1-a2fb61610043" (UID: "4de37504-ad8c-48fd-b7b1-a2fb61610043"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:58.237163 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.237088 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hscdp\" (UniqueName: \"kubernetes.io/projected/4de37504-ad8c-48fd-b7b1-a2fb61610043-kube-api-access-hscdp\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:01:58.237163 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.237115 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4de37504-ad8c-48fd-b7b1-a2fb61610043-bundle\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:01:58.237163 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.237125 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4de37504-ad8c-48fd-b7b1-a2fb61610043-util\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:01:58.270431 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.270396 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" event={"ID":"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d","Type":"ContainerStarted","Data":"c6297b9f51e7401391eac9480ab32c41e644224a3b618fcd4ead4ba80488573f"} Apr 22 18:01:58.270611 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.270474 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:01:58.272087 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.272067 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" Apr 22 18:01:58.272193 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.272066 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb2xdwv" event={"ID":"4de37504-ad8c-48fd-b7b1-a2fb61610043","Type":"ContainerDied","Data":"284be0f141671521aaaa0114f673b61f5ea7b68e860c976f52f90a95f2921396"} Apr 22 18:01:58.272193 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.272169 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="284be0f141671521aaaa0114f673b61f5ea7b68e860c976f52f90a95f2921396" Apr 22 18:01:58.295553 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:58.295507 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" podStartSLOduration=1.420608702 podStartE2EDuration="4.295496398s" podCreationTimestamp="2026-04-22 18:01:54 +0000 UTC" firstStartedPulling="2026-04-22 18:01:55.12852047 +0000 UTC m=+535.187423590" lastFinishedPulling="2026-04-22 18:01:58.003408155 +0000 UTC m=+538.062311286" observedRunningTime="2026-04-22 18:01:58.293844857 +0000 UTC m=+538.352747998" watchObservedRunningTime="2026-04-22 18:01:58.295496398 +0000 UTC m=+538.354399540" Apr 22 18:01:59.277220 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:01:59.277185 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:02:01.018806 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.018769 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc"] Apr 22 18:02:01.019165 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.019072 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="4de37504-ad8c-48fd-b7b1-a2fb61610043" containerName="util" Apr 22 18:02:01.019165 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.019083 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de37504-ad8c-48fd-b7b1-a2fb61610043" containerName="util" Apr 22 18:02:01.019165 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.019103 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4de37504-ad8c-48fd-b7b1-a2fb61610043" containerName="pull" Apr 22 18:02:01.019165 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.019108 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de37504-ad8c-48fd-b7b1-a2fb61610043" containerName="pull" Apr 22 18:02:01.019165 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.019114 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4de37504-ad8c-48fd-b7b1-a2fb61610043" containerName="extract" Apr 22 18:02:01.019165 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.019120 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de37504-ad8c-48fd-b7b1-a2fb61610043" containerName="extract" Apr 22 18:02:01.019347 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.019169 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="4de37504-ad8c-48fd-b7b1-a2fb61610043" containerName="extract" Apr 22 18:02:01.025215 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.025193 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.028167 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.028137 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-lsqcs\"" Apr 22 18:02:01.045019 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.044995 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc"] Apr 22 18:02:01.162388 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.162354 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.162526 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.162466 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.162578 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.162521 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.162578 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.162551 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.162650 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.162589 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istio-envoy\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.162650 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.162620 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.162756 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.162646 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.162756 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.162721 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-credential-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.162756 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.162748 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwmr8\" (UniqueName: \"kubernetes.io/projected/cf03a492-f3f9-4b00-b992-f7ad5d885fea-kube-api-access-pwmr8\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.263629 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.263594 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.263827 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.263636 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.263827 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.263659 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.263827 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.263718 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istio-envoy\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.263827 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.263748 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.263827 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.263776 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.263827 
ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.263814 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-credential-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.264123 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.263843 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwmr8\" (UniqueName: \"kubernetes.io/projected/cf03a492-f3f9-4b00-b992-f7ad5d885fea-kube-api-access-pwmr8\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.264123 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.263881 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.264123 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.264090 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.264332 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.264236 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.264389 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.264354 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-credential-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.264443 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.264419 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.264630 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.264608 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.266272 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.266241 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istio-envoy\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.266426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.266411 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.273525 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.273474 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cf03a492-f3f9-4b00-b992-f7ad5d885fea-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.273914 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.273891 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwmr8\" (UniqueName: \"kubernetes.io/projected/cf03a492-f3f9-4b00-b992-f7ad5d885fea-kube-api-access-pwmr8\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc\" (UID: \"cf03a492-f3f9-4b00-b992-f7ad5d885fea\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.337408 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.337377 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:01.464782 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:01.464760 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc"] Apr 22 18:02:01.467070 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:02:01.467039 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf03a492_f3f9_4b00_b992_f7ad5d885fea.slice/crio-4e6f69fde6ac87c094aa2c6ba51a7fdff0f80d515233bbf5971215ff491d9f3c WatchSource:0}: Error finding container 4e6f69fde6ac87c094aa2c6ba51a7fdff0f80d515233bbf5971215ff491d9f3c: Status 404 returned error can't find the container with id 4e6f69fde6ac87c094aa2c6ba51a7fdff0f80d515233bbf5971215ff491d9f3c Apr 22 18:02:02.287492 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:02.287461 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" event={"ID":"cf03a492-f3f9-4b00-b992-f7ad5d885fea","Type":"ContainerStarted","Data":"4e6f69fde6ac87c094aa2c6ba51a7fdff0f80d515233bbf5971215ff491d9f3c"} Apr 22 18:02:04.446949 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:04.446909 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 22 18:02:04.447227 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:04.446982 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 22 18:02:04.447227 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:04.447008 2564 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 22 18:02:05.298647 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:05.298569 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" event={"ID":"cf03a492-f3f9-4b00-b992-f7ad5d885fea","Type":"ContainerStarted","Data":"d56b7f9ccccbc314b0fe526d465971b148cebc7b309b61b13ba95bfb6203ce6c"} Apr 22 18:02:05.332090 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:05.332038 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" podStartSLOduration=2.354801996 podStartE2EDuration="5.332025227s" podCreationTimestamp="2026-04-22 18:02:00 +0000 UTC" firstStartedPulling="2026-04-22 18:02:01.46947006 +0000 UTC m=+541.528373181" lastFinishedPulling="2026-04-22 18:02:04.446693278 +0000 UTC m=+544.505596412" observedRunningTime="2026-04-22 18:02:05.327571559 +0000 UTC m=+545.386474701" watchObservedRunningTime="2026-04-22 18:02:05.332025227 +0000 UTC m=+545.390928369" Apr 22 18:02:05.337834 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:05.337811 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:05.342294 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:05.342273 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:06.263499 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:06.263469 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-z29q8" Apr 22 18:02:06.302762 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:06.302734 2564 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:06.303694 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:06.303658 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc" Apr 22 18:02:11.097088 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.097048 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn"] Apr 22 18:02:11.100982 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.100966 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" Apr 22 18:02:11.103853 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.103825 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:02:11.103853 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.103825 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:02:11.104900 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.104873 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-fkw8k\"" Apr 22 18:02:11.110904 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.110883 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn"] Apr 22 18:02:11.134112 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.134092 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-bundle\") 
pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn\" (UID: \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" Apr 22 18:02:11.134214 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.134118 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppfvb\" (UniqueName: \"kubernetes.io/projected/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-kube-api-access-ppfvb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn\" (UID: \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" Apr 22 18:02:11.134278 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.134210 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn\" (UID: \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" Apr 22 18:02:11.198333 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.198309 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57"] Apr 22 18:02:11.201842 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.201828 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" Apr 22 18:02:11.222998 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.222975 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57"] Apr 22 18:02:11.235060 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.235039 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn\" (UID: \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" Apr 22 18:02:11.235150 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.235070 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/627e612f-5a42-4b14-9d95-aacb32616ce1-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57\" (UID: \"627e612f-5a42-4b14-9d95-aacb32616ce1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" Apr 22 18:02:11.235150 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.235113 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdvwf\" (UniqueName: \"kubernetes.io/projected/627e612f-5a42-4b14-9d95-aacb32616ce1-kube-api-access-jdvwf\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57\" (UID: \"627e612f-5a42-4b14-9d95-aacb32616ce1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" Apr 22 18:02:11.235238 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.235149 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn\" (UID: \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" Apr 22 18:02:11.235238 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.235176 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppfvb\" (UniqueName: \"kubernetes.io/projected/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-kube-api-access-ppfvb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn\" (UID: \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" Apr 22 18:02:11.235339 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.235267 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/627e612f-5a42-4b14-9d95-aacb32616ce1-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57\" (UID: \"627e612f-5a42-4b14-9d95-aacb32616ce1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" Apr 22 18:02:11.235394 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.235379 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn\" (UID: \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" Apr 22 18:02:11.235446 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.235421 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn\" (UID: \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" Apr 22 18:02:11.249917 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.249897 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppfvb\" (UniqueName: \"kubernetes.io/projected/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-kube-api-access-ppfvb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn\" (UID: \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" Apr 22 18:02:11.325974 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.325948 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g"] Apr 22 18:02:11.329630 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.329615 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" Apr 22 18:02:11.335692 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.335655 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdvwf\" (UniqueName: \"kubernetes.io/projected/627e612f-5a42-4b14-9d95-aacb32616ce1-kube-api-access-jdvwf\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57\" (UID: \"627e612f-5a42-4b14-9d95-aacb32616ce1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" Apr 22 18:02:11.336126 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.336106 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g\" (UID: \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" Apr 22 18:02:11.336276 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.336255 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr4h5\" (UniqueName: \"kubernetes.io/projected/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-kube-api-access-pr4h5\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g\" (UID: \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" Apr 22 18:02:11.336438 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.336413 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/627e612f-5a42-4b14-9d95-aacb32616ce1-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57\" (UID: 
\"627e612f-5a42-4b14-9d95-aacb32616ce1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" Apr 22 18:02:11.336584 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.336565 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g\" (UID: \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" Apr 22 18:02:11.336809 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.336787 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/627e612f-5a42-4b14-9d95-aacb32616ce1-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57\" (UID: \"627e612f-5a42-4b14-9d95-aacb32616ce1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" Apr 22 18:02:11.336809 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.336800 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/627e612f-5a42-4b14-9d95-aacb32616ce1-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57\" (UID: \"627e612f-5a42-4b14-9d95-aacb32616ce1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" Apr 22 18:02:11.337131 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.337115 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/627e612f-5a42-4b14-9d95-aacb32616ce1-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57\" (UID: \"627e612f-5a42-4b14-9d95-aacb32616ce1\") " 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" Apr 22 18:02:11.338850 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.338829 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g"] Apr 22 18:02:11.345901 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.345879 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdvwf\" (UniqueName: \"kubernetes.io/projected/627e612f-5a42-4b14-9d95-aacb32616ce1-kube-api-access-jdvwf\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57\" (UID: \"627e612f-5a42-4b14-9d95-aacb32616ce1\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" Apr 22 18:02:11.386656 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.386602 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn"] Apr 22 18:02:11.390168 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.390150 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" Apr 22 18:02:11.398258 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.398227 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn"] Apr 22 18:02:11.410970 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.410949 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" Apr 22 18:02:11.437125 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.437101 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8df08f1-2497-4d01-9394-97ab597622b4-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn\" (UID: \"a8df08f1-2497-4d01-9394-97ab597622b4\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" Apr 22 18:02:11.437259 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.437132 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8df08f1-2497-4d01-9394-97ab597622b4-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn\" (UID: \"a8df08f1-2497-4d01-9394-97ab597622b4\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" Apr 22 18:02:11.437259 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.437186 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g\" (UID: \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" Apr 22 18:02:11.437259 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.437205 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7g76\" (UniqueName: \"kubernetes.io/projected/a8df08f1-2497-4d01-9394-97ab597622b4-kube-api-access-l7g76\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn\" (UID: \"a8df08f1-2497-4d01-9394-97ab597622b4\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" Apr 22 18:02:11.437259 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.437227 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pr4h5\" (UniqueName: \"kubernetes.io/projected/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-kube-api-access-pr4h5\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g\" (UID: \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" Apr 22 18:02:11.437486 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.437259 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g\" (UID: \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" Apr 22 18:02:11.437486 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.437472 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g\" (UID: \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" Apr 22 18:02:11.437594 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.437505 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g\" (UID: \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" 
Apr 22 18:02:11.445034 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.445011 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr4h5\" (UniqueName: \"kubernetes.io/projected/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-kube-api-access-pr4h5\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g\" (UID: \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" Apr 22 18:02:11.510783 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.510758 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" Apr 22 18:02:11.532697 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.532646 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn"] Apr 22 18:02:11.534909 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:02:11.534882 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f3055bf_1ff1_4ac9_892a_26b73b55f9cb.slice/crio-d4fdd12f9f2f7784bd96653b24f999879244f4be6b695160d271256ca00b825b WatchSource:0}: Error finding container d4fdd12f9f2f7784bd96653b24f999879244f4be6b695160d271256ca00b825b: Status 404 returned error can't find the container with id d4fdd12f9f2f7784bd96653b24f999879244f4be6b695160d271256ca00b825b Apr 22 18:02:11.537786 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.537764 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8df08f1-2497-4d01-9394-97ab597622b4-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn\" (UID: \"a8df08f1-2497-4d01-9394-97ab597622b4\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" Apr 22 18:02:11.537887 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.537807 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8df08f1-2497-4d01-9394-97ab597622b4-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn\" (UID: \"a8df08f1-2497-4d01-9394-97ab597622b4\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" Apr 22 18:02:11.537948 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.537887 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7g76\" (UniqueName: \"kubernetes.io/projected/a8df08f1-2497-4d01-9394-97ab597622b4-kube-api-access-l7g76\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn\" (UID: \"a8df08f1-2497-4d01-9394-97ab597622b4\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" Apr 22 18:02:11.538154 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.538130 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8df08f1-2497-4d01-9394-97ab597622b4-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn\" (UID: \"a8df08f1-2497-4d01-9394-97ab597622b4\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" Apr 22 18:02:11.538221 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.538175 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8df08f1-2497-4d01-9394-97ab597622b4-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn\" (UID: \"a8df08f1-2497-4d01-9394-97ab597622b4\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" 
Apr 22 18:02:11.547882 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.547860 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7g76\" (UniqueName: \"kubernetes.io/projected/a8df08f1-2497-4d01-9394-97ab597622b4-kube-api-access-l7g76\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn\" (UID: \"a8df08f1-2497-4d01-9394-97ab597622b4\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" Apr 22 18:02:11.637602 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.637578 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57"] Apr 22 18:02:11.639219 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.639201 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" Apr 22 18:02:11.639508 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:02:11.639484 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod627e612f_5a42_4b14_9d95_aacb32616ce1.slice/crio-11a6a29491dca6a66bb0a362b94a45f57f364d2e528b225beeb5ad659b3f11dc WatchSource:0}: Error finding container 11a6a29491dca6a66bb0a362b94a45f57f364d2e528b225beeb5ad659b3f11dc: Status 404 returned error can't find the container with id 11a6a29491dca6a66bb0a362b94a45f57f364d2e528b225beeb5ad659b3f11dc Apr 22 18:02:11.701622 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.701597 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" Apr 22 18:02:11.819294 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.818256 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g"] Apr 22 18:02:11.853722 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:11.853608 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn"] Apr 22 18:02:11.855580 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:02:11.855547 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a3d834f_f6d0_40c2_8c09_5a5f39de8e7d.slice/crio-3b3f64999493dbfa98bc71269dcb2551a4285ae638f02c943ab5db90df66bace WatchSource:0}: Error finding container 3b3f64999493dbfa98bc71269dcb2551a4285ae638f02c943ab5db90df66bace: Status 404 returned error can't find the container with id 3b3f64999493dbfa98bc71269dcb2551a4285ae638f02c943ab5db90df66bace Apr 22 18:02:11.856010 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:02:11.855986 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8df08f1_2497_4d01_9394_97ab597622b4.slice/crio-88a3bf6fad3e2aa6a5332128bc2bcd94d31276bffa37e3808dda8f8fe5f29675 WatchSource:0}: Error finding container 88a3bf6fad3e2aa6a5332128bc2bcd94d31276bffa37e3808dda8f8fe5f29675: Status 404 returned error can't find the container with id 88a3bf6fad3e2aa6a5332128bc2bcd94d31276bffa37e3808dda8f8fe5f29675 Apr 22 18:02:12.325276 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:12.325240 2564 generic.go:358] "Generic (PLEG): container finished" podID="6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d" containerID="a2f04627da6326236571753d33d34b9046120b76fa305150db062a1d5e09803d" exitCode=0 Apr 22 18:02:12.325690 
ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:12.325331 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" event={"ID":"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d","Type":"ContainerDied","Data":"a2f04627da6326236571753d33d34b9046120b76fa305150db062a1d5e09803d"} Apr 22 18:02:12.325690 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:12.325361 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" event={"ID":"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d","Type":"ContainerStarted","Data":"3b3f64999493dbfa98bc71269dcb2551a4285ae638f02c943ab5db90df66bace"} Apr 22 18:02:12.326747 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:12.326696 2564 generic.go:358] "Generic (PLEG): container finished" podID="a8df08f1-2497-4d01-9394-97ab597622b4" containerID="10cac9f8304de1a8342f5809f747c4704e1bd354146cf7e99ddb6b53be4de6bf" exitCode=0 Apr 22 18:02:12.326814 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:12.326789 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" event={"ID":"a8df08f1-2497-4d01-9394-97ab597622b4","Type":"ContainerDied","Data":"10cac9f8304de1a8342f5809f747c4704e1bd354146cf7e99ddb6b53be4de6bf"} Apr 22 18:02:12.326871 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:12.326826 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" event={"ID":"a8df08f1-2497-4d01-9394-97ab597622b4","Type":"ContainerStarted","Data":"88a3bf6fad3e2aa6a5332128bc2bcd94d31276bffa37e3808dda8f8fe5f29675"} Apr 22 18:02:12.328197 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:12.328176 2564 generic.go:358] "Generic (PLEG): container finished" podID="627e612f-5a42-4b14-9d95-aacb32616ce1" 
containerID="1d7323d21b1911eab14dbd6a1af59024c036ab50673418b3b7765edd60f3c40c" exitCode=0 Apr 22 18:02:12.328290 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:12.328258 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" event={"ID":"627e612f-5a42-4b14-9d95-aacb32616ce1","Type":"ContainerDied","Data":"1d7323d21b1911eab14dbd6a1af59024c036ab50673418b3b7765edd60f3c40c"} Apr 22 18:02:12.328290 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:12.328287 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" event={"ID":"627e612f-5a42-4b14-9d95-aacb32616ce1","Type":"ContainerStarted","Data":"11a6a29491dca6a66bb0a362b94a45f57f364d2e528b225beeb5ad659b3f11dc"} Apr 22 18:02:12.329761 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:12.329732 2564 generic.go:358] "Generic (PLEG): container finished" podID="6f3055bf-1ff1-4ac9-892a-26b73b55f9cb" containerID="e4ea5f8b550dc71846630e0b7dea4a23a9329ad5e764f895f0bc0ee9e07817d7" exitCode=0 Apr 22 18:02:12.329857 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:12.329783 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" event={"ID":"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb","Type":"ContainerDied","Data":"e4ea5f8b550dc71846630e0b7dea4a23a9329ad5e764f895f0bc0ee9e07817d7"} Apr 22 18:02:12.329857 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:12.329811 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" event={"ID":"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb","Type":"ContainerStarted","Data":"d4fdd12f9f2f7784bd96653b24f999879244f4be6b695160d271256ca00b825b"} Apr 22 18:02:14.341002 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:14.340971 2564 generic.go:358] "Generic 
(PLEG): container finished" podID="6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d" containerID="8ac05999dd96f60bf93bcaef742686981288a28138d63054e1547c5310981b70" exitCode=0 Apr 22 18:02:14.341423 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:14.341065 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" event={"ID":"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d","Type":"ContainerDied","Data":"8ac05999dd96f60bf93bcaef742686981288a28138d63054e1547c5310981b70"} Apr 22 18:02:14.342593 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:14.342573 2564 generic.go:358] "Generic (PLEG): container finished" podID="a8df08f1-2497-4d01-9394-97ab597622b4" containerID="c4be9d4f2767867a852dbbbac52403fc01e98332e3a9c0e21247751e7d6e8d03" exitCode=0 Apr 22 18:02:14.342686 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:14.342652 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" event={"ID":"a8df08f1-2497-4d01-9394-97ab597622b4","Type":"ContainerDied","Data":"c4be9d4f2767867a852dbbbac52403fc01e98332e3a9c0e21247751e7d6e8d03"} Apr 22 18:02:14.344216 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:14.344185 2564 generic.go:358] "Generic (PLEG): container finished" podID="627e612f-5a42-4b14-9d95-aacb32616ce1" containerID="80437b3ed9add025029a01ff0046b746bc307e1ef776df6423a989756c0db270" exitCode=0 Apr 22 18:02:14.344310 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:14.344226 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" event={"ID":"627e612f-5a42-4b14-9d95-aacb32616ce1","Type":"ContainerDied","Data":"80437b3ed9add025029a01ff0046b746bc307e1ef776df6423a989756c0db270"} Apr 22 18:02:14.345695 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:14.345571 2564 generic.go:358] "Generic (PLEG): container finished" 
podID="6f3055bf-1ff1-4ac9-892a-26b73b55f9cb" containerID="53f7152954bbe8706869bcc6cc9bb6108dff8252788369e9424c417ee07d2111" exitCode=0 Apr 22 18:02:14.345695 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:14.345631 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" event={"ID":"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb","Type":"ContainerDied","Data":"53f7152954bbe8706869bcc6cc9bb6108dff8252788369e9424c417ee07d2111"} Apr 22 18:02:15.351407 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:15.351372 2564 generic.go:358] "Generic (PLEG): container finished" podID="6f3055bf-1ff1-4ac9-892a-26b73b55f9cb" containerID="78e2b1009f8d2dbe860f96d4166684906eef58a050bfb208880614efaab7f575" exitCode=0 Apr 22 18:02:15.351807 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:15.351465 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" event={"ID":"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb","Type":"ContainerDied","Data":"78e2b1009f8d2dbe860f96d4166684906eef58a050bfb208880614efaab7f575"} Apr 22 18:02:15.353154 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:15.353129 2564 generic.go:358] "Generic (PLEG): container finished" podID="6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d" containerID="87d08b98bdc088aa53f5e1c280d77bc7dc5b25b12f25fdbe437f51e88b6011a5" exitCode=0 Apr 22 18:02:15.353259 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:15.353210 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" event={"ID":"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d","Type":"ContainerDied","Data":"87d08b98bdc088aa53f5e1c280d77bc7dc5b25b12f25fdbe437f51e88b6011a5"} Apr 22 18:02:15.354796 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:15.354771 2564 generic.go:358] "Generic (PLEG): container finished" 
podID="a8df08f1-2497-4d01-9394-97ab597622b4" containerID="7c742f10e9dd53fdc3b4fbfbcad7481f0e2d71ef7fe46666a5d822f69f6cea6e" exitCode=0 Apr 22 18:02:15.354877 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:15.354850 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" event={"ID":"a8df08f1-2497-4d01-9394-97ab597622b4","Type":"ContainerDied","Data":"7c742f10e9dd53fdc3b4fbfbcad7481f0e2d71ef7fe46666a5d822f69f6cea6e"} Apr 22 18:02:15.356516 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:15.356498 2564 generic.go:358] "Generic (PLEG): container finished" podID="627e612f-5a42-4b14-9d95-aacb32616ce1" containerID="e81d4e9bba72a14825eb83e74b1393b85e1bbe1a5b164fa692eed0c9e6d7dd50" exitCode=0 Apr 22 18:02:15.356597 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:15.356529 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" event={"ID":"627e612f-5a42-4b14-9d95-aacb32616ce1","Type":"ContainerDied","Data":"e81d4e9bba72a14825eb83e74b1393b85e1bbe1a5b164fa692eed0c9e6d7dd50"} Apr 22 18:02:16.491047 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.490914 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" Apr 22 18:02:16.555572 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.555553 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" Apr 22 18:02:16.559514 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.559499 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" Apr 22 18:02:16.562946 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.562931 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" Apr 22 18:02:16.576896 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.576864 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-util\") pod \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\" (UID: \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\") " Apr 22 18:02:16.576984 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.576922 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdvwf\" (UniqueName: \"kubernetes.io/projected/627e612f-5a42-4b14-9d95-aacb32616ce1-kube-api-access-jdvwf\") pod \"627e612f-5a42-4b14-9d95-aacb32616ce1\" (UID: \"627e612f-5a42-4b14-9d95-aacb32616ce1\") " Apr 22 18:02:16.576984 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.576963 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/627e612f-5a42-4b14-9d95-aacb32616ce1-bundle\") pod \"627e612f-5a42-4b14-9d95-aacb32616ce1\" (UID: \"627e612f-5a42-4b14-9d95-aacb32616ce1\") " Apr 22 18:02:16.577091 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.576997 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppfvb\" (UniqueName: \"kubernetes.io/projected/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-kube-api-access-ppfvb\") pod \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\" (UID: \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\") " Apr 22 18:02:16.577091 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.577022 2564 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-bundle\") pod \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\" (UID: \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\") "
Apr 22 18:02:16.577091 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.577057 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7g76\" (UniqueName: \"kubernetes.io/projected/a8df08f1-2497-4d01-9394-97ab597622b4-kube-api-access-l7g76\") pod \"a8df08f1-2497-4d01-9394-97ab597622b4\" (UID: \"a8df08f1-2497-4d01-9394-97ab597622b4\") "
Apr 22 18:02:16.577239 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.577090 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr4h5\" (UniqueName: \"kubernetes.io/projected/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-kube-api-access-pr4h5\") pod \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\" (UID: \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\") "
Apr 22 18:02:16.577239 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.577119 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-util\") pod \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\" (UID: \"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d\") "
Apr 22 18:02:16.577239 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.577164 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8df08f1-2497-4d01-9394-97ab597622b4-util\") pod \"a8df08f1-2497-4d01-9394-97ab597622b4\" (UID: \"a8df08f1-2497-4d01-9394-97ab597622b4\") "
Apr 22 18:02:16.577239 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.577215 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8df08f1-2497-4d01-9394-97ab597622b4-bundle\") pod \"a8df08f1-2497-4d01-9394-97ab597622b4\" (UID: \"a8df08f1-2497-4d01-9394-97ab597622b4\") "
Apr 22 18:02:16.577439 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.577243 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/627e612f-5a42-4b14-9d95-aacb32616ce1-util\") pod \"627e612f-5a42-4b14-9d95-aacb32616ce1\" (UID: \"627e612f-5a42-4b14-9d95-aacb32616ce1\") "
Apr 22 18:02:16.577439 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.577278 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-bundle\") pod \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\" (UID: \"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb\") "
Apr 22 18:02:16.579631 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.579601 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-bundle" (OuterVolumeSpecName: "bundle") pod "6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d" (UID: "6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:02:16.580197 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.579850 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627e612f-5a42-4b14-9d95-aacb32616ce1-bundle" (OuterVolumeSpecName: "bundle") pod "627e612f-5a42-4b14-9d95-aacb32616ce1" (UID: "627e612f-5a42-4b14-9d95-aacb32616ce1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:02:16.580197 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.579956 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627e612f-5a42-4b14-9d95-aacb32616ce1-kube-api-access-jdvwf" (OuterVolumeSpecName: "kube-api-access-jdvwf") pod "627e612f-5a42-4b14-9d95-aacb32616ce1" (UID: "627e612f-5a42-4b14-9d95-aacb32616ce1"). InnerVolumeSpecName "kube-api-access-jdvwf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:02:16.581094 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.580592 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8df08f1-2497-4d01-9394-97ab597622b4-bundle" (OuterVolumeSpecName: "bundle") pod "a8df08f1-2497-4d01-9394-97ab597622b4" (UID: "a8df08f1-2497-4d01-9394-97ab597622b4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:02:16.581191 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.581112 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-bundle" (OuterVolumeSpecName: "bundle") pod "6f3055bf-1ff1-4ac9-892a-26b73b55f9cb" (UID: "6f3055bf-1ff1-4ac9-892a-26b73b55f9cb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:02:16.582225 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.582200 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8df08f1-2497-4d01-9394-97ab597622b4-kube-api-access-l7g76" (OuterVolumeSpecName: "kube-api-access-l7g76") pod "a8df08f1-2497-4d01-9394-97ab597622b4" (UID: "a8df08f1-2497-4d01-9394-97ab597622b4"). InnerVolumeSpecName "kube-api-access-l7g76". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:02:16.585266 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.585228 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-util" (OuterVolumeSpecName: "util") pod "6f3055bf-1ff1-4ac9-892a-26b73b55f9cb" (UID: "6f3055bf-1ff1-4ac9-892a-26b73b55f9cb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:02:16.587113 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.587089 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-kube-api-access-ppfvb" (OuterVolumeSpecName: "kube-api-access-ppfvb") pod "6f3055bf-1ff1-4ac9-892a-26b73b55f9cb" (UID: "6f3055bf-1ff1-4ac9-892a-26b73b55f9cb"). InnerVolumeSpecName "kube-api-access-ppfvb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:02:16.587192 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.587114 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-kube-api-access-pr4h5" (OuterVolumeSpecName: "kube-api-access-pr4h5") pod "6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d" (UID: "6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d"). InnerVolumeSpecName "kube-api-access-pr4h5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:02:16.587503 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.587478 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-util" (OuterVolumeSpecName: "util") pod "6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d" (UID: "6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:02:16.587992 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.587970 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8df08f1-2497-4d01-9394-97ab597622b4-util" (OuterVolumeSpecName: "util") pod "a8df08f1-2497-4d01-9394-97ab597622b4" (UID: "a8df08f1-2497-4d01-9394-97ab597622b4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:02:16.588894 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.588869 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627e612f-5a42-4b14-9d95-aacb32616ce1-util" (OuterVolumeSpecName: "util") pod "627e612f-5a42-4b14-9d95-aacb32616ce1" (UID: "627e612f-5a42-4b14-9d95-aacb32616ce1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:02:16.678910 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.678887 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/627e612f-5a42-4b14-9d95-aacb32616ce1-bundle\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:02:16.679032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.678914 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ppfvb\" (UniqueName: \"kubernetes.io/projected/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-kube-api-access-ppfvb\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:02:16.679032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.678926 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-bundle\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:02:16.679032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.678935 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l7g76\" (UniqueName: \"kubernetes.io/projected/a8df08f1-2497-4d01-9394-97ab597622b4-kube-api-access-l7g76\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:02:16.679032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.678946 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pr4h5\" (UniqueName: \"kubernetes.io/projected/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-kube-api-access-pr4h5\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:02:16.679032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.678954 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d-util\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:02:16.679032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.678962 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8df08f1-2497-4d01-9394-97ab597622b4-util\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:02:16.679032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.678970 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8df08f1-2497-4d01-9394-97ab597622b4-bundle\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:02:16.679032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.678977 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/627e612f-5a42-4b14-9d95-aacb32616ce1-util\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:02:16.679032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.678985 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-bundle\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:02:16.679032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.678992 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f3055bf-1ff1-4ac9-892a-26b73b55f9cb-util\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:02:16.679032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:16.679000 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jdvwf\" (UniqueName: \"kubernetes.io/projected/627e612f-5a42-4b14-9d95-aacb32616ce1-kube-api-access-jdvwf\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:02:17.365638 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:17.365608 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn"
Apr 22 18:02:17.365836 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:17.365609 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503k2dvn" event={"ID":"a8df08f1-2497-4d01-9394-97ab597622b4","Type":"ContainerDied","Data":"88a3bf6fad3e2aa6a5332128bc2bcd94d31276bffa37e3808dda8f8fe5f29675"}
Apr 22 18:02:17.365836 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:17.365708 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88a3bf6fad3e2aa6a5332128bc2bcd94d31276bffa37e3808dda8f8fe5f29675"
Apr 22 18:02:17.367361 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:17.367327 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57" event={"ID":"627e612f-5a42-4b14-9d95-aacb32616ce1","Type":"ContainerDied","Data":"11a6a29491dca6a66bb0a362b94a45f57f364d2e528b225beeb5ad659b3f11dc"}
Apr 22 18:02:17.367361 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:17.367356 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11a6a29491dca6a66bb0a362b94a45f57f364d2e528b225beeb5ad659b3f11dc"
Apr 22 18:02:17.367517 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:17.367370 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30jgm57"
Apr 22 18:02:17.369189 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:17.369161 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn"
Apr 22 18:02:17.369189 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:17.369178 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvsldn" event={"ID":"6f3055bf-1ff1-4ac9-892a-26b73b55f9cb","Type":"ContainerDied","Data":"d4fdd12f9f2f7784bd96653b24f999879244f4be6b695160d271256ca00b825b"}
Apr 22 18:02:17.369345 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:17.369203 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4fdd12f9f2f7784bd96653b24f999879244f4be6b695160d271256ca00b825b"
Apr 22 18:02:17.371045 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:17.371024 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g" event={"ID":"6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d","Type":"ContainerDied","Data":"3b3f64999493dbfa98bc71269dcb2551a4285ae638f02c943ab5db90df66bace"}
Apr 22 18:02:17.371125 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:17.371050 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b3f64999493dbfa98bc71269dcb2551a4285ae638f02c943ab5db90df66bace"
Apr 22 18:02:17.371125 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:17.371076 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8845l8g"
Apr 22 18:02:23.084770 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.084739 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wt7bn"]
Apr 22 18:02:23.085135 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085067 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f3055bf-1ff1-4ac9-892a-26b73b55f9cb" containerName="util"
Apr 22 18:02:23.085135 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085082 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3055bf-1ff1-4ac9-892a-26b73b55f9cb" containerName="util"
Apr 22 18:02:23.085135 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085092 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="627e612f-5a42-4b14-9d95-aacb32616ce1" containerName="pull"
Apr 22 18:02:23.085135 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085097 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="627e612f-5a42-4b14-9d95-aacb32616ce1" containerName="pull"
Apr 22 18:02:23.085135 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085106 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8df08f1-2497-4d01-9394-97ab597622b4" containerName="pull"
Apr 22 18:02:23.085135 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085112 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8df08f1-2497-4d01-9394-97ab597622b4" containerName="pull"
Apr 22 18:02:23.085135 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085121 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f3055bf-1ff1-4ac9-892a-26b73b55f9cb" containerName="extract"
Apr 22 18:02:23.085135 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085127 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3055bf-1ff1-4ac9-892a-26b73b55f9cb" containerName="extract"
Apr 22 18:02:23.085135 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085133 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8df08f1-2497-4d01-9394-97ab597622b4" containerName="extract"
Apr 22 18:02:23.085135 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085138 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8df08f1-2497-4d01-9394-97ab597622b4" containerName="extract"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085148 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f3055bf-1ff1-4ac9-892a-26b73b55f9cb" containerName="pull"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085155 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3055bf-1ff1-4ac9-892a-26b73b55f9cb" containerName="pull"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085169 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d" containerName="extract"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085174 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d" containerName="extract"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085181 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d" containerName="util"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085186 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d" containerName="util"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085191 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8df08f1-2497-4d01-9394-97ab597622b4" containerName="util"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085196 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8df08f1-2497-4d01-9394-97ab597622b4" containerName="util"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085202 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d" containerName="pull"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085206 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d" containerName="pull"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085212 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="627e612f-5a42-4b14-9d95-aacb32616ce1" containerName="extract"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085216 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="627e612f-5a42-4b14-9d95-aacb32616ce1" containerName="extract"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085222 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="627e612f-5a42-4b14-9d95-aacb32616ce1" containerName="util"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085226 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="627e612f-5a42-4b14-9d95-aacb32616ce1" containerName="util"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085272 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="627e612f-5a42-4b14-9d95-aacb32616ce1" containerName="extract"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085281 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a3d834f-f6d0-40c2-8c09-5a5f39de8e7d" containerName="extract"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085288 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8df08f1-2497-4d01-9394-97ab597622b4" containerName="extract"
Apr 22 18:02:23.085426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.085303 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f3055bf-1ff1-4ac9-892a-26b73b55f9cb" containerName="extract"
Apr 22 18:02:23.089441 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.089424 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wt7bn"
Apr 22 18:02:23.092806 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.092789 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 18:02:23.093736 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.093710 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 18:02:23.093817 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.093794 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-7pmkw\""
Apr 22 18:02:23.112775 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.112748 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wt7bn"]
Apr 22 18:02:23.125792 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.125762 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hqjm\" (UniqueName: \"kubernetes.io/projected/2c6f1aaa-3f4a-4629-af01-23439a1be786-kube-api-access-5hqjm\") pod \"limitador-operator-controller-manager-c7fb4c8d5-wt7bn\" (UID: \"2c6f1aaa-3f4a-4629-af01-23439a1be786\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wt7bn"
Apr 22 18:02:23.227109 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.227077 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hqjm\" (UniqueName: \"kubernetes.io/projected/2c6f1aaa-3f4a-4629-af01-23439a1be786-kube-api-access-5hqjm\") pod \"limitador-operator-controller-manager-c7fb4c8d5-wt7bn\" (UID: \"2c6f1aaa-3f4a-4629-af01-23439a1be786\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wt7bn"
Apr 22 18:02:23.241923 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.241893 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hqjm\" (UniqueName: \"kubernetes.io/projected/2c6f1aaa-3f4a-4629-af01-23439a1be786-kube-api-access-5hqjm\") pod \"limitador-operator-controller-manager-c7fb4c8d5-wt7bn\" (UID: \"2c6f1aaa-3f4a-4629-af01-23439a1be786\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wt7bn"
Apr 22 18:02:23.399659 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.399585 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wt7bn"
Apr 22 18:02:23.516268 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:23.516237 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wt7bn"]
Apr 22 18:02:23.518173 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:02:23.518145 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c6f1aaa_3f4a_4629_af01_23439a1be786.slice/crio-59ddb7dba5c27f4fb61e873c6a793c14d6d76f17a0a5ad179a7f2b3112be0b7d WatchSource:0}: Error finding container 59ddb7dba5c27f4fb61e873c6a793c14d6d76f17a0a5ad179a7f2b3112be0b7d: Status 404 returned error can't find the container with id 59ddb7dba5c27f4fb61e873c6a793c14d6d76f17a0a5ad179a7f2b3112be0b7d
Apr 22 18:02:24.396909 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:24.396844 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wt7bn" event={"ID":"2c6f1aaa-3f4a-4629-af01-23439a1be786","Type":"ContainerStarted","Data":"59ddb7dba5c27f4fb61e873c6a793c14d6d76f17a0a5ad179a7f2b3112be0b7d"}
Apr 22 18:02:26.401291 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.401215 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99"]
Apr 22 18:02:26.404598 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.404575 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99"
Apr 22 18:02:26.405763 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.405738 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wt7bn" event={"ID":"2c6f1aaa-3f4a-4629-af01-23439a1be786","Type":"ContainerStarted","Data":"bbeb0d12b8bf80ac8c77a82fd6ddeba2f597c176206822f930d8479ab2fce2f8"}
Apr 22 18:02:26.405868 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.405843 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wt7bn"
Apr 22 18:02:26.407105 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.407092 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-cnj9k\""
Apr 22 18:02:26.419815 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.419793 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99"]
Apr 22 18:02:26.454783 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.454763 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr9zv\" (UniqueName: \"kubernetes.io/projected/f4c790fc-d1a0-4a75-86b4-64d57b85b767-kube-api-access-nr9zv\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-jvl99\" (UID: \"f4c790fc-d1a0-4a75-86b4-64d57b85b767\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99"
Apr 22 18:02:26.454889 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.454799 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f4c790fc-d1a0-4a75-86b4-64d57b85b767-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-jvl99\" (UID: \"f4c790fc-d1a0-4a75-86b4-64d57b85b767\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99"
Apr 22 18:02:26.480010 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.479965 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wt7bn" podStartSLOduration=1.073370235 podStartE2EDuration="3.479949061s" podCreationTimestamp="2026-04-22 18:02:23 +0000 UTC" firstStartedPulling="2026-04-22 18:02:23.520174388 +0000 UTC m=+563.579077508" lastFinishedPulling="2026-04-22 18:02:25.926753213 +0000 UTC m=+565.985656334" observedRunningTime="2026-04-22 18:02:26.478786794 +0000 UTC m=+566.537689934" watchObservedRunningTime="2026-04-22 18:02:26.479949061 +0000 UTC m=+566.538852199"
Apr 22 18:02:26.556223 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.556191 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nr9zv\" (UniqueName: \"kubernetes.io/projected/f4c790fc-d1a0-4a75-86b4-64d57b85b767-kube-api-access-nr9zv\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-jvl99\" (UID: \"f4c790fc-d1a0-4a75-86b4-64d57b85b767\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99"
Apr 22 18:02:26.556223 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.556225 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f4c790fc-d1a0-4a75-86b4-64d57b85b767-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-jvl99\" (UID: \"f4c790fc-d1a0-4a75-86b4-64d57b85b767\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99"
Apr 22 18:02:26.556555 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.556539 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f4c790fc-d1a0-4a75-86b4-64d57b85b767-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-jvl99\" (UID: \"f4c790fc-d1a0-4a75-86b4-64d57b85b767\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99"
Apr 22 18:02:26.565064 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.565043 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr9zv\" (UniqueName: \"kubernetes.io/projected/f4c790fc-d1a0-4a75-86b4-64d57b85b767-kube-api-access-nr9zv\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-jvl99\" (UID: \"f4c790fc-d1a0-4a75-86b4-64d57b85b767\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99"
Apr 22 18:02:26.714092 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.714062 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99"
Apr 22 18:02:26.848519 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:26.848476 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99"]
Apr 22 18:02:26.848756 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:02:26.848728 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4c790fc_d1a0_4a75_86b4_64d57b85b767.slice/crio-1602cbd7f7a662010bb390ac162c5d2941319b49161f90562011ad5d0f6f0d89 WatchSource:0}: Error finding container 1602cbd7f7a662010bb390ac162c5d2941319b49161f90562011ad5d0f6f0d89: Status 404 returned error can't find the container with id 1602cbd7f7a662010bb390ac162c5d2941319b49161f90562011ad5d0f6f0d89
Apr 22 18:02:27.410743 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:27.410695 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99" event={"ID":"f4c790fc-d1a0-4a75-86b4-64d57b85b767","Type":"ContainerStarted","Data":"1602cbd7f7a662010bb390ac162c5d2941319b49161f90562011ad5d0f6f0d89"}
Apr 22 18:02:31.428679 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:31.428632 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99" event={"ID":"f4c790fc-d1a0-4a75-86b4-64d57b85b767","Type":"ContainerStarted","Data":"fe783b5b883bb5810b439592a012a476d23cdc9d7a31b9263c4df61b53046c8b"}
Apr 22 18:02:31.429133 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:31.428710 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99"
Apr 22 18:02:31.456255 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:31.456199 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99" podStartSLOduration=1.539267977 podStartE2EDuration="5.456180396s" podCreationTimestamp="2026-04-22 18:02:26 +0000 UTC" firstStartedPulling="2026-04-22 18:02:26.851095939 +0000 UTC m=+566.909999059" lastFinishedPulling="2026-04-22 18:02:30.768008355 +0000 UTC m=+570.826911478" observedRunningTime="2026-04-22 18:02:31.453431777 +0000 UTC m=+571.512334916" watchObservedRunningTime="2026-04-22 18:02:31.456180396 +0000 UTC m=+571.515083540"
Apr 22 18:02:37.413725 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:37.413686 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wt7bn"
Apr 22 18:02:42.435192 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:02:42.435161 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-jvl99"
Apr 22 18:03:00.482825 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:00.482796 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log"
Apr 22 18:03:00.483820 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:00.483802 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log"
Apr 22 18:03:13.805823 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:13.805746 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-c779g"]
Apr 22 18:03:13.809504 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:13.809481 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g"
Apr 22 18:03:13.812985 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:13.812956 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 22 18:03:13.813327 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:13.813311 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-wdlw5\""
Apr 22 18:03:13.823719 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:13.823692 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-c779g"]
Apr 22 18:03:13.842421 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:13.842390 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4tgd\" (UniqueName: \"kubernetes.io/projected/52e1c9b4-d5e1-48e6-8e84-218b6ec29235-kube-api-access-s4tgd\") pod \"limitador-limitador-64c8f475fb-c779g\" (UID: \"52e1c9b4-d5e1-48e6-8e84-218b6ec29235\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g"
Apr 22 18:03:13.842532 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:13.842442 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/52e1c9b4-d5e1-48e6-8e84-218b6ec29235-config-file\") pod \"limitador-limitador-64c8f475fb-c779g\" (UID: \"52e1c9b4-d5e1-48e6-8e84-218b6ec29235\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g"
Apr 22 18:03:13.885576 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:13.885522 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-c779g"]
Apr 22 18:03:13.943756 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:13.943722 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4tgd\" (UniqueName: \"kubernetes.io/projected/52e1c9b4-d5e1-48e6-8e84-218b6ec29235-kube-api-access-s4tgd\") pod \"limitador-limitador-64c8f475fb-c779g\" (UID: \"52e1c9b4-d5e1-48e6-8e84-218b6ec29235\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g"
Apr 22 18:03:13.943928 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:13.943771 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/52e1c9b4-d5e1-48e6-8e84-218b6ec29235-config-file\") pod \"limitador-limitador-64c8f475fb-c779g\" (UID: \"52e1c9b4-d5e1-48e6-8e84-218b6ec29235\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g"
Apr 22 18:03:13.944316 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:13.944298 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/52e1c9b4-d5e1-48e6-8e84-218b6ec29235-config-file\") pod \"limitador-limitador-64c8f475fb-c779g\" (UID: \"52e1c9b4-d5e1-48e6-8e84-218b6ec29235\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g"
Apr 22 18:03:13.952419 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:13.952398
2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4tgd\" (UniqueName: \"kubernetes.io/projected/52e1c9b4-d5e1-48e6-8e84-218b6ec29235-kube-api-access-s4tgd\") pod \"limitador-limitador-64c8f475fb-c779g\" (UID: \"52e1c9b4-d5e1-48e6-8e84-218b6ec29235\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g" Apr 22 18:03:14.121490 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:14.121419 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g" Apr 22 18:03:14.254436 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:14.254414 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-c779g"] Apr 22 18:03:14.256102 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:03:14.256077 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52e1c9b4_d5e1_48e6_8e84_218b6ec29235.slice/crio-8d97054d9c75aca1b9ff2541f25c456ad4aeb729d12a927d19e2610911f65165 WatchSource:0}: Error finding container 8d97054d9c75aca1b9ff2541f25c456ad4aeb729d12a927d19e2610911f65165: Status 404 returned error can't find the container with id 8d97054d9c75aca1b9ff2541f25c456ad4aeb729d12a927d19e2610911f65165 Apr 22 18:03:14.258444 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:14.258426 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:03:14.585942 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:14.585903 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g" event={"ID":"52e1c9b4-d5e1-48e6-8e84-218b6ec29235","Type":"ContainerStarted","Data":"8d97054d9c75aca1b9ff2541f25c456ad4aeb729d12a927d19e2610911f65165"} Apr 22 18:03:18.604236 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:18.604162 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g" event={"ID":"52e1c9b4-d5e1-48e6-8e84-218b6ec29235","Type":"ContainerStarted","Data":"3b6afffce8c001d91ea932a1264683352d44dc290a4167db342267dac7a4de78"} Apr 22 18:03:18.604550 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:18.604301 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g" Apr 22 18:03:18.624635 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:18.624585 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g" podStartSLOduration=1.617082087 podStartE2EDuration="5.624572602s" podCreationTimestamp="2026-04-22 18:03:13 +0000 UTC" firstStartedPulling="2026-04-22 18:03:14.258609794 +0000 UTC m=+614.317512921" lastFinishedPulling="2026-04-22 18:03:18.266100313 +0000 UTC m=+618.325003436" observedRunningTime="2026-04-22 18:03:18.621961245 +0000 UTC m=+618.680864379" watchObservedRunningTime="2026-04-22 18:03:18.624572602 +0000 UTC m=+618.683475744" Apr 22 18:03:28.219252 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:28.219218 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-c779g"] Apr 22 18:03:28.219780 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:28.219569 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g" podUID="52e1c9b4-d5e1-48e6-8e84-218b6ec29235" containerName="limitador" containerID="cri-o://3b6afffce8c001d91ea932a1264683352d44dc290a4167db342267dac7a4de78" gracePeriod=30 Apr 22 18:03:28.220330 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:28.220309 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g" Apr 22 18:03:28.646523 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:28.646493 2564 generic.go:358] "Generic 
(PLEG): container finished" podID="52e1c9b4-d5e1-48e6-8e84-218b6ec29235" containerID="3b6afffce8c001d91ea932a1264683352d44dc290a4167db342267dac7a4de78" exitCode=0 Apr 22 18:03:28.646651 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:28.646539 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g" event={"ID":"52e1c9b4-d5e1-48e6-8e84-218b6ec29235","Type":"ContainerDied","Data":"3b6afffce8c001d91ea932a1264683352d44dc290a4167db342267dac7a4de78"} Apr 22 18:03:28.765784 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:28.765762 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g" Apr 22 18:03:28.846543 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:28.846484 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4tgd\" (UniqueName: \"kubernetes.io/projected/52e1c9b4-d5e1-48e6-8e84-218b6ec29235-kube-api-access-s4tgd\") pod \"52e1c9b4-d5e1-48e6-8e84-218b6ec29235\" (UID: \"52e1c9b4-d5e1-48e6-8e84-218b6ec29235\") " Apr 22 18:03:28.846658 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:28.846551 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/52e1c9b4-d5e1-48e6-8e84-218b6ec29235-config-file\") pod \"52e1c9b4-d5e1-48e6-8e84-218b6ec29235\" (UID: \"52e1c9b4-d5e1-48e6-8e84-218b6ec29235\") " Apr 22 18:03:28.846906 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:28.846884 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52e1c9b4-d5e1-48e6-8e84-218b6ec29235-config-file" (OuterVolumeSpecName: "config-file") pod "52e1c9b4-d5e1-48e6-8e84-218b6ec29235" (UID: "52e1c9b4-d5e1-48e6-8e84-218b6ec29235"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:03:28.848477 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:28.848453 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52e1c9b4-d5e1-48e6-8e84-218b6ec29235-kube-api-access-s4tgd" (OuterVolumeSpecName: "kube-api-access-s4tgd") pod "52e1c9b4-d5e1-48e6-8e84-218b6ec29235" (UID: "52e1c9b4-d5e1-48e6-8e84-218b6ec29235"). InnerVolumeSpecName "kube-api-access-s4tgd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:03:28.947266 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:28.947237 2564 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/52e1c9b4-d5e1-48e6-8e84-218b6ec29235-config-file\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:03:28.947266 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:28.947260 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s4tgd\" (UniqueName: \"kubernetes.io/projected/52e1c9b4-d5e1-48e6-8e84-218b6ec29235-kube-api-access-s4tgd\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:03:29.651110 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:29.651075 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g" event={"ID":"52e1c9b4-d5e1-48e6-8e84-218b6ec29235","Type":"ContainerDied","Data":"8d97054d9c75aca1b9ff2541f25c456ad4aeb729d12a927d19e2610911f65165"} Apr 22 18:03:29.651496 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:29.651120 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-c779g" Apr 22 18:03:29.651496 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:29.651123 2564 scope.go:117] "RemoveContainer" containerID="3b6afffce8c001d91ea932a1264683352d44dc290a4167db342267dac7a4de78" Apr 22 18:03:29.675744 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:29.675722 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-c779g"] Apr 22 18:03:29.677390 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:29.677365 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-c779g"] Apr 22 18:03:30.554087 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:30.554055 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52e1c9b4-d5e1-48e6-8e84-218b6ec29235" path="/var/lib/kubelet/pods/52e1c9b4-d5e1-48e6-8e84-218b6ec29235/volumes" Apr 22 18:03:47.872965 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:47.872930 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x"] Apr 22 18:03:47.873415 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:47.873260 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52e1c9b4-d5e1-48e6-8e84-218b6ec29235" containerName="limitador" Apr 22 18:03:47.873415 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:47.873271 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e1c9b4-d5e1-48e6-8e84-218b6ec29235" containerName="limitador" Apr 22 18:03:47.873415 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:47.873327 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="52e1c9b4-d5e1-48e6-8e84-218b6ec29235" containerName="limitador" Apr 22 18:03:47.876324 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:47.876303 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:47.912642 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:47.912618 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x"] Apr 22 18:03:47.993945 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:47.993915 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a955e293-a029-4466-b738-2f13eb571d4b-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:47.994065 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:47.993969 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64gmz\" (UniqueName: \"kubernetes.io/projected/a955e293-a029-4466-b738-2f13eb571d4b-kube-api-access-64gmz\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:47.994065 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:47.994019 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a955e293-a029-4466-b738-2f13eb571d4b-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:47.994065 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:47.994038 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a955e293-a029-4466-b738-2f13eb571d4b-istio-kubeconfig\") pod 
\"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:47.994170 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:47.994069 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a955e293-a029-4466-b738-2f13eb571d4b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:47.994170 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:47.994084 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a955e293-a029-4466-b738-2f13eb571d4b-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:47.994170 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:47.994106 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a955e293-a029-4466-b738-2f13eb571d4b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.095429 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.095397 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a955e293-a029-4466-b738-2f13eb571d4b-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 
22 18:03:48.095550 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.095438 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64gmz\" (UniqueName: \"kubernetes.io/projected/a955e293-a029-4466-b738-2f13eb571d4b-kube-api-access-64gmz\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.095550 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.095469 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a955e293-a029-4466-b738-2f13eb571d4b-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.095550 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.095484 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a955e293-a029-4466-b738-2f13eb571d4b-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.095747 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.095681 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a955e293-a029-4466-b738-2f13eb571d4b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.095747 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.095710 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/a955e293-a029-4466-b738-2f13eb571d4b-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.095747 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.095729 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a955e293-a029-4466-b738-2f13eb571d4b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.096505 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.096481 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a955e293-a029-4466-b738-2f13eb571d4b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.097871 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.097843 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a955e293-a029-4466-b738-2f13eb571d4b-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.097963 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.097882 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a955e293-a029-4466-b738-2f13eb571d4b-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" 
Apr 22 18:03:48.098287 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.098266 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a955e293-a029-4466-b738-2f13eb571d4b-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.098331 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.098292 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a955e293-a029-4466-b738-2f13eb571d4b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.104128 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.104106 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64gmz\" (UniqueName: \"kubernetes.io/projected/a955e293-a029-4466-b738-2f13eb571d4b-kube-api-access-64gmz\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.104282 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.104266 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a955e293-a029-4466-b738-2f13eb571d4b-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8gm7x\" (UID: \"a955e293-a029-4466-b738-2f13eb571d4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.185279 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.185252 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.538856 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.538829 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x"] Apr 22 18:03:48.540714 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:03:48.540657 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda955e293_a029_4466_b738_2f13eb571d4b.slice/crio-3c72537a81e0f1584d8a52ac7589c35a5888fd6bda2bc6f7b08cb0e65348bae6 WatchSource:0}: Error finding container 3c72537a81e0f1584d8a52ac7589c35a5888fd6bda2bc6f7b08cb0e65348bae6: Status 404 returned error can't find the container with id 3c72537a81e0f1584d8a52ac7589c35a5888fd6bda2bc6f7b08cb0e65348bae6 Apr 22 18:03:48.542712 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.542681 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 22 18:03:48.542808 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.542738 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 22 18:03:48.722493 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.722451 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" event={"ID":"a955e293-a029-4466-b738-2f13eb571d4b","Type":"ContainerStarted","Data":"84612e4c0d2765436c89bdbb14212be2bec80f93f35b5265a4e585cfdd375e76"} Apr 22 18:03:48.722695 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.722500 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" 
event={"ID":"a955e293-a029-4466-b738-2f13eb571d4b","Type":"ContainerStarted","Data":"3c72537a81e0f1584d8a52ac7589c35a5888fd6bda2bc6f7b08cb0e65348bae6"} Apr 22 18:03:48.722695 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.722540 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:48.744739 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:48.744689 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" podStartSLOduration=1.744660104 podStartE2EDuration="1.744660104s" podCreationTimestamp="2026-04-22 18:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:03:48.74269699 +0000 UTC m=+648.801600136" watchObservedRunningTime="2026-04-22 18:03:48.744660104 +0000 UTC m=+648.803563249" Apr 22 18:03:49.729050 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:49.729023 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8gm7x" Apr 22 18:03:49.796273 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:49.796234 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf"] Apr 22 18:03:49.796473 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:49.796451 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" podUID="a1022e95-8f5a-4a1c-a90b-03dd2937eb2d" containerName="discovery" containerID="cri-o://c6297b9f51e7401391eac9480ab32c41e644224a3b618fcd4ead4ba80488573f" gracePeriod=30 Apr 22 18:03:50.046106 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.046084 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" Apr 22 18:03:50.115321 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.115288 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-token\") pod \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " Apr 22 18:03:50.115321 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.115326 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-cacerts\") pod \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " Apr 22 18:03:50.115514 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.115363 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-local-certs\") pod \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " Apr 22 18:03:50.115514 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.115401 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-kubeconfig\") pod \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " Apr 22 18:03:50.115514 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.115422 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-csr-dns-cert\") pod \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " Apr 22 18:03:50.115514 ip-10-0-128-219 kubenswrapper[2564]: I0422 
18:03:50.115441 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfgb5\" (UniqueName: \"kubernetes.io/projected/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-kube-api-access-nfgb5\") pod \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " Apr 22 18:03:50.115514 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.115469 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-csr-ca-configmap\") pod \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\" (UID: \"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d\") " Apr 22 18:03:50.116032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.116002 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "a1022e95-8f5a-4a1c-a90b-03dd2937eb2d" (UID: "a1022e95-8f5a-4a1c-a90b-03dd2937eb2d"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:03:50.117852 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.117818 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "a1022e95-8f5a-4a1c-a90b-03dd2937eb2d" (UID: "a1022e95-8f5a-4a1c-a90b-03dd2937eb2d"). InnerVolumeSpecName "istio-kubeconfig". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:03:50.117852 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.117829 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "a1022e95-8f5a-4a1c-a90b-03dd2937eb2d" (UID: "a1022e95-8f5a-4a1c-a90b-03dd2937eb2d"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:03:50.118007 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.117958 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-cacerts" (OuterVolumeSpecName: "cacerts") pod "a1022e95-8f5a-4a1c-a90b-03dd2937eb2d" (UID: "a1022e95-8f5a-4a1c-a90b-03dd2937eb2d"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:03:50.118007 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.117971 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-token" (OuterVolumeSpecName: "istio-token") pod "a1022e95-8f5a-4a1c-a90b-03dd2937eb2d" (UID: "a1022e95-8f5a-4a1c-a90b-03dd2937eb2d"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:03:50.118007 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.117988 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-local-certs" (OuterVolumeSpecName: "local-certs") pod "a1022e95-8f5a-4a1c-a90b-03dd2937eb2d" (UID: "a1022e95-8f5a-4a1c-a90b-03dd2937eb2d"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:03:50.118007 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.117992 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-kube-api-access-nfgb5" (OuterVolumeSpecName: "kube-api-access-nfgb5") pod "a1022e95-8f5a-4a1c-a90b-03dd2937eb2d" (UID: "a1022e95-8f5a-4a1c-a90b-03dd2937eb2d"). InnerVolumeSpecName "kube-api-access-nfgb5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:03:50.216070 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.216044 2564 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-token\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:03:50.216070 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.216069 2564 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-cacerts\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:03:50.216216 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.216081 2564 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-local-certs\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:03:50.216216 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.216090 2564 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-kubeconfig\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:03:50.216216 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.216099 2564 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-csr-dns-cert\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:03:50.216216 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.216107 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nfgb5\" (UniqueName: \"kubernetes.io/projected/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-kube-api-access-nfgb5\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:03:50.216216 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.216119 2564 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d-istio-csr-ca-configmap\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:03:50.732860 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.732823 2564 generic.go:358] "Generic (PLEG): container finished" podID="a1022e95-8f5a-4a1c-a90b-03dd2937eb2d" containerID="c6297b9f51e7401391eac9480ab32c41e644224a3b618fcd4ead4ba80488573f" exitCode=0
Apr 22 18:03:50.733331 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.732902 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf"
Apr 22 18:03:50.733331 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.732917 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" event={"ID":"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d","Type":"ContainerDied","Data":"c6297b9f51e7401391eac9480ab32c41e644224a3b618fcd4ead4ba80488573f"}
Apr 22 18:03:50.733331 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.732969 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf" event={"ID":"a1022e95-8f5a-4a1c-a90b-03dd2937eb2d","Type":"ContainerDied","Data":"b61be683d5d8581e13e644df5ebae9763ebc0bef9f62f012606580e6b469974b"}
Apr 22 18:03:50.733331 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.732985 2564 scope.go:117] "RemoveContainer" containerID="c6297b9f51e7401391eac9480ab32c41e644224a3b618fcd4ead4ba80488573f"
Apr 22 18:03:50.745851 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.745832 2564 scope.go:117] "RemoveContainer" containerID="c6297b9f51e7401391eac9480ab32c41e644224a3b618fcd4ead4ba80488573f"
Apr 22 18:03:50.746153 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:03:50.746133 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6297b9f51e7401391eac9480ab32c41e644224a3b618fcd4ead4ba80488573f\": container with ID starting with c6297b9f51e7401391eac9480ab32c41e644224a3b618fcd4ead4ba80488573f not found: ID does not exist" containerID="c6297b9f51e7401391eac9480ab32c41e644224a3b618fcd4ead4ba80488573f"
Apr 22 18:03:50.746220 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.746161 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6297b9f51e7401391eac9480ab32c41e644224a3b618fcd4ead4ba80488573f"} err="failed to get container status \"c6297b9f51e7401391eac9480ab32c41e644224a3b618fcd4ead4ba80488573f\": rpc error: code = NotFound desc = could not find container \"c6297b9f51e7401391eac9480ab32c41e644224a3b618fcd4ead4ba80488573f\": container with ID starting with c6297b9f51e7401391eac9480ab32c41e644224a3b618fcd4ead4ba80488573f not found: ID does not exist"
Apr 22 18:03:50.779786 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.779749 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf"]
Apr 22 18:03:50.788297 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:50.788273 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-wggmf"]
Apr 22 18:03:52.553975 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:52.553942 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1022e95-8f5a-4a1c-a90b-03dd2937eb2d" path="/var/lib/kubelet/pods/a1022e95-8f5a-4a1c-a90b-03dd2937eb2d/volumes"
Apr 22 18:03:57.529185 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.529154 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"]
Apr 22 18:03:57.529536 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.529476 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1022e95-8f5a-4a1c-a90b-03dd2937eb2d" containerName="discovery"
Apr 22 18:03:57.529536 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.529487 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1022e95-8f5a-4a1c-a90b-03dd2937eb2d" containerName="discovery"
Apr 22 18:03:57.529606 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.529539 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1022e95-8f5a-4a1c-a90b-03dd2937eb2d" containerName="discovery"
Apr 22 18:03:57.547915 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.547880 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"]
Apr 22 18:03:57.548080 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.548014 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"
Apr 22 18:03:57.550907 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.550882 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 18:03:57.551876 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.551856 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-gj4th\""
Apr 22 18:03:57.552023 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.551914 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 18:03:57.552086 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.552058 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 22 18:03:57.554155 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.554132 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-jd9vw"]
Apr 22 18:03:57.557871 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.557848 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-jd9vw"
Apr 22 18:03:57.559945 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.559929 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 22 18:03:57.560042 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.560013 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-sqzx5\""
Apr 22 18:03:57.566069 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.565535 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-jd9vw"]
Apr 22 18:03:57.680231 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.680197 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfmv4\" (UniqueName: \"kubernetes.io/projected/aec381df-ac02-4e32-bc98-84c3cc7a9da4-kube-api-access-zfmv4\") pod \"llmisvc-controller-manager-5b589d76d4-pz7p6\" (UID: \"aec381df-ac02-4e32-bc98-84c3cc7a9da4\") " pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"
Apr 22 18:03:57.680422 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.680250 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a-data\") pod \"seaweedfs-86cc847c5c-jd9vw\" (UID: \"d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a\") " pod="kserve/seaweedfs-86cc847c5c-jd9vw"
Apr 22 18:03:57.680422 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.680278 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgk5t\" (UniqueName: \"kubernetes.io/projected/d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a-kube-api-access-wgk5t\") pod \"seaweedfs-86cc847c5c-jd9vw\" (UID: \"d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a\") " pod="kserve/seaweedfs-86cc847c5c-jd9vw"
Apr 22 18:03:57.680422 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.680353 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aec381df-ac02-4e32-bc98-84c3cc7a9da4-cert\") pod \"llmisvc-controller-manager-5b589d76d4-pz7p6\" (UID: \"aec381df-ac02-4e32-bc98-84c3cc7a9da4\") " pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"
Apr 22 18:03:57.781127 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.781046 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfmv4\" (UniqueName: \"kubernetes.io/projected/aec381df-ac02-4e32-bc98-84c3cc7a9da4-kube-api-access-zfmv4\") pod \"llmisvc-controller-manager-5b589d76d4-pz7p6\" (UID: \"aec381df-ac02-4e32-bc98-84c3cc7a9da4\") " pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"
Apr 22 18:03:57.781127 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.781103 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a-data\") pod \"seaweedfs-86cc847c5c-jd9vw\" (UID: \"d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a\") " pod="kserve/seaweedfs-86cc847c5c-jd9vw"
Apr 22 18:03:57.781300 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.781228 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgk5t\" (UniqueName: \"kubernetes.io/projected/d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a-kube-api-access-wgk5t\") pod \"seaweedfs-86cc847c5c-jd9vw\" (UID: \"d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a\") " pod="kserve/seaweedfs-86cc847c5c-jd9vw"
Apr 22 18:03:57.781300 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.781282 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aec381df-ac02-4e32-bc98-84c3cc7a9da4-cert\") pod \"llmisvc-controller-manager-5b589d76d4-pz7p6\" (UID: \"aec381df-ac02-4e32-bc98-84c3cc7a9da4\") " pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"
Apr 22 18:03:57.781403 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:03:57.781385 2564 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 22 18:03:57.781478 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:03:57.781466 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aec381df-ac02-4e32-bc98-84c3cc7a9da4-cert podName:aec381df-ac02-4e32-bc98-84c3cc7a9da4 nodeName:}" failed. No retries permitted until 2026-04-22 18:03:58.281446508 +0000 UTC m=+658.340349642 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aec381df-ac02-4e32-bc98-84c3cc7a9da4-cert") pod "llmisvc-controller-manager-5b589d76d4-pz7p6" (UID: "aec381df-ac02-4e32-bc98-84c3cc7a9da4") : secret "llmisvc-webhook-server-cert" not found
Apr 22 18:03:57.781478 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.781466 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a-data\") pod \"seaweedfs-86cc847c5c-jd9vw\" (UID: \"d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a\") " pod="kserve/seaweedfs-86cc847c5c-jd9vw"
Apr 22 18:03:57.789932 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.789908 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgk5t\" (UniqueName: \"kubernetes.io/projected/d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a-kube-api-access-wgk5t\") pod \"seaweedfs-86cc847c5c-jd9vw\" (UID: \"d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a\") " pod="kserve/seaweedfs-86cc847c5c-jd9vw"
Apr 22 18:03:57.790043 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.790011 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfmv4\" (UniqueName: \"kubernetes.io/projected/aec381df-ac02-4e32-bc98-84c3cc7a9da4-kube-api-access-zfmv4\") pod \"llmisvc-controller-manager-5b589d76d4-pz7p6\" (UID: \"aec381df-ac02-4e32-bc98-84c3cc7a9da4\") " pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"
Apr 22 18:03:57.870519 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.870495 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-jd9vw"
Apr 22 18:03:57.995334 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:57.995309 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-jd9vw"]
Apr 22 18:03:57.997245 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:03:57.997218 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0dd739a_946e_4fc7_b3a3_0cc2a0d5387a.slice/crio-9797970c198c77c6061e4e5796194b1b7613dff0ad03c9bdc80f474e8b23cf51 WatchSource:0}: Error finding container 9797970c198c77c6061e4e5796194b1b7613dff0ad03c9bdc80f474e8b23cf51: Status 404 returned error can't find the container with id 9797970c198c77c6061e4e5796194b1b7613dff0ad03c9bdc80f474e8b23cf51
Apr 22 18:03:58.285749 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:58.285718 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aec381df-ac02-4e32-bc98-84c3cc7a9da4-cert\") pod \"llmisvc-controller-manager-5b589d76d4-pz7p6\" (UID: \"aec381df-ac02-4e32-bc98-84c3cc7a9da4\") " pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"
Apr 22 18:03:58.285907 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:03:58.285865 2564 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 22 18:03:58.285945 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:03:58.285923 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aec381df-ac02-4e32-bc98-84c3cc7a9da4-cert podName:aec381df-ac02-4e32-bc98-84c3cc7a9da4 nodeName:}" failed. No retries permitted until 2026-04-22 18:03:59.285905808 +0000 UTC m=+659.344809127 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aec381df-ac02-4e32-bc98-84c3cc7a9da4-cert") pod "llmisvc-controller-manager-5b589d76d4-pz7p6" (UID: "aec381df-ac02-4e32-bc98-84c3cc7a9da4") : secret "llmisvc-webhook-server-cert" not found
Apr 22 18:03:58.773689 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:58.773621 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-jd9vw" event={"ID":"d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a","Type":"ContainerStarted","Data":"9797970c198c77c6061e4e5796194b1b7613dff0ad03c9bdc80f474e8b23cf51"}
Apr 22 18:03:59.293321 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:59.293283 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aec381df-ac02-4e32-bc98-84c3cc7a9da4-cert\") pod \"llmisvc-controller-manager-5b589d76d4-pz7p6\" (UID: \"aec381df-ac02-4e32-bc98-84c3cc7a9da4\") " pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"
Apr 22 18:03:59.295605 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:59.295579 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aec381df-ac02-4e32-bc98-84c3cc7a9da4-cert\") pod \"llmisvc-controller-manager-5b589d76d4-pz7p6\" (UID: \"aec381df-ac02-4e32-bc98-84c3cc7a9da4\") " pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"
Apr 22 18:03:59.360326 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:59.360282 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"
Apr 22 18:03:59.499868 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:59.499842 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"]
Apr 22 18:03:59.501264 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:03:59.501238 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaec381df_ac02_4e32_bc98_84c3cc7a9da4.slice/crio-39e32327cdf8e4c1d48edceb1f51b5bb68a320a3fe09f26178a3173ec02e8b64 WatchSource:0}: Error finding container 39e32327cdf8e4c1d48edceb1f51b5bb68a320a3fe09f26178a3173ec02e8b64: Status 404 returned error can't find the container with id 39e32327cdf8e4c1d48edceb1f51b5bb68a320a3fe09f26178a3173ec02e8b64
Apr 22 18:03:59.779248 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:03:59.779205 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6" event={"ID":"aec381df-ac02-4e32-bc98-84c3cc7a9da4","Type":"ContainerStarted","Data":"39e32327cdf8e4c1d48edceb1f51b5bb68a320a3fe09f26178a3173ec02e8b64"}
Apr 22 18:03:59.970059 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:03:59.970004 2564 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/odh-kserve-llmisvc-controller@sha256:2d10e2c5c79509c8b62e111d31068c2b64d0d5bee91f48c5bc3c2e6241e5ee5e: reading manifest sha256:37852873744036552c1b5a85cf16a4f52bb45b54b0ffa3bdb3308c55c8fc7b53 in quay.io/opendatahub/odh-kserve-llmisvc-controller: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image" image="quay.io/opendatahub/odh-kserve-llmisvc-controller@sha256:2d10e2c5c79509c8b62e111d31068c2b64d0d5bee91f48c5bc3c2e6241e5ee5e"
Apr 22 18:03:59.970310 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:03:59.970266 2564 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/opendatahub/odh-kserve-llmisvc-controller@sha256:2d10e2c5c79509c8b62e111d31068c2b64d0d5bee91f48c5bc3c2e6241e5ee5e,Command:[/manager],Args:[--metrics-addr=127.0.0.1:8443 --leader-elect],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zfmv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:5,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:5,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000680000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod llmisvc-controller-manager-5b589d76d4-pz7p6_kserve(aec381df-ac02-4e32-bc98-84c3cc7a9da4): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/odh-kserve-llmisvc-controller@sha256:2d10e2c5c79509c8b62e111d31068c2b64d0d5bee91f48c5bc3c2e6241e5ee5e: reading manifest sha256:37852873744036552c1b5a85cf16a4f52bb45b54b0ffa3bdb3308c55c8fc7b53 in quay.io/opendatahub/odh-kserve-llmisvc-controller: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image" logger="UnhandledError"
Apr 22 18:03:59.971433 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:03:59.971396 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/odh-kserve-llmisvc-controller@sha256:2d10e2c5c79509c8b62e111d31068c2b64d0d5bee91f48c5bc3c2e6241e5ee5e: reading manifest sha256:37852873744036552c1b5a85cf16a4f52bb45b54b0ffa3bdb3308c55c8fc7b53 in quay.io/opendatahub/odh-kserve-llmisvc-controller: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image\"" pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6" podUID="aec381df-ac02-4e32-bc98-84c3cc7a9da4"
Apr 22 18:04:00.586394 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:04:00.586372 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 22 18:04:00.784241 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:04:00.784155 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-jd9vw" event={"ID":"d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a","Type":"ContainerStarted","Data":"85a44b5d9885f8dbb3a26d3cc7424237fcbb90d72558958249ab226cce257528"}
Apr 22 18:04:00.784241 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:04:00.784215 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-jd9vw"
Apr 22 18:04:00.784748 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:04:00.784721 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-kserve-llmisvc-controller@sha256:2d10e2c5c79509c8b62e111d31068c2b64d0d5bee91f48c5bc3c2e6241e5ee5e\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/odh-kserve-llmisvc-controller@sha256:2d10e2c5c79509c8b62e111d31068c2b64d0d5bee91f48c5bc3c2e6241e5ee5e: reading manifest sha256:37852873744036552c1b5a85cf16a4f52bb45b54b0ffa3bdb3308c55c8fc7b53 in quay.io/opendatahub/odh-kserve-llmisvc-controller: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image\"" pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6" podUID="aec381df-ac02-4e32-bc98-84c3cc7a9da4"
Apr 22 18:04:00.817141 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:04:00.817088 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-jd9vw" podStartSLOduration=1.231691141 podStartE2EDuration="3.81707275s" podCreationTimestamp="2026-04-22 18:03:57 +0000 UTC" firstStartedPulling="2026-04-22 18:03:57.998484744 +0000 UTC m=+658.057387865" lastFinishedPulling="2026-04-22 18:04:00.583866355 +0000 UTC m=+660.642769474" observedRunningTime="2026-04-22 18:04:00.816092195 +0000 UTC m=+660.874995347" watchObservedRunningTime="2026-04-22 18:04:00.81707275 +0000 UTC m=+660.875975918"
Apr 22 18:04:06.790780 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:04:06.790744 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-jd9vw"
Apr 22 18:04:15.838643 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:04:15.838605 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6" event={"ID":"aec381df-ac02-4e32-bc98-84c3cc7a9da4","Type":"ContainerStarted","Data":"c8a5f3404e510e660a81c5df14a78963845249cb4e8b00818445c36956ccabb5"}
Apr 22 18:04:15.839076 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:04:15.838823 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"
Apr 22 18:04:15.857070 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:04:15.857027 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6" podStartSLOduration=3.583666936 podStartE2EDuration="18.857014982s" podCreationTimestamp="2026-04-22 18:03:57 +0000 UTC" firstStartedPulling="2026-04-22 18:03:59.502572606 +0000 UTC m=+659.561475727" lastFinishedPulling="2026-04-22 18:04:14.775920654 +0000 UTC m=+674.834823773" observedRunningTime="2026-04-22 18:04:15.855639866 +0000 UTC m=+675.914543007" watchObservedRunningTime="2026-04-22 18:04:15.857014982 +0000 UTC m=+675.915918124"
Apr 22 18:04:46.846952 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:04:46.846879 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"
Apr 22 18:05:21.849898 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:21.849866 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-2kmcl"]
Apr 22 18:05:21.853634 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:21.853611 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-2kmcl"
Apr 22 18:05:21.856514 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:21.856494 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-m29b5\""
Apr 22 18:05:21.856624 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:21.856496 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 22 18:05:21.862002 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:21.861980 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-2kmcl"]
Apr 22 18:05:21.872085 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:21.872051 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97e86232-7d05-4c7d-8cda-94d1656d1324-tls-certs\") pod \"model-serving-api-86f7b4b499-2kmcl\" (UID: \"97e86232-7d05-4c7d-8cda-94d1656d1324\") " pod="kserve/model-serving-api-86f7b4b499-2kmcl"
Apr 22 18:05:21.872344 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:21.872320 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnqvh\" (UniqueName: \"kubernetes.io/projected/97e86232-7d05-4c7d-8cda-94d1656d1324-kube-api-access-vnqvh\") pod \"model-serving-api-86f7b4b499-2kmcl\" (UID: \"97e86232-7d05-4c7d-8cda-94d1656d1324\") " pod="kserve/model-serving-api-86f7b4b499-2kmcl"
Apr 22 18:05:21.972997 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:21.972958 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnqvh\" (UniqueName: \"kubernetes.io/projected/97e86232-7d05-4c7d-8cda-94d1656d1324-kube-api-access-vnqvh\") pod \"model-serving-api-86f7b4b499-2kmcl\" (UID: \"97e86232-7d05-4c7d-8cda-94d1656d1324\") " pod="kserve/model-serving-api-86f7b4b499-2kmcl"
Apr 22 18:05:21.973164 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:21.973032 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97e86232-7d05-4c7d-8cda-94d1656d1324-tls-certs\") pod \"model-serving-api-86f7b4b499-2kmcl\" (UID: \"97e86232-7d05-4c7d-8cda-94d1656d1324\") " pod="kserve/model-serving-api-86f7b4b499-2kmcl"
Apr 22 18:05:21.975377 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:21.975344 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97e86232-7d05-4c7d-8cda-94d1656d1324-tls-certs\") pod \"model-serving-api-86f7b4b499-2kmcl\" (UID: \"97e86232-7d05-4c7d-8cda-94d1656d1324\") " pod="kserve/model-serving-api-86f7b4b499-2kmcl"
Apr 22 18:05:21.982710 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:21.982686 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnqvh\" (UniqueName: \"kubernetes.io/projected/97e86232-7d05-4c7d-8cda-94d1656d1324-kube-api-access-vnqvh\") pod \"model-serving-api-86f7b4b499-2kmcl\" (UID: \"97e86232-7d05-4c7d-8cda-94d1656d1324\") " pod="kserve/model-serving-api-86f7b4b499-2kmcl"
Apr 22 18:05:22.173662 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:22.173630 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-2kmcl"
Apr 22 18:05:22.299449 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:22.299423 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-2kmcl"]
Apr 22 18:05:22.300769 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:05:22.300742 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97e86232_7d05_4c7d_8cda_94d1656d1324.slice/crio-bbb1536c6169865e7bf0272dc03f3559b50f76e62698e11a9a668377d2692bea WatchSource:0}: Error finding container bbb1536c6169865e7bf0272dc03f3559b50f76e62698e11a9a668377d2692bea: Status 404 returned error can't find the container with id bbb1536c6169865e7bf0272dc03f3559b50f76e62698e11a9a668377d2692bea
Apr 22 18:05:23.085392 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:23.085348 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-2kmcl" event={"ID":"97e86232-7d05-4c7d-8cda-94d1656d1324","Type":"ContainerStarted","Data":"bbb1536c6169865e7bf0272dc03f3559b50f76e62698e11a9a668377d2692bea"}
Apr 22 18:05:24.089924 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:24.089877 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-2kmcl" event={"ID":"97e86232-7d05-4c7d-8cda-94d1656d1324","Type":"ContainerStarted","Data":"946cf8d85b78347ac5264ffecc81e0508aaa96d4c3e757c7bb80dd0ad77580de"}
Apr 22 18:05:24.090304 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:24.089950 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-2kmcl"
Apr 22 18:05:24.107946 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:24.107890 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-2kmcl" podStartSLOduration=1.545404308 podStartE2EDuration="3.107877515s" podCreationTimestamp="2026-04-22 18:05:21 +0000 UTC" firstStartedPulling="2026-04-22 18:05:22.302347879 +0000 UTC m=+742.361251002" lastFinishedPulling="2026-04-22 18:05:23.864821089 +0000 UTC m=+743.923724209" observedRunningTime="2026-04-22 18:05:24.106232494 +0000 UTC m=+744.165135638" watchObservedRunningTime="2026-04-22 18:05:24.107877515 +0000 UTC m=+744.166780658"
Apr 22 18:05:35.098010 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:35.097969 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-2kmcl"
Apr 22 18:05:58.151057 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.151022 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd"]
Apr 22 18:05:58.155927 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.155897 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd"
Apr 22 18:05:58.158997 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.158654 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 18:05:58.158997 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.158780 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\""
Apr 22 18:05:58.158997 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.158791 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-lcvlf\""
Apr 22 18:05:58.158997 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.158941 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 18:05:58.168425 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.168403 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd"]
Apr 22 18:05:58.277801 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.277769 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd"
Apr 22 18:05:58.277968 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.277814 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8f95\" (UniqueName: \"kubernetes.io/projected/576b68b8-c039-4804-b52d-0677069a2ec0-kube-api-access-q8f95\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd"
Apr 22 18:05:58.277968 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.277892 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd"
Apr 22 18:05:58.277968 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.277936 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/576b68b8-c039-4804-b52d-0677069a2ec0-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd"
Apr 22 18:05:58.277968 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.277962 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd"
Apr 22 18:05:58.278146 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.277979 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd"
Apr 22 18:05:58.278146 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.277995 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd"
Apr 22 18:05:58.278146 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.278011 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/576b68b8-c039-4804-b52d-0677069a2ec0-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd"
Apr 22 18:05:58.278146 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.278085 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/576b68b8-c039-4804-b52d-0677069a2ec0-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.379455 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.379418 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.379455 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.379454 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.379877 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.379480 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.379877 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.379610 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/576b68b8-c039-4804-b52d-0677069a2ec0-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.379877 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.379653 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/576b68b8-c039-4804-b52d-0677069a2ec0-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.379877 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.379733 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.379877 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.379784 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8f95\" (UniqueName: \"kubernetes.io/projected/576b68b8-c039-4804-b52d-0677069a2ec0-kube-api-access-q8f95\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.379877 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.379836 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.379877 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.379875 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.380233 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.380002 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.380233 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.380040 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/576b68b8-c039-4804-b52d-0677069a2ec0-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.380233 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.380175 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: 
\"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.380431 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.380410 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.380499 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.380481 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/576b68b8-c039-4804-b52d-0677069a2ec0-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.381849 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.381829 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/576b68b8-c039-4804-b52d-0677069a2ec0-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.382121 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.382102 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/576b68b8-c039-4804-b52d-0677069a2ec0-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.390605 ip-10-0-128-219 kubenswrapper[2564]: 
I0422 18:05:58.390578 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8f95\" (UniqueName: \"kubernetes.io/projected/576b68b8-c039-4804-b52d-0677069a2ec0-kube-api-access-q8f95\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.392367 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.392339 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/576b68b8-c039-4804-b52d-0677069a2ec0-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-bnxcd\" (UID: \"576b68b8-c039-4804-b52d-0677069a2ec0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.470624 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.470594 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:58.627175 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.627135 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd"] Apr 22 18:05:58.629041 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:05:58.629010 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod576b68b8_c039_4804_b52d_0677069a2ec0.slice/crio-100d92b3ef24614de1bb89f1e5896359416251b7547fe7552aac4c24befe8552 WatchSource:0}: Error finding container 100d92b3ef24614de1bb89f1e5896359416251b7547fe7552aac4c24befe8552: Status 404 returned error can't find the container with id 100d92b3ef24614de1bb89f1e5896359416251b7547fe7552aac4c24befe8552 Apr 22 18:05:58.631375 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.631337 2564 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 22 18:05:58.631851 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.631825 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 22 18:05:58.632009 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:58.631992 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 22 18:05:59.223410 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:59.223373 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" event={"ID":"576b68b8-c039-4804-b52d-0677069a2ec0","Type":"ContainerStarted","Data":"34fa5dffc235c89375ca730ba879001ffd2c89d563b330fe17a26b1dc72a930d"} Apr 22 18:05:59.223410 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:59.223415 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" event={"ID":"576b68b8-c039-4804-b52d-0677069a2ec0","Type":"ContainerStarted","Data":"100d92b3ef24614de1bb89f1e5896359416251b7547fe7552aac4c24befe8552"} Apr 22 18:05:59.243844 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:59.243798 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" podStartSLOduration=1.24378213 podStartE2EDuration="1.24378213s" podCreationTimestamp="2026-04-22 18:05:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:05:59.240818899 +0000 UTC m=+779.299722064" watchObservedRunningTime="2026-04-22 18:05:59.24378213 +0000 UTC 
m=+779.302685273" Apr 22 18:05:59.471561 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:59.471529 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:05:59.476556 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:05:59.476492 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:06:00.227035 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:00.226998 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:06:00.227905 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:00.227884 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-bnxcd" Apr 22 18:06:07.182624 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.182596 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l"] Apr 22 18:06:07.186754 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.186733 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.190243 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.190224 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 22 18:06:07.190358 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.190225 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\"" Apr 22 18:06:07.190358 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.190242 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-qvg5p\"" Apr 22 18:06:07.204122 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.204097 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l"] Apr 22 18:06:07.249506 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.249479 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.249658 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.249518 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.249658 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.249540 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.249658 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.249614 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.249844 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.249691 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8141e6-9786-4be3-8d73-0a82d97736d5-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.249844 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.249717 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv4mr\" (UniqueName: \"kubernetes.io/projected/ff8141e6-9786-4be3-8d73-0a82d97736d5-kube-api-access-jv4mr\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: 
\"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.350483 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.350445 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.350641 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.350491 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.350641 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.350605 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.350794 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.350645 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: 
\"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.350794 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.350723 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8141e6-9786-4be3-8d73-0a82d97736d5-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.350794 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.350753 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jv4mr\" (UniqueName: \"kubernetes.io/projected/ff8141e6-9786-4be3-8d73-0a82d97736d5-kube-api-access-jv4mr\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.350949 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.350905 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.351115 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.351089 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.351287 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.351122 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.351287 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.351274 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.353268 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.353246 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8141e6-9786-4be3-8d73-0a82d97736d5-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.359041 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.359018 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv4mr\" (UniqueName: \"kubernetes.io/projected/ff8141e6-9786-4be3-8d73-0a82d97736d5-kube-api-access-jv4mr\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.497184 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.497099 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:07.626243 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:07.626219 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l"] Apr 22 18:06:07.627662 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:06:07.627631 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff8141e6_9786_4be3_8d73_0a82d97736d5.slice/crio-35653d5afecd49fa967dfc4702d2d23329a843474dc3496243b617b9fcba2c6e WatchSource:0}: Error finding container 35653d5afecd49fa967dfc4702d2d23329a843474dc3496243b617b9fcba2c6e: Status 404 returned error can't find the container with id 35653d5afecd49fa967dfc4702d2d23329a843474dc3496243b617b9fcba2c6e Apr 22 18:06:08.262304 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:08.262261 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" event={"ID":"ff8141e6-9786-4be3-8d73-0a82d97736d5","Type":"ContainerStarted","Data":"35653d5afecd49fa967dfc4702d2d23329a843474dc3496243b617b9fcba2c6e"} Apr 22 18:06:12.283305 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:12.283266 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" event={"ID":"ff8141e6-9786-4be3-8d73-0a82d97736d5","Type":"ContainerStarted","Data":"b9046832a718118521764e30b01862523c0493820d1db9b91d82f505a6ae9d17"} Apr 22 18:06:13.287840 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:13.287804 2564 generic.go:358] "Generic (PLEG): 
container finished" podID="ff8141e6-9786-4be3-8d73-0a82d97736d5" containerID="b9046832a718118521764e30b01862523c0493820d1db9b91d82f505a6ae9d17" exitCode=0 Apr 22 18:06:13.288202 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:13.287889 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" event={"ID":"ff8141e6-9786-4be3-8d73-0a82d97736d5","Type":"ContainerDied","Data":"b9046832a718118521764e30b01862523c0493820d1db9b91d82f505a6ae9d17"} Apr 22 18:06:15.298996 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:15.298956 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" event={"ID":"ff8141e6-9786-4be3-8d73-0a82d97736d5","Type":"ContainerStarted","Data":"e0d92b615981a0ab989086d4ef333889947eeb53f49019372e50eb8c21a87ad5"} Apr 22 18:06:45.454429 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:45.454392 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" event={"ID":"ff8141e6-9786-4be3-8d73-0a82d97736d5","Type":"ContainerStarted","Data":"6a062fe139d6248e0a6ab6eecf753b40235bbb0d42fbf0a90ca1d7060f1d91ce"} Apr 22 18:06:45.455029 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:45.454551 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:45.456899 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:45.456871 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:45.475402 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:45.475352 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" podStartSLOduration=0.815162316 podStartE2EDuration="38.475338035s" podCreationTimestamp="2026-04-22 18:06:07 +0000 UTC" firstStartedPulling="2026-04-22 18:06:07.629636982 +0000 UTC m=+787.688540108" lastFinishedPulling="2026-04-22 18:06:45.289812707 +0000 UTC m=+825.348715827" observedRunningTime="2026-04-22 18:06:45.47329706 +0000 UTC m=+825.532200226" watchObservedRunningTime="2026-04-22 18:06:45.475338035 +0000 UTC m=+825.534241179" Apr 22 18:06:47.498047 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:47.498007 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:47.498047 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:47.498048 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:57.499050 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:57.499019 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:57.500407 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:57.500383 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:06:58.630897 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:58.630861 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l"] Apr 22 18:06:58.636013 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:06:58.635983 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-self-signed-certs: secret 
"scheduler-configmap-ref-test-kserve-self-signed-certs" not found Apr 22 18:06:58.636167 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:06:58.636068 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff8141e6-9786-4be3-8d73-0a82d97736d5-tls-certs podName:ff8141e6-9786-4be3-8d73-0a82d97736d5 nodeName:}" failed. No retries permitted until 2026-04-22 18:06:59.136046162 +0000 UTC m=+839.194949281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/ff8141e6-9786-4be3-8d73-0a82d97736d5-tls-certs") pod "scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" (UID: "ff8141e6-9786-4be3-8d73-0a82d97736d5") : secret "scheduler-configmap-ref-test-kserve-self-signed-certs" not found Apr 22 18:06:59.140823 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:06:59.140787 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-self-signed-certs: secret "scheduler-configmap-ref-test-kserve-self-signed-certs" not found Apr 22 18:06:59.141015 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:06:59.140895 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff8141e6-9786-4be3-8d73-0a82d97736d5-tls-certs podName:ff8141e6-9786-4be3-8d73-0a82d97736d5 nodeName:}" failed. No retries permitted until 2026-04-22 18:07:00.140868374 +0000 UTC m=+840.199771501 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/ff8141e6-9786-4be3-8d73-0a82d97736d5-tls-certs") pod "scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" (UID: "ff8141e6-9786-4be3-8d73-0a82d97736d5") : secret "scheduler-configmap-ref-test-kserve-self-signed-certs" not found Apr 22 18:06:59.507530 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:59.507489 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" podUID="ff8141e6-9786-4be3-8d73-0a82d97736d5" containerName="main" containerID="cri-o://e0d92b615981a0ab989086d4ef333889947eeb53f49019372e50eb8c21a87ad5" gracePeriod=30 Apr 22 18:06:59.507530 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:06:59.507506 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" podUID="ff8141e6-9786-4be3-8d73-0a82d97736d5" containerName="tokenizer" containerID="cri-o://6a062fe139d6248e0a6ab6eecf753b40235bbb0d42fbf0a90ca1d7060f1d91ce" gracePeriod=30 Apr 22 18:07:00.151947 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:07:00.151909 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-self-signed-certs: secret "scheduler-configmap-ref-test-kserve-self-signed-certs" not found Apr 22 18:07:00.152351 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:07:00.151982 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff8141e6-9786-4be3-8d73-0a82d97736d5-tls-certs podName:ff8141e6-9786-4be3-8d73-0a82d97736d5 nodeName:}" failed. No retries permitted until 2026-04-22 18:07:02.151966787 +0000 UTC m=+842.210869907 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/ff8141e6-9786-4be3-8d73-0a82d97736d5-tls-certs") pod "scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" (UID: "ff8141e6-9786-4be3-8d73-0a82d97736d5") : secret "scheduler-configmap-ref-test-kserve-self-signed-certs" not found Apr 22 18:07:00.513386 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.513350 2564 generic.go:358] "Generic (PLEG): container finished" podID="ff8141e6-9786-4be3-8d73-0a82d97736d5" containerID="e0d92b615981a0ab989086d4ef333889947eeb53f49019372e50eb8c21a87ad5" exitCode=0 Apr 22 18:07:00.513500 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.513425 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" event={"ID":"ff8141e6-9786-4be3-8d73-0a82d97736d5","Type":"ContainerDied","Data":"e0d92b615981a0ab989086d4ef333889947eeb53f49019372e50eb8c21a87ad5"} Apr 22 18:07:00.646628 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.646605 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:07:00.755709 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.755630 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-cache\") pod \"ff8141e6-9786-4be3-8d73-0a82d97736d5\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " Apr 22 18:07:00.755709 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.755684 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-kserve-provision-location\") pod \"ff8141e6-9786-4be3-8d73-0a82d97736d5\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " Apr 22 18:07:00.755899 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.755720 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-tmp\") pod \"ff8141e6-9786-4be3-8d73-0a82d97736d5\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " Apr 22 18:07:00.755899 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.755743 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv4mr\" (UniqueName: \"kubernetes.io/projected/ff8141e6-9786-4be3-8d73-0a82d97736d5-kube-api-access-jv4mr\") pod \"ff8141e6-9786-4be3-8d73-0a82d97736d5\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " Apr 22 18:07:00.755899 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.755780 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8141e6-9786-4be3-8d73-0a82d97736d5-tls-certs\") pod \"ff8141e6-9786-4be3-8d73-0a82d97736d5\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") 
" Apr 22 18:07:00.755899 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.755835 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-uds\") pod \"ff8141e6-9786-4be3-8d73-0a82d97736d5\" (UID: \"ff8141e6-9786-4be3-8d73-0a82d97736d5\") " Apr 22 18:07:00.756112 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.755911 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "ff8141e6-9786-4be3-8d73-0a82d97736d5" (UID: "ff8141e6-9786-4be3-8d73-0a82d97736d5"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:07:00.756112 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.756080 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-cache\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:07:00.756112 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.756096 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ff8141e6-9786-4be3-8d73-0a82d97736d5" (UID: "ff8141e6-9786-4be3-8d73-0a82d97736d5"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:07:00.756230 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.756112 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "ff8141e6-9786-4be3-8d73-0a82d97736d5" (UID: "ff8141e6-9786-4be3-8d73-0a82d97736d5"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:07:00.756415 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.756396 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ff8141e6-9786-4be3-8d73-0a82d97736d5" (UID: "ff8141e6-9786-4be3-8d73-0a82d97736d5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:07:00.757956 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.757936 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8141e6-9786-4be3-8d73-0a82d97736d5-kube-api-access-jv4mr" (OuterVolumeSpecName: "kube-api-access-jv4mr") pod "ff8141e6-9786-4be3-8d73-0a82d97736d5" (UID: "ff8141e6-9786-4be3-8d73-0a82d97736d5"). InnerVolumeSpecName "kube-api-access-jv4mr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:07:00.758046 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.757965 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8141e6-9786-4be3-8d73-0a82d97736d5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ff8141e6-9786-4be3-8d73-0a82d97736d5" (UID: "ff8141e6-9786-4be3-8d73-0a82d97736d5"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:07:00.857139 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.857114 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8141e6-9786-4be3-8d73-0a82d97736d5-tls-certs\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:07:00.857139 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.857136 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-uds\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:07:00.857304 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.857146 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-kserve-provision-location\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:07:00.857304 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.857155 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff8141e6-9786-4be3-8d73-0a82d97736d5-tokenizer-tmp\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:07:00.857304 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:00.857166 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jv4mr\" (UniqueName: \"kubernetes.io/projected/ff8141e6-9786-4be3-8d73-0a82d97736d5-kube-api-access-jv4mr\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:07:01.519452 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.519416 2564 generic.go:358] "Generic (PLEG): container finished" podID="ff8141e6-9786-4be3-8d73-0a82d97736d5" containerID="6a062fe139d6248e0a6ab6eecf753b40235bbb0d42fbf0a90ca1d7060f1d91ce" exitCode=0 Apr 22 18:07:01.519904 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.519486 2564 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" Apr 22 18:07:01.519904 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.519485 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" event={"ID":"ff8141e6-9786-4be3-8d73-0a82d97736d5","Type":"ContainerDied","Data":"6a062fe139d6248e0a6ab6eecf753b40235bbb0d42fbf0a90ca1d7060f1d91ce"} Apr 22 18:07:01.519904 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.519604 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l" event={"ID":"ff8141e6-9786-4be3-8d73-0a82d97736d5","Type":"ContainerDied","Data":"35653d5afecd49fa967dfc4702d2d23329a843474dc3496243b617b9fcba2c6e"} Apr 22 18:07:01.519904 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.519627 2564 scope.go:117] "RemoveContainer" containerID="6a062fe139d6248e0a6ab6eecf753b40235bbb0d42fbf0a90ca1d7060f1d91ce" Apr 22 18:07:01.528627 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.528610 2564 scope.go:117] "RemoveContainer" containerID="e0d92b615981a0ab989086d4ef333889947eeb53f49019372e50eb8c21a87ad5" Apr 22 18:07:01.535954 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.535934 2564 scope.go:117] "RemoveContainer" containerID="b9046832a718118521764e30b01862523c0493820d1db9b91d82f505a6ae9d17" Apr 22 18:07:01.543267 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.543242 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l"] Apr 22 18:07:01.544647 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.544630 2564 scope.go:117] "RemoveContainer" containerID="6a062fe139d6248e0a6ab6eecf753b40235bbb0d42fbf0a90ca1d7060f1d91ce" Apr 22 18:07:01.544957 ip-10-0-128-219 kubenswrapper[2564]: 
E0422 18:07:01.544928 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a062fe139d6248e0a6ab6eecf753b40235bbb0d42fbf0a90ca1d7060f1d91ce\": container with ID starting with 6a062fe139d6248e0a6ab6eecf753b40235bbb0d42fbf0a90ca1d7060f1d91ce not found: ID does not exist" containerID="6a062fe139d6248e0a6ab6eecf753b40235bbb0d42fbf0a90ca1d7060f1d91ce" Apr 22 18:07:01.545041 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.544970 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a062fe139d6248e0a6ab6eecf753b40235bbb0d42fbf0a90ca1d7060f1d91ce"} err="failed to get container status \"6a062fe139d6248e0a6ab6eecf753b40235bbb0d42fbf0a90ca1d7060f1d91ce\": rpc error: code = NotFound desc = could not find container \"6a062fe139d6248e0a6ab6eecf753b40235bbb0d42fbf0a90ca1d7060f1d91ce\": container with ID starting with 6a062fe139d6248e0a6ab6eecf753b40235bbb0d42fbf0a90ca1d7060f1d91ce not found: ID does not exist" Apr 22 18:07:01.545041 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.544992 2564 scope.go:117] "RemoveContainer" containerID="e0d92b615981a0ab989086d4ef333889947eeb53f49019372e50eb8c21a87ad5" Apr 22 18:07:01.545241 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:07:01.545216 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0d92b615981a0ab989086d4ef333889947eeb53f49019372e50eb8c21a87ad5\": container with ID starting with e0d92b615981a0ab989086d4ef333889947eeb53f49019372e50eb8c21a87ad5 not found: ID does not exist" containerID="e0d92b615981a0ab989086d4ef333889947eeb53f49019372e50eb8c21a87ad5" Apr 22 18:07:01.545301 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.545247 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d92b615981a0ab989086d4ef333889947eeb53f49019372e50eb8c21a87ad5"} err="failed to get container 
status \"e0d92b615981a0ab989086d4ef333889947eeb53f49019372e50eb8c21a87ad5\": rpc error: code = NotFound desc = could not find container \"e0d92b615981a0ab989086d4ef333889947eeb53f49019372e50eb8c21a87ad5\": container with ID starting with e0d92b615981a0ab989086d4ef333889947eeb53f49019372e50eb8c21a87ad5 not found: ID does not exist" Apr 22 18:07:01.545301 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.545262 2564 scope.go:117] "RemoveContainer" containerID="b9046832a718118521764e30b01862523c0493820d1db9b91d82f505a6ae9d17" Apr 22 18:07:01.545495 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:07:01.545477 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9046832a718118521764e30b01862523c0493820d1db9b91d82f505a6ae9d17\": container with ID starting with b9046832a718118521764e30b01862523c0493820d1db9b91d82f505a6ae9d17 not found: ID does not exist" containerID="b9046832a718118521764e30b01862523c0493820d1db9b91d82f505a6ae9d17" Apr 22 18:07:01.545555 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.545500 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9046832a718118521764e30b01862523c0493820d1db9b91d82f505a6ae9d17"} err="failed to get container status \"b9046832a718118521764e30b01862523c0493820d1db9b91d82f505a6ae9d17\": rpc error: code = NotFound desc = could not find container \"b9046832a718118521764e30b01862523c0493820d1db9b91d82f505a6ae9d17\": container with ID starting with b9046832a718118521764e30b01862523c0493820d1db9b91d82f505a6ae9d17 not found: ID does not exist" Apr 22 18:07:01.546212 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:01.546191 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd89jfg5l"] Apr 22 18:07:02.553117 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:02.553083 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="ff8141e6-9786-4be3-8d73-0a82d97736d5" path="/var/lib/kubelet/pods/ff8141e6-9786-4be3-8d73-0a82d97736d5/volumes" Apr 22 18:07:17.917906 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:17.917869 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v"] Apr 22 18:07:17.918398 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:17.918222 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff8141e6-9786-4be3-8d73-0a82d97736d5" containerName="tokenizer" Apr 22 18:07:17.918398 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:17.918233 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8141e6-9786-4be3-8d73-0a82d97736d5" containerName="tokenizer" Apr 22 18:07:17.918398 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:17.918244 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff8141e6-9786-4be3-8d73-0a82d97736d5" containerName="main" Apr 22 18:07:17.918398 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:17.918249 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8141e6-9786-4be3-8d73-0a82d97736d5" containerName="main" Apr 22 18:07:17.918398 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:17.918256 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff8141e6-9786-4be3-8d73-0a82d97736d5" containerName="storage-initializer" Apr 22 18:07:17.918398 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:17.918262 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8141e6-9786-4be3-8d73-0a82d97736d5" containerName="storage-initializer" Apr 22 18:07:17.918398 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:17.918315 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff8141e6-9786-4be3-8d73-0a82d97736d5" containerName="main" Apr 22 18:07:17.918398 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:17.918324 2564 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="ff8141e6-9786-4be3-8d73-0a82d97736d5" containerName="tokenizer" Apr 22 18:07:17.996490 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:17.996450 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v"] Apr 22 18:07:17.996654 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:17.996581 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.000504 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.000480 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-g295z\"" Apr 22 18:07:18.000622 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.000526 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\"" Apr 22 18:07:18.000622 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.000538 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 22 18:07:18.099773 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.099732 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.099950 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.099816 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.099950 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.099862 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.099950 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.099922 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.100118 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.099953 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.100118 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.100056 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-b5kgv\" (UniqueName: \"kubernetes.io/projected/4d35f04e-8fac-4086-98e3-aa5b30e3da46-kube-api-access-b5kgv\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.200964 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.200935 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.201114 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.200971 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.201114 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.200996 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.201114 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.201036 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.201114 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.201058 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.201268 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.201213 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5kgv\" (UniqueName: \"kubernetes.io/projected/4d35f04e-8fac-4086-98e3-aa5b30e3da46-kube-api-access-b5kgv\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.201385 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.201352 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.201517 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.201405 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.201517 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.201431 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.201517 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.201486 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.203498 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.203476 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.208757 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.208736 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5kgv\" (UniqueName: 
\"kubernetes.io/projected/4d35f04e-8fac-4086-98e3-aa5b30e3da46-kube-api-access-b5kgv\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.306809 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.306784 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:18.433637 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.433609 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v"] Apr 22 18:07:18.435172 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:07:18.435086 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d35f04e_8fac_4086_98e3_aa5b30e3da46.slice/crio-7fa5fbfc694f19eb187fe472a3a7d6b22d6fcfe73765a967bd462cadfa062b72 WatchSource:0}: Error finding container 7fa5fbfc694f19eb187fe472a3a7d6b22d6fcfe73765a967bd462cadfa062b72: Status 404 returned error can't find the container with id 7fa5fbfc694f19eb187fe472a3a7d6b22d6fcfe73765a967bd462cadfa062b72 Apr 22 18:07:18.582149 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.582116 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" event={"ID":"4d35f04e-8fac-4086-98e3-aa5b30e3da46","Type":"ContainerStarted","Data":"93803429ada9d7c4fae97e67e204ccb0f2693d5dd4ec92f29f31f8362c3e3e21"} Apr 22 18:07:18.582299 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:18.582154 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" 
event={"ID":"4d35f04e-8fac-4086-98e3-aa5b30e3da46","Type":"ContainerStarted","Data":"7fa5fbfc694f19eb187fe472a3a7d6b22d6fcfe73765a967bd462cadfa062b72"} Apr 22 18:07:19.587292 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:19.587199 2564 generic.go:358] "Generic (PLEG): container finished" podID="4d35f04e-8fac-4086-98e3-aa5b30e3da46" containerID="93803429ada9d7c4fae97e67e204ccb0f2693d5dd4ec92f29f31f8362c3e3e21" exitCode=0 Apr 22 18:07:19.587292 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:19.587241 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" event={"ID":"4d35f04e-8fac-4086-98e3-aa5b30e3da46","Type":"ContainerDied","Data":"93803429ada9d7c4fae97e67e204ccb0f2693d5dd4ec92f29f31f8362c3e3e21"} Apr 22 18:07:20.592326 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:20.592290 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" event={"ID":"4d35f04e-8fac-4086-98e3-aa5b30e3da46","Type":"ContainerStarted","Data":"960e8549cbab7f197e1a880147697f655021cbd8da56b22cadb084ecb3d3dd86"} Apr 22 18:07:20.592326 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:20.592322 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" event={"ID":"4d35f04e-8fac-4086-98e3-aa5b30e3da46","Type":"ContainerStarted","Data":"61e78d2806397271b215e58cf135a90e04ce831cd7e6651804c07e759bdc31f8"} Apr 22 18:07:20.592866 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:20.592413 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:20.614164 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:20.614107 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" podStartSLOduration=3.614094492 podStartE2EDuration="3.614094492s" podCreationTimestamp="2026-04-22 18:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:07:20.611749268 +0000 UTC m=+860.670652410" watchObservedRunningTime="2026-04-22 18:07:20.614094492 +0000 UTC m=+860.672997633" Apr 22 18:07:28.307845 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:28.307809 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:28.307845 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:28.307846 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:28.310563 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:28.310538 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:28.624655 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:28.624580 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:07:49.628983 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:07:49.628910 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:08:00.514638 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:00.514607 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log" Apr 22 18:08:00.516017 ip-10-0-128-219 
kubenswrapper[2564]: I0422 18:08:00.515995 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log" Apr 22 18:08:02.902503 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:02.902472 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz"] Apr 22 18:08:02.907608 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:02.907585 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:02.910283 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:02.910265 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 22 18:08:02.910380 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:02.910312 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-tjl56\"" Apr 22 18:08:02.916914 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:02.916889 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz"] Apr 22 18:08:02.980818 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:02.980787 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:02.981014 ip-10-0-128-219 
kubenswrapper[2564]: I0422 18:08:02.980833 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:02.981014 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:02.980878 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:02.981014 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:02.980927 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:02.981014 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:02.981009 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:02.981331 
ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:02.981046 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg4pl\" (UniqueName: \"kubernetes.io/projected/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-kube-api-access-sg4pl\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:03.082234 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:03.082192 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:03.082234 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:03.082233 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:03.082500 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:03.082416 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:03.082500 ip-10-0-128-219 
kubenswrapper[2564]: I0422 18:08:03.082454 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg4pl\" (UniqueName: \"kubernetes.io/projected/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-kube-api-access-sg4pl\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:03.082612 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:03.082509 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:03.082612 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:03.082546 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:03.082741 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:03.082648 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:03.082783 ip-10-0-128-219 kubenswrapper[2564]: 
I0422 18:08:03.082760 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:03.082874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:03.082851 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:03.082934 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:03.082916 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:03.084618 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:03.084601 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:03.091312 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:03.091286 2564 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-sg4pl\" (UniqueName: \"kubernetes.io/projected/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-kube-api-access-sg4pl\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:03.217477 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:03.217444 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:03.351865 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:03.351837 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz"] Apr 22 18:08:03.352800 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:08:03.352778 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod525b62df_1e2c_46d3_a40c_ff5c0748a4c2.slice/crio-f7212625451512db42fa08558dd1c76f2b5b15bb03acac301c5982c4231621e7 WatchSource:0}: Error finding container f7212625451512db42fa08558dd1c76f2b5b15bb03acac301c5982c4231621e7: Status 404 returned error can't find the container with id f7212625451512db42fa08558dd1c76f2b5b15bb03acac301c5982c4231621e7 Apr 22 18:08:03.757093 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:03.757052 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" event={"ID":"525b62df-1e2c-46d3-a40c-ff5c0748a4c2","Type":"ContainerStarted","Data":"97b02c00835c78cd6f4a6c96cb3d73b2d607fa0e6449300272de5ab0156d826a"} Apr 22 18:08:03.757093 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:03.757094 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" 
event={"ID":"525b62df-1e2c-46d3-a40c-ff5c0748a4c2","Type":"ContainerStarted","Data":"f7212625451512db42fa08558dd1c76f2b5b15bb03acac301c5982c4231621e7"} Apr 22 18:08:04.761642 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:04.761607 2564 generic.go:358] "Generic (PLEG): container finished" podID="525b62df-1e2c-46d3-a40c-ff5c0748a4c2" containerID="97b02c00835c78cd6f4a6c96cb3d73b2d607fa0e6449300272de5ab0156d826a" exitCode=0 Apr 22 18:08:04.762043 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:04.761690 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" event={"ID":"525b62df-1e2c-46d3-a40c-ff5c0748a4c2","Type":"ContainerDied","Data":"97b02c00835c78cd6f4a6c96cb3d73b2d607fa0e6449300272de5ab0156d826a"} Apr 22 18:08:05.768433 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:05.768394 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" event={"ID":"525b62df-1e2c-46d3-a40c-ff5c0748a4c2","Type":"ContainerStarted","Data":"249fb1ff21c2051c5821c100cb6a888f6680139bf647e7f11eaeaeaaba8b5b7e"} Apr 22 18:08:05.768433 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:05.768437 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" event={"ID":"525b62df-1e2c-46d3-a40c-ff5c0748a4c2","Type":"ContainerStarted","Data":"6f58afe33c00bd81ed80e8c19d4c740c4c3e2af1d92810ab1a29f5622820a504"} Apr 22 18:08:05.768978 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:05.768509 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:05.802940 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:05.802886 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" podStartSLOduration=3.802870985 podStartE2EDuration="3.802870985s" podCreationTimestamp="2026-04-22 18:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:08:05.798314032 +0000 UTC m=+905.857217173" watchObservedRunningTime="2026-04-22 18:08:05.802870985 +0000 UTC m=+905.861774334" Apr 22 18:08:12.287250 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:12.287216 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v"] Apr 22 18:08:12.287626 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:12.287585 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" podUID="4d35f04e-8fac-4086-98e3-aa5b30e3da46" containerName="main" containerID="cri-o://61e78d2806397271b215e58cf135a90e04ce831cd7e6651804c07e759bdc31f8" gracePeriod=30 Apr 22 18:08:12.287704 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:12.287618 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" podUID="4d35f04e-8fac-4086-98e3-aa5b30e3da46" containerName="tokenizer" containerID="cri-o://960e8549cbab7f197e1a880147697f655021cbd8da56b22cadb084ecb3d3dd86" gracePeriod=30 Apr 22 18:08:12.797069 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:12.797035 2564 generic.go:358] "Generic (PLEG): container finished" podID="4d35f04e-8fac-4086-98e3-aa5b30e3da46" containerID="61e78d2806397271b215e58cf135a90e04ce831cd7e6651804c07e759bdc31f8" exitCode=0 Apr 22 18:08:12.797249 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:12.797075 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" event={"ID":"4d35f04e-8fac-4086-98e3-aa5b30e3da46","Type":"ContainerDied","Data":"61e78d2806397271b215e58cf135a90e04ce831cd7e6651804c07e759bdc31f8"} Apr 22 18:08:13.217972 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.217941 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:13.218170 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.217983 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:13.220785 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.220759 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:13.532566 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.532545 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:08:13.682139 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.682108 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-kserve-provision-location\") pod \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " Apr 22 18:08:13.682288 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.682147 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-tmp\") pod \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " Apr 22 18:08:13.682288 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.682197 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tls-certs\") pod \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " Apr 22 18:08:13.682288 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.682219 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-cache\") pod \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " Apr 22 18:08:13.682288 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.682236 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5kgv\" (UniqueName: \"kubernetes.io/projected/4d35f04e-8fac-4086-98e3-aa5b30e3da46-kube-api-access-b5kgv\") pod \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") 
" Apr 22 18:08:13.682288 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.682267 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-uds\") pod \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\" (UID: \"4d35f04e-8fac-4086-98e3-aa5b30e3da46\") " Apr 22 18:08:13.682585 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.682499 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "4d35f04e-8fac-4086-98e3-aa5b30e3da46" (UID: "4d35f04e-8fac-4086-98e3-aa5b30e3da46"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:13.682641 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.682597 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "4d35f04e-8fac-4086-98e3-aa5b30e3da46" (UID: "4d35f04e-8fac-4086-98e3-aa5b30e3da46"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:13.682641 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.682609 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "4d35f04e-8fac-4086-98e3-aa5b30e3da46" (UID: "4d35f04e-8fac-4086-98e3-aa5b30e3da46"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:13.683069 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.683043 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4d35f04e-8fac-4086-98e3-aa5b30e3da46" (UID: "4d35f04e-8fac-4086-98e3-aa5b30e3da46"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:13.684277 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.684259 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4d35f04e-8fac-4086-98e3-aa5b30e3da46" (UID: "4d35f04e-8fac-4086-98e3-aa5b30e3da46"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:08:13.684404 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.684374 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d35f04e-8fac-4086-98e3-aa5b30e3da46-kube-api-access-b5kgv" (OuterVolumeSpecName: "kube-api-access-b5kgv") pod "4d35f04e-8fac-4086-98e3-aa5b30e3da46" (UID: "4d35f04e-8fac-4086-98e3-aa5b30e3da46"). InnerVolumeSpecName "kube-api-access-b5kgv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:08:13.783855 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.783788 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-kserve-provision-location\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.783855 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.783812 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-tmp\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.783855 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.783823 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tls-certs\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.783855 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.783833 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-cache\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.783855 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.783842 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b5kgv\" (UniqueName: \"kubernetes.io/projected/4d35f04e-8fac-4086-98e3-aa5b30e3da46-kube-api-access-b5kgv\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.783855 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.783851 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4d35f04e-8fac-4086-98e3-aa5b30e3da46-tokenizer-uds\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.802788 ip-10-0-128-219 
kubenswrapper[2564]: I0422 18:08:13.802757 2564 generic.go:358] "Generic (PLEG): container finished" podID="4d35f04e-8fac-4086-98e3-aa5b30e3da46" containerID="960e8549cbab7f197e1a880147697f655021cbd8da56b22cadb084ecb3d3dd86" exitCode=0 Apr 22 18:08:13.802955 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.802851 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" Apr 22 18:08:13.802955 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.802846 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" event={"ID":"4d35f04e-8fac-4086-98e3-aa5b30e3da46","Type":"ContainerDied","Data":"960e8549cbab7f197e1a880147697f655021cbd8da56b22cadb084ecb3d3dd86"} Apr 22 18:08:13.803074 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.802966 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v" event={"ID":"4d35f04e-8fac-4086-98e3-aa5b30e3da46","Type":"ContainerDied","Data":"7fa5fbfc694f19eb187fe472a3a7d6b22d6fcfe73765a967bd462cadfa062b72"} Apr 22 18:08:13.803074 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.802993 2564 scope.go:117] "RemoveContainer" containerID="960e8549cbab7f197e1a880147697f655021cbd8da56b22cadb084ecb3d3dd86" Apr 22 18:08:13.804737 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.804714 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:13.814782 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.814743 2564 scope.go:117] "RemoveContainer" containerID="61e78d2806397271b215e58cf135a90e04ce831cd7e6651804c07e759bdc31f8" Apr 22 18:08:13.822458 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.822428 2564 scope.go:117] "RemoveContainer" 
containerID="93803429ada9d7c4fae97e67e204ccb0f2693d5dd4ec92f29f31f8362c3e3e21" Apr 22 18:08:13.829597 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.829574 2564 scope.go:117] "RemoveContainer" containerID="960e8549cbab7f197e1a880147697f655021cbd8da56b22cadb084ecb3d3dd86" Apr 22 18:08:13.829865 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:08:13.829843 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"960e8549cbab7f197e1a880147697f655021cbd8da56b22cadb084ecb3d3dd86\": container with ID starting with 960e8549cbab7f197e1a880147697f655021cbd8da56b22cadb084ecb3d3dd86 not found: ID does not exist" containerID="960e8549cbab7f197e1a880147697f655021cbd8da56b22cadb084ecb3d3dd86" Apr 22 18:08:13.829926 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.829875 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"960e8549cbab7f197e1a880147697f655021cbd8da56b22cadb084ecb3d3dd86"} err="failed to get container status \"960e8549cbab7f197e1a880147697f655021cbd8da56b22cadb084ecb3d3dd86\": rpc error: code = NotFound desc = could not find container \"960e8549cbab7f197e1a880147697f655021cbd8da56b22cadb084ecb3d3dd86\": container with ID starting with 960e8549cbab7f197e1a880147697f655021cbd8da56b22cadb084ecb3d3dd86 not found: ID does not exist" Apr 22 18:08:13.829926 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.829895 2564 scope.go:117] "RemoveContainer" containerID="61e78d2806397271b215e58cf135a90e04ce831cd7e6651804c07e759bdc31f8" Apr 22 18:08:13.830122 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:08:13.830105 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61e78d2806397271b215e58cf135a90e04ce831cd7e6651804c07e759bdc31f8\": container with ID starting with 61e78d2806397271b215e58cf135a90e04ce831cd7e6651804c07e759bdc31f8 not found: ID does not exist" 
containerID="61e78d2806397271b215e58cf135a90e04ce831cd7e6651804c07e759bdc31f8" Apr 22 18:08:13.830174 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.830126 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61e78d2806397271b215e58cf135a90e04ce831cd7e6651804c07e759bdc31f8"} err="failed to get container status \"61e78d2806397271b215e58cf135a90e04ce831cd7e6651804c07e759bdc31f8\": rpc error: code = NotFound desc = could not find container \"61e78d2806397271b215e58cf135a90e04ce831cd7e6651804c07e759bdc31f8\": container with ID starting with 61e78d2806397271b215e58cf135a90e04ce831cd7e6651804c07e759bdc31f8 not found: ID does not exist" Apr 22 18:08:13.830174 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.830140 2564 scope.go:117] "RemoveContainer" containerID="93803429ada9d7c4fae97e67e204ccb0f2693d5dd4ec92f29f31f8362c3e3e21" Apr 22 18:08:13.830394 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:08:13.830376 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93803429ada9d7c4fae97e67e204ccb0f2693d5dd4ec92f29f31f8362c3e3e21\": container with ID starting with 93803429ada9d7c4fae97e67e204ccb0f2693d5dd4ec92f29f31f8362c3e3e21 not found: ID does not exist" containerID="93803429ada9d7c4fae97e67e204ccb0f2693d5dd4ec92f29f31f8362c3e3e21" Apr 22 18:08:13.830434 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.830406 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93803429ada9d7c4fae97e67e204ccb0f2693d5dd4ec92f29f31f8362c3e3e21"} err="failed to get container status \"93803429ada9d7c4fae97e67e204ccb0f2693d5dd4ec92f29f31f8362c3e3e21\": rpc error: code = NotFound desc = could not find container \"93803429ada9d7c4fae97e67e204ccb0f2693d5dd4ec92f29f31f8362c3e3e21\": container with ID starting with 93803429ada9d7c4fae97e67e204ccb0f2693d5dd4ec92f29f31f8362c3e3e21 not found: ID does not exist" Apr 22 
18:08:13.850938 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.850904 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v"] Apr 22 18:08:13.855899 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:13.855876 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76j6p8v"] Apr 22 18:08:14.554775 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:14.554743 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d35f04e-8fac-4086-98e3-aa5b30e3da46" path="/var/lib/kubelet/pods/4d35f04e-8fac-4086-98e3-aa5b30e3da46/volumes" Apr 22 18:08:17.393516 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.393483 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6"] Apr 22 18:08:17.393899 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.393884 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d35f04e-8fac-4086-98e3-aa5b30e3da46" containerName="tokenizer" Apr 22 18:08:17.393995 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.393901 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d35f04e-8fac-4086-98e3-aa5b30e3da46" containerName="tokenizer" Apr 22 18:08:17.393995 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.393922 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d35f04e-8fac-4086-98e3-aa5b30e3da46" containerName="main" Apr 22 18:08:17.393995 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.393928 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d35f04e-8fac-4086-98e3-aa5b30e3da46" containerName="main" Apr 22 18:08:17.393995 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.393944 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="4d35f04e-8fac-4086-98e3-aa5b30e3da46" containerName="storage-initializer" Apr 22 18:08:17.393995 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.393950 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d35f04e-8fac-4086-98e3-aa5b30e3da46" containerName="storage-initializer" Apr 22 18:08:17.394158 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.394017 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d35f04e-8fac-4086-98e3-aa5b30e3da46" containerName="main" Apr 22 18:08:17.394158 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.394026 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d35f04e-8fac-4086-98e3-aa5b30e3da46" containerName="tokenizer" Apr 22 18:08:17.397145 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.397129 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.399738 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.399717 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 22 18:08:17.399861 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.399775 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-jdclt\"" Apr 22 18:08:17.408663 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.408640 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6"] Apr 22 18:08:17.515562 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.515523 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-tmp\") pod 
\"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.515562 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.515564 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.515795 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.515593 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8klg\" (UniqueName: \"kubernetes.io/projected/3d038028-e486-4aca-9fc2-65042d3a1824-kube-api-access-l8klg\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.515795 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.515611 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.515795 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.515685 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.515899 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.515804 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d038028-e486-4aca-9fc2-65042d3a1824-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.616258 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.616221 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.616436 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.616266 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8klg\" (UniqueName: \"kubernetes.io/projected/3d038028-e486-4aca-9fc2-65042d3a1824-kube-api-access-l8klg\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.616436 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.616286 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.616436 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.616400 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.616611 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.616530 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d038028-e486-4aca-9fc2-65042d3a1824-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.616611 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.616564 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.616611 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.616585 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.616819 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.616647 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.616881 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.616817 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.616920 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.616873 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.618813 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.618796 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3d038028-e486-4aca-9fc2-65042d3a1824-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.624926 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.624905 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8klg\" (UniqueName: \"kubernetes.io/projected/3d038028-e486-4aca-9fc2-65042d3a1824-kube-api-access-l8klg\") pod \"precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:17.707854 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:17.707821 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:18.049248 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:18.049213 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6"] Apr 22 18:08:18.051539 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:08:18.051512 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d038028_e486_4aca_9fc2_65042d3a1824.slice/crio-2086fcbf5ef546c364bc85aa520e9f244bfe9271bb97aa935eaf5b064d1de17d WatchSource:0}: Error finding container 2086fcbf5ef546c364bc85aa520e9f244bfe9271bb97aa935eaf5b064d1de17d: Status 404 returned error can't find the container with id 2086fcbf5ef546c364bc85aa520e9f244bfe9271bb97aa935eaf5b064d1de17d Apr 22 18:08:18.053360 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:18.053343 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 
18:08:18.824612 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:18.824581 2564 generic.go:358] "Generic (PLEG): container finished" podID="3d038028-e486-4aca-9fc2-65042d3a1824" containerID="749fa8f39149f299fb8a8ca199ba4326ecd2e49b0ed01219cb1d591802513fe5" exitCode=0 Apr 22 18:08:18.825063 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:18.824634 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" event={"ID":"3d038028-e486-4aca-9fc2-65042d3a1824","Type":"ContainerDied","Data":"749fa8f39149f299fb8a8ca199ba4326ecd2e49b0ed01219cb1d591802513fe5"} Apr 22 18:08:18.825063 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:18.824656 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" event={"ID":"3d038028-e486-4aca-9fc2-65042d3a1824","Type":"ContainerStarted","Data":"2086fcbf5ef546c364bc85aa520e9f244bfe9271bb97aa935eaf5b064d1de17d"} Apr 22 18:08:19.830187 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:19.830151 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" event={"ID":"3d038028-e486-4aca-9fc2-65042d3a1824","Type":"ContainerStarted","Data":"359e4130e9d49610f4ae772ea29d5c1efcfac5d5eeb1823789d634231aaa1674"} Apr 22 18:08:19.830187 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:19.830186 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" event={"ID":"3d038028-e486-4aca-9fc2-65042d3a1824","Type":"ContainerStarted","Data":"f2ad4ac8202d08999504e6d924f9e8ed3eacaf70c8a519efb841f20f7793e8f4"} Apr 22 18:08:19.830742 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:19.830210 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:19.852530 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:19.852483 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" podStartSLOduration=2.852469744 podStartE2EDuration="2.852469744s" podCreationTimestamp="2026-04-22 18:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:08:19.850464757 +0000 UTC m=+919.909367897" watchObservedRunningTime="2026-04-22 18:08:19.852469744 +0000 UTC m=+919.911372929" Apr 22 18:08:27.708750 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:27.708704 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:27.709229 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:27.708783 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:27.710032 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:08:27.710010 2564 logging.go:55] [core] [Channel #59 SubChannel #60]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.50:9003", ServerName: "10.133.0.50:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.50:9003: connect: connection refused" Apr 22 18:08:27.711333 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:27.711313 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:27.862993 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:27.862960 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:28.709387 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:28.709343 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" podUID="3d038028-e486-4aca-9fc2-65042d3a1824" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.50:9003\" within 1s: context deadline exceeded" Apr 22 18:08:34.810713 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:34.810654 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:08:37.708980 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:08:37.708956 2564 logging.go:55] [core] [Channel #67 SubChannel #68]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.50:9003", ServerName: "10.133.0.50:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.50:9003: connect: connection refused" Apr 22 18:08:38.709538 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:38.709495 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" podUID="3d038028-e486-4aca-9fc2-65042d3a1824" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.50:9003\" within 1s: context deadline exceeded" Apr 22 18:08:38.709971 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:08:38.709823 2564 logging.go:55] [core] [Channel #67 SubChannel #68]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.50:9003", ServerName: "10.133.0.50:9003", }. Err: connection error: desc = "error reading server preface: read tcp 10.133.0.2:58872->10.133.0.50:9003: use of closed network connection" Apr 22 18:08:49.871250 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:49.871220 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:50.968123 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:50.968088 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6"] Apr 22 18:08:50.968585 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:50.968510 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" podUID="3d038028-e486-4aca-9fc2-65042d3a1824" containerName="main" containerID="cri-o://f2ad4ac8202d08999504e6d924f9e8ed3eacaf70c8a519efb841f20f7793e8f4" gracePeriod=30 Apr 22 18:08:50.968659 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:50.968576 2564 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" podUID="3d038028-e486-4aca-9fc2-65042d3a1824" containerName="tokenizer" containerID="cri-o://359e4130e9d49610f4ae772ea29d5c1efcfac5d5eeb1823789d634231aaa1674" gracePeriod=30 Apr 22 18:08:51.959320 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:51.959288 2564 generic.go:358] "Generic (PLEG): container finished" podID="3d038028-e486-4aca-9fc2-65042d3a1824" containerID="f2ad4ac8202d08999504e6d924f9e8ed3eacaf70c8a519efb841f20f7793e8f4" exitCode=0 Apr 22 18:08:51.959482 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:51.959362 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" event={"ID":"3d038028-e486-4aca-9fc2-65042d3a1824","Type":"ContainerDied","Data":"f2ad4ac8202d08999504e6d924f9e8ed3eacaf70c8a519efb841f20f7793e8f4"} Apr 22 18:08:52.422151 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.422131 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:52.506980 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.506897 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-cache\") pod \"3d038028-e486-4aca-9fc2-65042d3a1824\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " Apr 22 18:08:52.506980 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.506947 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-uds\") pod \"3d038028-e486-4aca-9fc2-65042d3a1824\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " Apr 22 18:08:52.506980 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.506975 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d038028-e486-4aca-9fc2-65042d3a1824-tls-certs\") pod \"3d038028-e486-4aca-9fc2-65042d3a1824\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " Apr 22 18:08:52.507263 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.507006 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-tmp\") pod \"3d038028-e486-4aca-9fc2-65042d3a1824\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " Apr 22 18:08:52.507263 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.507044 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-kserve-provision-location\") pod \"3d038028-e486-4aca-9fc2-65042d3a1824\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " Apr 22 
18:08:52.507263 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.507126 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8klg\" (UniqueName: \"kubernetes.io/projected/3d038028-e486-4aca-9fc2-65042d3a1824-kube-api-access-l8klg\") pod \"3d038028-e486-4aca-9fc2-65042d3a1824\" (UID: \"3d038028-e486-4aca-9fc2-65042d3a1824\") " Apr 22 18:08:52.507263 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.507217 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "3d038028-e486-4aca-9fc2-65042d3a1824" (UID: "3d038028-e486-4aca-9fc2-65042d3a1824"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:52.507263 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.507245 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "3d038028-e486-4aca-9fc2-65042d3a1824" (UID: "3d038028-e486-4aca-9fc2-65042d3a1824"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:52.507494 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.507372 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "3d038028-e486-4aca-9fc2-65042d3a1824" (UID: "3d038028-e486-4aca-9fc2-65042d3a1824"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:52.507494 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.507401 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-uds\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:08:52.507494 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.507420 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-cache\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:08:52.507801 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.507776 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3d038028-e486-4aca-9fc2-65042d3a1824" (UID: "3d038028-e486-4aca-9fc2-65042d3a1824"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:52.509098 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.509073 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d038028-e486-4aca-9fc2-65042d3a1824-kube-api-access-l8klg" (OuterVolumeSpecName: "kube-api-access-l8klg") pod "3d038028-e486-4aca-9fc2-65042d3a1824" (UID: "3d038028-e486-4aca-9fc2-65042d3a1824"). InnerVolumeSpecName "kube-api-access-l8klg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:08:52.509159 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.509142 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d038028-e486-4aca-9fc2-65042d3a1824-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3d038028-e486-4aca-9fc2-65042d3a1824" (UID: "3d038028-e486-4aca-9fc2-65042d3a1824"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:08:52.608093 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.608053 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-kserve-provision-location\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:08:52.608093 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.608085 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l8klg\" (UniqueName: \"kubernetes.io/projected/3d038028-e486-4aca-9fc2-65042d3a1824-kube-api-access-l8klg\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:08:52.608324 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.608101 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d038028-e486-4aca-9fc2-65042d3a1824-tls-certs\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:08:52.608324 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.608115 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3d038028-e486-4aca-9fc2-65042d3a1824-tokenizer-tmp\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:08:52.968298 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.968267 2564 generic.go:358] "Generic (PLEG): container finished" podID="3d038028-e486-4aca-9fc2-65042d3a1824" 
containerID="359e4130e9d49610f4ae772ea29d5c1efcfac5d5eeb1823789d634231aaa1674" exitCode=0 Apr 22 18:08:52.968499 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.968345 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" Apr 22 18:08:52.968499 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.968346 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" event={"ID":"3d038028-e486-4aca-9fc2-65042d3a1824","Type":"ContainerDied","Data":"359e4130e9d49610f4ae772ea29d5c1efcfac5d5eeb1823789d634231aaa1674"} Apr 22 18:08:52.968499 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.968391 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6" event={"ID":"3d038028-e486-4aca-9fc2-65042d3a1824","Type":"ContainerDied","Data":"2086fcbf5ef546c364bc85aa520e9f244bfe9271bb97aa935eaf5b064d1de17d"} Apr 22 18:08:52.968499 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.968411 2564 scope.go:117] "RemoveContainer" containerID="359e4130e9d49610f4ae772ea29d5c1efcfac5d5eeb1823789d634231aaa1674" Apr 22 18:08:52.976581 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.976557 2564 scope.go:117] "RemoveContainer" containerID="f2ad4ac8202d08999504e6d924f9e8ed3eacaf70c8a519efb841f20f7793e8f4" Apr 22 18:08:52.984736 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.984716 2564 scope.go:117] "RemoveContainer" containerID="749fa8f39149f299fb8a8ca199ba4326ecd2e49b0ed01219cb1d591802513fe5" Apr 22 18:08:52.987909 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.987876 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6"] Apr 22 18:08:52.990307 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.990287 
2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-fd7687c5rlzs6"] Apr 22 18:08:52.992828 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.992812 2564 scope.go:117] "RemoveContainer" containerID="359e4130e9d49610f4ae772ea29d5c1efcfac5d5eeb1823789d634231aaa1674" Apr 22 18:08:52.993109 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:08:52.993088 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"359e4130e9d49610f4ae772ea29d5c1efcfac5d5eeb1823789d634231aaa1674\": container with ID starting with 359e4130e9d49610f4ae772ea29d5c1efcfac5d5eeb1823789d634231aaa1674 not found: ID does not exist" containerID="359e4130e9d49610f4ae772ea29d5c1efcfac5d5eeb1823789d634231aaa1674" Apr 22 18:08:52.993183 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.993122 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"359e4130e9d49610f4ae772ea29d5c1efcfac5d5eeb1823789d634231aaa1674"} err="failed to get container status \"359e4130e9d49610f4ae772ea29d5c1efcfac5d5eeb1823789d634231aaa1674\": rpc error: code = NotFound desc = could not find container \"359e4130e9d49610f4ae772ea29d5c1efcfac5d5eeb1823789d634231aaa1674\": container with ID starting with 359e4130e9d49610f4ae772ea29d5c1efcfac5d5eeb1823789d634231aaa1674 not found: ID does not exist" Apr 22 18:08:52.993183 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.993151 2564 scope.go:117] "RemoveContainer" containerID="f2ad4ac8202d08999504e6d924f9e8ed3eacaf70c8a519efb841f20f7793e8f4" Apr 22 18:08:52.993387 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:08:52.993371 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2ad4ac8202d08999504e6d924f9e8ed3eacaf70c8a519efb841f20f7793e8f4\": container with ID starting with 
f2ad4ac8202d08999504e6d924f9e8ed3eacaf70c8a519efb841f20f7793e8f4 not found: ID does not exist" containerID="f2ad4ac8202d08999504e6d924f9e8ed3eacaf70c8a519efb841f20f7793e8f4" Apr 22 18:08:52.993432 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.993392 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2ad4ac8202d08999504e6d924f9e8ed3eacaf70c8a519efb841f20f7793e8f4"} err="failed to get container status \"f2ad4ac8202d08999504e6d924f9e8ed3eacaf70c8a519efb841f20f7793e8f4\": rpc error: code = NotFound desc = could not find container \"f2ad4ac8202d08999504e6d924f9e8ed3eacaf70c8a519efb841f20f7793e8f4\": container with ID starting with f2ad4ac8202d08999504e6d924f9e8ed3eacaf70c8a519efb841f20f7793e8f4 not found: ID does not exist" Apr 22 18:08:52.993432 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.993408 2564 scope.go:117] "RemoveContainer" containerID="749fa8f39149f299fb8a8ca199ba4326ecd2e49b0ed01219cb1d591802513fe5" Apr 22 18:08:52.993606 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:08:52.993590 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"749fa8f39149f299fb8a8ca199ba4326ecd2e49b0ed01219cb1d591802513fe5\": container with ID starting with 749fa8f39149f299fb8a8ca199ba4326ecd2e49b0ed01219cb1d591802513fe5 not found: ID does not exist" containerID="749fa8f39149f299fb8a8ca199ba4326ecd2e49b0ed01219cb1d591802513fe5" Apr 22 18:08:52.993658 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:52.993614 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749fa8f39149f299fb8a8ca199ba4326ecd2e49b0ed01219cb1d591802513fe5"} err="failed to get container status \"749fa8f39149f299fb8a8ca199ba4326ecd2e49b0ed01219cb1d591802513fe5\": rpc error: code = NotFound desc = could not find container \"749fa8f39149f299fb8a8ca199ba4326ecd2e49b0ed01219cb1d591802513fe5\": container with ID starting with 
749fa8f39149f299fb8a8ca199ba4326ecd2e49b0ed01219cb1d591802513fe5 not found: ID does not exist" Apr 22 18:08:54.554388 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:08:54.554350 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d038028-e486-4aca-9fc2-65042d3a1824" path="/var/lib/kubelet/pods/3d038028-e486-4aca-9fc2-65042d3a1824/volumes" Apr 22 18:10:42.626294 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:42.626209 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz"] Apr 22 18:10:42.626834 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:42.626524 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" podUID="525b62df-1e2c-46d3-a40c-ff5c0748a4c2" containerName="main" containerID="cri-o://6f58afe33c00bd81ed80e8c19d4c740c4c3e2af1d92810ab1a29f5622820a504" gracePeriod=30 Apr 22 18:10:42.626834 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:42.626578 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" podUID="525b62df-1e2c-46d3-a40c-ff5c0748a4c2" containerName="tokenizer" containerID="cri-o://249fb1ff21c2051c5821c100cb6a888f6680139bf647e7f11eaeaeaaba8b5b7e" gracePeriod=30 Apr 22 18:10:43.390845 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.390758 2564 generic.go:358] "Generic (PLEG): container finished" podID="525b62df-1e2c-46d3-a40c-ff5c0748a4c2" containerID="6f58afe33c00bd81ed80e8c19d4c740c4c3e2af1d92810ab1a29f5622820a504" exitCode=0 Apr 22 18:10:43.390985 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.390836 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" 
event={"ID":"525b62df-1e2c-46d3-a40c-ff5c0748a4c2","Type":"ContainerDied","Data":"6f58afe33c00bd81ed80e8c19d4c740c4c3e2af1d92810ab1a29f5622820a504"} Apr 22 18:10:43.776933 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.776911 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:10:43.821812 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.821790 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg4pl\" (UniqueName: \"kubernetes.io/projected/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-kube-api-access-sg4pl\") pod \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " Apr 22 18:10:43.821944 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.821823 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-uds\") pod \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " Apr 22 18:10:43.821944 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.821867 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-tmp\") pod \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " Apr 22 18:10:43.821944 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.821934 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-kserve-provision-location\") pod \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " Apr 22 18:10:43.822120 ip-10-0-128-219 kubenswrapper[2564]: I0422 
18:10:43.821958 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tls-certs\") pod \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " Apr 22 18:10:43.822120 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.821977 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-cache\") pod \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\" (UID: \"525b62df-1e2c-46d3-a40c-ff5c0748a4c2\") " Apr 22 18:10:43.822222 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.822155 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "525b62df-1e2c-46d3-a40c-ff5c0748a4c2" (UID: "525b62df-1e2c-46d3-a40c-ff5c0748a4c2"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:10:43.822318 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.822261 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "525b62df-1e2c-46d3-a40c-ff5c0748a4c2" (UID: "525b62df-1e2c-46d3-a40c-ff5c0748a4c2"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:10:43.822436 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.822331 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "525b62df-1e2c-46d3-a40c-ff5c0748a4c2" (UID: "525b62df-1e2c-46d3-a40c-ff5c0748a4c2"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:10:43.822695 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.822649 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "525b62df-1e2c-46d3-a40c-ff5c0748a4c2" (UID: "525b62df-1e2c-46d3-a40c-ff5c0748a4c2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:10:43.824103 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.824076 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-kube-api-access-sg4pl" (OuterVolumeSpecName: "kube-api-access-sg4pl") pod "525b62df-1e2c-46d3-a40c-ff5c0748a4c2" (UID: "525b62df-1e2c-46d3-a40c-ff5c0748a4c2"). InnerVolumeSpecName "kube-api-access-sg4pl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:10:43.824190 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.824143 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "525b62df-1e2c-46d3-a40c-ff5c0748a4c2" (UID: "525b62df-1e2c-46d3-a40c-ff5c0748a4c2"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:10:43.923215 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.923167 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-kserve-provision-location\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:10:43.923215 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.923210 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tls-certs\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:10:43.923215 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.923221 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-cache\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:10:43.923215 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.923232 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sg4pl\" (UniqueName: \"kubernetes.io/projected/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-kube-api-access-sg4pl\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:10:43.923485 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.923241 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-uds\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:10:43.923485 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:43.923251 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/525b62df-1e2c-46d3-a40c-ff5c0748a4c2-tokenizer-tmp\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:10:44.396693 ip-10-0-128-219 
kubenswrapper[2564]: I0422 18:10:44.396635 2564 generic.go:358] "Generic (PLEG): container finished" podID="525b62df-1e2c-46d3-a40c-ff5c0748a4c2" containerID="249fb1ff21c2051c5821c100cb6a888f6680139bf647e7f11eaeaeaaba8b5b7e" exitCode=0 Apr 22 18:10:44.396876 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.396698 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" event={"ID":"525b62df-1e2c-46d3-a40c-ff5c0748a4c2","Type":"ContainerDied","Data":"249fb1ff21c2051c5821c100cb6a888f6680139bf647e7f11eaeaeaaba8b5b7e"} Apr 22 18:10:44.396876 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.396725 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" Apr 22 18:10:44.396876 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.396736 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz" event={"ID":"525b62df-1e2c-46d3-a40c-ff5c0748a4c2","Type":"ContainerDied","Data":"f7212625451512db42fa08558dd1c76f2b5b15bb03acac301c5982c4231621e7"} Apr 22 18:10:44.396876 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.396757 2564 scope.go:117] "RemoveContainer" containerID="249fb1ff21c2051c5821c100cb6a888f6680139bf647e7f11eaeaeaaba8b5b7e" Apr 22 18:10:44.405566 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.405548 2564 scope.go:117] "RemoveContainer" containerID="6f58afe33c00bd81ed80e8c19d4c740c4c3e2af1d92810ab1a29f5622820a504" Apr 22 18:10:44.412801 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.412785 2564 scope.go:117] "RemoveContainer" containerID="97b02c00835c78cd6f4a6c96cb3d73b2d607fa0e6449300272de5ab0156d826a" Apr 22 18:10:44.419591 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.419567 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz"] Apr 22 18:10:44.421519 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.421505 2564 scope.go:117] "RemoveContainer" containerID="249fb1ff21c2051c5821c100cb6a888f6680139bf647e7f11eaeaeaaba8b5b7e" Apr 22 18:10:44.421841 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:10:44.421809 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"249fb1ff21c2051c5821c100cb6a888f6680139bf647e7f11eaeaeaaba8b5b7e\": container with ID starting with 249fb1ff21c2051c5821c100cb6a888f6680139bf647e7f11eaeaeaaba8b5b7e not found: ID does not exist" containerID="249fb1ff21c2051c5821c100cb6a888f6680139bf647e7f11eaeaeaaba8b5b7e" Apr 22 18:10:44.421939 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.421845 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"249fb1ff21c2051c5821c100cb6a888f6680139bf647e7f11eaeaeaaba8b5b7e"} err="failed to get container status \"249fb1ff21c2051c5821c100cb6a888f6680139bf647e7f11eaeaeaaba8b5b7e\": rpc error: code = NotFound desc = could not find container \"249fb1ff21c2051c5821c100cb6a888f6680139bf647e7f11eaeaeaaba8b5b7e\": container with ID starting with 249fb1ff21c2051c5821c100cb6a888f6680139bf647e7f11eaeaeaaba8b5b7e not found: ID does not exist" Apr 22 18:10:44.421939 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.421870 2564 scope.go:117] "RemoveContainer" containerID="6f58afe33c00bd81ed80e8c19d4c740c4c3e2af1d92810ab1a29f5622820a504" Apr 22 18:10:44.422257 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:10:44.422147 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f58afe33c00bd81ed80e8c19d4c740c4c3e2af1d92810ab1a29f5622820a504\": container with ID starting with 6f58afe33c00bd81ed80e8c19d4c740c4c3e2af1d92810ab1a29f5622820a504 not found: ID does not exist" 
containerID="6f58afe33c00bd81ed80e8c19d4c740c4c3e2af1d92810ab1a29f5622820a504" Apr 22 18:10:44.422257 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.422176 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f58afe33c00bd81ed80e8c19d4c740c4c3e2af1d92810ab1a29f5622820a504"} err="failed to get container status \"6f58afe33c00bd81ed80e8c19d4c740c4c3e2af1d92810ab1a29f5622820a504\": rpc error: code = NotFound desc = could not find container \"6f58afe33c00bd81ed80e8c19d4c740c4c3e2af1d92810ab1a29f5622820a504\": container with ID starting with 6f58afe33c00bd81ed80e8c19d4c740c4c3e2af1d92810ab1a29f5622820a504 not found: ID does not exist" Apr 22 18:10:44.422257 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.422198 2564 scope.go:117] "RemoveContainer" containerID="97b02c00835c78cd6f4a6c96cb3d73b2d607fa0e6449300272de5ab0156d826a" Apr 22 18:10:44.422561 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:10:44.422537 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97b02c00835c78cd6f4a6c96cb3d73b2d607fa0e6449300272de5ab0156d826a\": container with ID starting with 97b02c00835c78cd6f4a6c96cb3d73b2d607fa0e6449300272de5ab0156d826a not found: ID does not exist" containerID="97b02c00835c78cd6f4a6c96cb3d73b2d607fa0e6449300272de5ab0156d826a" Apr 22 18:10:44.422657 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.422567 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97b02c00835c78cd6f4a6c96cb3d73b2d607fa0e6449300272de5ab0156d826a"} err="failed to get container status \"97b02c00835c78cd6f4a6c96cb3d73b2d607fa0e6449300272de5ab0156d826a\": rpc error: code = NotFound desc = could not find container \"97b02c00835c78cd6f4a6c96cb3d73b2d607fa0e6449300272de5ab0156d826a\": container with ID starting with 97b02c00835c78cd6f4a6c96cb3d73b2d607fa0e6449300272de5ab0156d826a not found: ID does not exist" Apr 22 
18:10:44.424642 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.424621 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schezctdz"] Apr 22 18:10:44.553943 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:44.553908 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="525b62df-1e2c-46d3-a40c-ff5c0748a4c2" path="/var/lib/kubelet/pods/525b62df-1e2c-46d3-a40c-ff5c0748a4c2/volumes" Apr 22 18:10:49.805333 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805301 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4"] Apr 22 18:10:49.805736 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805685 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d038028-e486-4aca-9fc2-65042d3a1824" containerName="main" Apr 22 18:10:49.805736 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805701 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d038028-e486-4aca-9fc2-65042d3a1824" containerName="main" Apr 22 18:10:49.805736 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805714 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="525b62df-1e2c-46d3-a40c-ff5c0748a4c2" containerName="main" Apr 22 18:10:49.805736 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805721 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="525b62df-1e2c-46d3-a40c-ff5c0748a4c2" containerName="main" Apr 22 18:10:49.805736 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805729 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="525b62df-1e2c-46d3-a40c-ff5c0748a4c2" containerName="storage-initializer" Apr 22 18:10:49.805736 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805735 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="525b62df-1e2c-46d3-a40c-ff5c0748a4c2" 
containerName="storage-initializer" Apr 22 18:10:49.805932 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805744 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="525b62df-1e2c-46d3-a40c-ff5c0748a4c2" containerName="tokenizer" Apr 22 18:10:49.805932 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805750 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="525b62df-1e2c-46d3-a40c-ff5c0748a4c2" containerName="tokenizer" Apr 22 18:10:49.805932 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805755 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d038028-e486-4aca-9fc2-65042d3a1824" containerName="storage-initializer" Apr 22 18:10:49.805932 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805761 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d038028-e486-4aca-9fc2-65042d3a1824" containerName="storage-initializer" Apr 22 18:10:49.805932 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805776 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d038028-e486-4aca-9fc2-65042d3a1824" containerName="tokenizer" Apr 22 18:10:49.805932 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805782 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d038028-e486-4aca-9fc2-65042d3a1824" containerName="tokenizer" Apr 22 18:10:49.805932 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805841 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d038028-e486-4aca-9fc2-65042d3a1824" containerName="main" Apr 22 18:10:49.805932 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805855 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="525b62df-1e2c-46d3-a40c-ff5c0748a4c2" containerName="tokenizer" Apr 22 18:10:49.805932 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805864 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d038028-e486-4aca-9fc2-65042d3a1824" 
containerName="tokenizer" Apr 22 18:10:49.805932 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.805871 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="525b62df-1e2c-46d3-a40c-ff5c0748a4c2" containerName="main" Apr 22 18:10:49.809101 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.809078 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.811966 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.811944 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\"" Apr 22 18:10:49.812502 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.812485 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 22 18:10:49.812563 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.812517 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-926zv\"" Apr 22 18:10:49.818369 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.818351 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4"] Apr 22 18:10:49.869456 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.869424 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.869591 ip-10-0-128-219 kubenswrapper[2564]: I0422 
18:10:49.869463 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.869591 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.869557 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.869687 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.869594 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.869726 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.869687 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9nnt\" (UniqueName: \"kubernetes.io/projected/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-kube-api-access-c9nnt\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.869726 ip-10-0-128-219 
kubenswrapper[2564]: I0422 18:10:49.869713 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.970684 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.970641 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9nnt\" (UniqueName: \"kubernetes.io/projected/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-kube-api-access-c9nnt\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.970854 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.970706 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.970854 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.970733 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.970854 ip-10-0-128-219 
kubenswrapper[2564]: I0422 18:10:49.970755 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.970854 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.970807 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.970854 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.970837 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.971212 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.971190 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.971280 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.971217 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.971280 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.971264 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.971350 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.971282 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.973179 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.973160 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:49.978409 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:49.978381 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c9nnt\" (UniqueName: \"kubernetes.io/projected/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-kube-api-access-c9nnt\") pod \"custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:50.120317 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:50.120228 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:50.249226 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:50.249199 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4"] Apr 22 18:10:50.250817 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:10:50.250792 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6cb5ced_34a9_4e87_8bad_27dc35a0150e.slice/crio-cad0e274f1740bdfba2f26831ed206200aa71aafd96eb030f59cbb9553468042 WatchSource:0}: Error finding container cad0e274f1740bdfba2f26831ed206200aa71aafd96eb030f59cbb9553468042: Status 404 returned error can't find the container with id cad0e274f1740bdfba2f26831ed206200aa71aafd96eb030f59cbb9553468042 Apr 22 18:10:50.422243 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:50.422206 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" event={"ID":"f6cb5ced-34a9-4e87-8bad-27dc35a0150e","Type":"ContainerStarted","Data":"45da0e635d130d567f2f2cd4bac6e0b1238a8ecde4dd38e436ef843e8179c586"} Apr 22 18:10:50.422401 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:50.422252 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" 
event={"ID":"f6cb5ced-34a9-4e87-8bad-27dc35a0150e","Type":"ContainerStarted","Data":"cad0e274f1740bdfba2f26831ed206200aa71aafd96eb030f59cbb9553468042"} Apr 22 18:10:51.426943 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:51.426909 2564 generic.go:358] "Generic (PLEG): container finished" podID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" containerID="45da0e635d130d567f2f2cd4bac6e0b1238a8ecde4dd38e436ef843e8179c586" exitCode=0 Apr 22 18:10:51.427425 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:51.426990 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" event={"ID":"f6cb5ced-34a9-4e87-8bad-27dc35a0150e","Type":"ContainerDied","Data":"45da0e635d130d567f2f2cd4bac6e0b1238a8ecde4dd38e436ef843e8179c586"} Apr 22 18:10:52.432849 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:52.432809 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" event={"ID":"f6cb5ced-34a9-4e87-8bad-27dc35a0150e","Type":"ContainerStarted","Data":"018def816cce08f0ea5eaec539485e961021581da3b991e234d27ee5f69ae6b3"} Apr 22 18:10:52.432849 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:52.432857 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" event={"ID":"f6cb5ced-34a9-4e87-8bad-27dc35a0150e","Type":"ContainerStarted","Data":"247244d25fca75bb203745f04fa38e88dc1175cb9bc1a79325a2e3d64b9b7fe7"} Apr 22 18:10:52.433373 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:52.432893 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:10:52.452730 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:10:52.452663 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" podStartSLOduration=3.452649693 podStartE2EDuration="3.452649693s" podCreationTimestamp="2026-04-22 18:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:10:52.450980642 +0000 UTC m=+1072.509883820" watchObservedRunningTime="2026-04-22 18:10:52.452649693 +0000 UTC m=+1072.511552836" Apr 22 18:11:00.121260 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:11:00.121222 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:11:00.121260 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:11:00.121263 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:11:00.123938 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:11:00.123910 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:11:00.471526 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:11:00.471502 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:11:21.476226 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:11:21.476186 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:12:59.804739 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:12:59.804704 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4"] Apr 22 18:12:59.805484 ip-10-0-128-219 kubenswrapper[2564]: 
I0422 18:12:59.805018 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" podUID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" containerName="main" containerID="cri-o://247244d25fca75bb203745f04fa38e88dc1175cb9bc1a79325a2e3d64b9b7fe7" gracePeriod=30 Apr 22 18:12:59.805484 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:12:59.805088 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" podUID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" containerName="tokenizer" containerID="cri-o://018def816cce08f0ea5eaec539485e961021581da3b991e234d27ee5f69ae6b3" gracePeriod=30 Apr 22 18:13:00.471779 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:00.471735 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" podUID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.51:8082/healthz\": dial tcp 10.133.0.51:8082: connect: connection refused" Apr 22 18:13:00.547831 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:00.547800 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log" Apr 22 18:13:00.550171 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:00.550147 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log" Apr 22 18:13:00.917790 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:00.917751 2564 generic.go:358] "Generic (PLEG): container finished" podID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" containerID="247244d25fca75bb203745f04fa38e88dc1175cb9bc1a79325a2e3d64b9b7fe7" exitCode=0 Apr 22 18:13:00.918103 
ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:00.917822 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" event={"ID":"f6cb5ced-34a9-4e87-8bad-27dc35a0150e","Type":"ContainerDied","Data":"247244d25fca75bb203745f04fa38e88dc1175cb9bc1a79325a2e3d64b9b7fe7"} Apr 22 18:13:01.075636 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.075612 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:13:01.149315 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.149293 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-cache\") pod \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " Apr 22 18:13:01.149315 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.149321 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-tmp\") pod \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " Apr 22 18:13:01.149527 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.149339 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9nnt\" (UniqueName: \"kubernetes.io/projected/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-kube-api-access-c9nnt\") pod \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " Apr 22 18:13:01.149527 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.149393 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tls-certs\") pod \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " Apr 22 18:13:01.149527 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.149445 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-uds\") pod \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " Apr 22 18:13:01.149527 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.149517 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-kserve-provision-location\") pod \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\" (UID: \"f6cb5ced-34a9-4e87-8bad-27dc35a0150e\") " Apr 22 18:13:01.149774 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.149577 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f6cb5ced-34a9-4e87-8bad-27dc35a0150e" (UID: "f6cb5ced-34a9-4e87-8bad-27dc35a0150e"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:13:01.149774 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.149637 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f6cb5ced-34a9-4e87-8bad-27dc35a0150e" (UID: "f6cb5ced-34a9-4e87-8bad-27dc35a0150e"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:13:01.149869 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.149784 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f6cb5ced-34a9-4e87-8bad-27dc35a0150e" (UID: "f6cb5ced-34a9-4e87-8bad-27dc35a0150e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:13:01.149869 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.149805 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-cache\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:13:01.149869 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.149826 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-tmp\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:13:01.150195 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.150176 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f6cb5ced-34a9-4e87-8bad-27dc35a0150e" (UID: "f6cb5ced-34a9-4e87-8bad-27dc35a0150e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:13:01.151540 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.151509 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f6cb5ced-34a9-4e87-8bad-27dc35a0150e" (UID: "f6cb5ced-34a9-4e87-8bad-27dc35a0150e"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:13:01.151540 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.151528 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-kube-api-access-c9nnt" (OuterVolumeSpecName: "kube-api-access-c9nnt") pod "f6cb5ced-34a9-4e87-8bad-27dc35a0150e" (UID: "f6cb5ced-34a9-4e87-8bad-27dc35a0150e"). InnerVolumeSpecName "kube-api-access-c9nnt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:13:01.250242 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.250209 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-kserve-provision-location\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:13:01.250242 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.250237 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c9nnt\" (UniqueName: \"kubernetes.io/projected/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-kube-api-access-c9nnt\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:13:01.250420 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.250253 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tls-certs\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:13:01.250420 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.250267 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f6cb5ced-34a9-4e87-8bad-27dc35a0150e-tokenizer-uds\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:13:01.924460 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.924427 2564 generic.go:358] "Generic (PLEG): container finished" 
podID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" containerID="018def816cce08f0ea5eaec539485e961021581da3b991e234d27ee5f69ae6b3" exitCode=0 Apr 22 18:13:01.924991 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.924521 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" Apr 22 18:13:01.924991 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.924520 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" event={"ID":"f6cb5ced-34a9-4e87-8bad-27dc35a0150e","Type":"ContainerDied","Data":"018def816cce08f0ea5eaec539485e961021581da3b991e234d27ee5f69ae6b3"} Apr 22 18:13:01.924991 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.924565 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4" event={"ID":"f6cb5ced-34a9-4e87-8bad-27dc35a0150e","Type":"ContainerDied","Data":"cad0e274f1740bdfba2f26831ed206200aa71aafd96eb030f59cbb9553468042"} Apr 22 18:13:01.924991 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.924581 2564 scope.go:117] "RemoveContainer" containerID="018def816cce08f0ea5eaec539485e961021581da3b991e234d27ee5f69ae6b3" Apr 22 18:13:01.945103 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.945019 2564 scope.go:117] "RemoveContainer" containerID="247244d25fca75bb203745f04fa38e88dc1175cb9bc1a79325a2e3d64b9b7fe7" Apr 22 18:13:01.963024 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.962991 2564 scope.go:117] "RemoveContainer" containerID="45da0e635d130d567f2f2cd4bac6e0b1238a8ecde4dd38e436ef843e8179c586" Apr 22 18:13:01.968858 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.968834 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4"] Apr 22 18:13:01.972687 ip-10-0-128-219 
kubenswrapper[2564]: I0422 18:13:01.972645 2564 scope.go:117] "RemoveContainer" containerID="018def816cce08f0ea5eaec539485e961021581da3b991e234d27ee5f69ae6b3" Apr 22 18:13:01.973079 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:13:01.973053 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"018def816cce08f0ea5eaec539485e961021581da3b991e234d27ee5f69ae6b3\": container with ID starting with 018def816cce08f0ea5eaec539485e961021581da3b991e234d27ee5f69ae6b3 not found: ID does not exist" containerID="018def816cce08f0ea5eaec539485e961021581da3b991e234d27ee5f69ae6b3" Apr 22 18:13:01.973173 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.973093 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018def816cce08f0ea5eaec539485e961021581da3b991e234d27ee5f69ae6b3"} err="failed to get container status \"018def816cce08f0ea5eaec539485e961021581da3b991e234d27ee5f69ae6b3\": rpc error: code = NotFound desc = could not find container \"018def816cce08f0ea5eaec539485e961021581da3b991e234d27ee5f69ae6b3\": container with ID starting with 018def816cce08f0ea5eaec539485e961021581da3b991e234d27ee5f69ae6b3 not found: ID does not exist" Apr 22 18:13:01.973173 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.973121 2564 scope.go:117] "RemoveContainer" containerID="247244d25fca75bb203745f04fa38e88dc1175cb9bc1a79325a2e3d64b9b7fe7" Apr 22 18:13:01.973414 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:13:01.973394 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"247244d25fca75bb203745f04fa38e88dc1175cb9bc1a79325a2e3d64b9b7fe7\": container with ID starting with 247244d25fca75bb203745f04fa38e88dc1175cb9bc1a79325a2e3d64b9b7fe7 not found: ID does not exist" containerID="247244d25fca75bb203745f04fa38e88dc1175cb9bc1a79325a2e3d64b9b7fe7" Apr 22 18:13:01.973490 ip-10-0-128-219 kubenswrapper[2564]: 
I0422 18:13:01.973429 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247244d25fca75bb203745f04fa38e88dc1175cb9bc1a79325a2e3d64b9b7fe7"} err="failed to get container status \"247244d25fca75bb203745f04fa38e88dc1175cb9bc1a79325a2e3d64b9b7fe7\": rpc error: code = NotFound desc = could not find container \"247244d25fca75bb203745f04fa38e88dc1175cb9bc1a79325a2e3d64b9b7fe7\": container with ID starting with 247244d25fca75bb203745f04fa38e88dc1175cb9bc1a79325a2e3d64b9b7fe7 not found: ID does not exist" Apr 22 18:13:01.973490 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.973446 2564 scope.go:117] "RemoveContainer" containerID="45da0e635d130d567f2f2cd4bac6e0b1238a8ecde4dd38e436ef843e8179c586" Apr 22 18:13:01.973771 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:13:01.973749 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45da0e635d130d567f2f2cd4bac6e0b1238a8ecde4dd38e436ef843e8179c586\": container with ID starting with 45da0e635d130d567f2f2cd4bac6e0b1238a8ecde4dd38e436ef843e8179c586 not found: ID does not exist" containerID="45da0e635d130d567f2f2cd4bac6e0b1238a8ecde4dd38e436ef843e8179c586" Apr 22 18:13:01.973857 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.973778 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45da0e635d130d567f2f2cd4bac6e0b1238a8ecde4dd38e436ef843e8179c586"} err="failed to get container status \"45da0e635d130d567f2f2cd4bac6e0b1238a8ecde4dd38e436ef843e8179c586\": rpc error: code = NotFound desc = could not find container \"45da0e635d130d567f2f2cd4bac6e0b1238a8ecde4dd38e436ef843e8179c586\": container with ID starting with 45da0e635d130d567f2f2cd4bac6e0b1238a8ecde4dd38e436ef843e8179c586 not found: ID does not exist" Apr 22 18:13:01.975013 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:01.974995 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-76b8588bkfbx4"] Apr 22 18:13:02.557489 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:02.557444 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" path="/var/lib/kubelet/pods/f6cb5ced-34a9-4e87-8bad-27dc35a0150e/volumes" Apr 22 18:13:20.675518 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.675487 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5"] Apr 22 18:13:20.677490 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.675841 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" containerName="storage-initializer" Apr 22 18:13:20.677490 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.675852 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" containerName="storage-initializer" Apr 22 18:13:20.677490 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.675866 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" containerName="main" Apr 22 18:13:20.677490 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.675872 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" containerName="main" Apr 22 18:13:20.677490 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.675879 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" containerName="tokenizer" Apr 22 18:13:20.677490 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.675885 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" containerName="tokenizer" Apr 22 18:13:20.677490 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.675942 2564 
memory_manager.go:356] "RemoveStaleState removing state" podUID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" containerName="tokenizer" Apr 22 18:13:20.677490 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.675949 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6cb5ced-34a9-4e87-8bad-27dc35a0150e" containerName="main" Apr 22 18:13:20.678890 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.678864 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.682335 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.682297 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 22 18:13:20.682335 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.682298 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\"" Apr 22 18:13:20.682538 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.682298 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-n4c5x\"" Apr 22 18:13:20.698058 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.698034 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5"] Apr 22 18:13:20.811679 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.811639 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.811838 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.811692 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e39b96e-6f6d-44c5-a02c-56800519dcef-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.811838 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.811748 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.811838 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.811800 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.811949 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.811853 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzrl4\" (UniqueName: \"kubernetes.io/projected/2e39b96e-6f6d-44c5-a02c-56800519dcef-kube-api-access-qzrl4\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: 
\"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.811949 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.811908 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.912321 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.912285 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.912321 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.912327 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e39b96e-6f6d-44c5-a02c-56800519dcef-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.912554 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.912346 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: 
\"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.912554 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.912373 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.912554 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.912408 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzrl4\" (UniqueName: \"kubernetes.io/projected/2e39b96e-6f6d-44c5-a02c-56800519dcef-kube-api-access-qzrl4\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.912554 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.912433 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.912786 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.912764 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: 
\"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.912824 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.912775 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.912862 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.912826 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.912897 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.912863 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.914835 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.914812 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e39b96e-6f6d-44c5-a02c-56800519dcef-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.919925 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.919905 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzrl4\" (UniqueName: \"kubernetes.io/projected/2e39b96e-6f6d-44c5-a02c-56800519dcef-kube-api-access-qzrl4\") pod \"router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:20.990547 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:20.990458 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:21.120032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:21.120010 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5"] Apr 22 18:13:21.122173 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:13:21.122144 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e39b96e_6f6d_44c5_a02c_56800519dcef.slice/crio-6ea5f6087d590eba7a85a3a0894843989a27510903eaa0c8e3de09ac4ba12d09 WatchSource:0}: Error finding container 6ea5f6087d590eba7a85a3a0894843989a27510903eaa0c8e3de09ac4ba12d09: Status 404 returned error can't find the container with id 6ea5f6087d590eba7a85a3a0894843989a27510903eaa0c8e3de09ac4ba12d09 Apr 22 18:13:21.123940 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:21.123923 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:13:21.998493 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:21.998464 2564 generic.go:358] "Generic (PLEG): container finished" podID="2e39b96e-6f6d-44c5-a02c-56800519dcef" 
containerID="4b824ef1e8ceac34b3f79807197eb4db56e7af12ccc3c4c51737ed24a7a20756" exitCode=0 Apr 22 18:13:21.998845 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:21.998515 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" event={"ID":"2e39b96e-6f6d-44c5-a02c-56800519dcef","Type":"ContainerDied","Data":"4b824ef1e8ceac34b3f79807197eb4db56e7af12ccc3c4c51737ed24a7a20756"} Apr 22 18:13:21.998845 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:21.998538 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" event={"ID":"2e39b96e-6f6d-44c5-a02c-56800519dcef","Type":"ContainerStarted","Data":"6ea5f6087d590eba7a85a3a0894843989a27510903eaa0c8e3de09ac4ba12d09"} Apr 22 18:13:23.004321 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:23.004288 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" event={"ID":"2e39b96e-6f6d-44c5-a02c-56800519dcef","Type":"ContainerStarted","Data":"3de860d1127f3f8f4ae1ce1e034b5bd17ba358ccb20835939bd1bd8769cb808f"} Apr 22 18:13:23.004321 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:23.004326 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" event={"ID":"2e39b96e-6f6d-44c5-a02c-56800519dcef","Type":"ContainerStarted","Data":"ffc6259af2d45f7745c8f3c52ecfe75da50c95672e18faf73471b9ae8a1ea50c"} Apr 22 18:13:23.004841 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:23.004404 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:23.026254 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:23.026198 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" podStartSLOduration=3.026180878 podStartE2EDuration="3.026180878s" podCreationTimestamp="2026-04-22 18:13:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:13:23.02468978 +0000 UTC m=+1223.083592915" watchObservedRunningTime="2026-04-22 18:13:23.026180878 +0000 UTC m=+1223.085084021" Apr 22 18:13:30.991346 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:30.991313 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:30.991843 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:30.991361 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:30.994148 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:30.994128 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:31.036480 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:31.036451 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:13:53.044449 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:13:53.044422 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:14:00.918979 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:00.918896 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"] Apr 22 18:14:00.919357 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:00.919174 2564 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6" podUID="aec381df-ac02-4e32-bc98-84c3cc7a9da4" containerName="manager" containerID="cri-o://c8a5f3404e510e660a81c5df14a78963845249cb4e8b00818445c36956ccabb5" gracePeriod=30 Apr 22 18:14:01.152884 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:01.152857 2564 generic.go:358] "Generic (PLEG): container finished" podID="aec381df-ac02-4e32-bc98-84c3cc7a9da4" containerID="c8a5f3404e510e660a81c5df14a78963845249cb4e8b00818445c36956ccabb5" exitCode=0 Apr 22 18:14:01.153012 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:01.152940 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6" event={"ID":"aec381df-ac02-4e32-bc98-84c3cc7a9da4","Type":"ContainerDied","Data":"c8a5f3404e510e660a81c5df14a78963845249cb4e8b00818445c36956ccabb5"} Apr 22 18:14:01.153012 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:01.152987 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6" event={"ID":"aec381df-ac02-4e32-bc98-84c3cc7a9da4","Type":"ContainerDied","Data":"39e32327cdf8e4c1d48edceb1f51b5bb68a320a3fe09f26178a3173ec02e8b64"} Apr 22 18:14:01.153012 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:01.153001 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39e32327cdf8e4c1d48edceb1f51b5bb68a320a3fe09f26178a3173ec02e8b64" Apr 22 18:14:01.165813 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:01.165794 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6" Apr 22 18:14:01.225766 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:01.225734 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfmv4\" (UniqueName: \"kubernetes.io/projected/aec381df-ac02-4e32-bc98-84c3cc7a9da4-kube-api-access-zfmv4\") pod \"aec381df-ac02-4e32-bc98-84c3cc7a9da4\" (UID: \"aec381df-ac02-4e32-bc98-84c3cc7a9da4\") " Apr 22 18:14:01.225766 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:01.225770 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aec381df-ac02-4e32-bc98-84c3cc7a9da4-cert\") pod \"aec381df-ac02-4e32-bc98-84c3cc7a9da4\" (UID: \"aec381df-ac02-4e32-bc98-84c3cc7a9da4\") " Apr 22 18:14:01.227874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:01.227839 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec381df-ac02-4e32-bc98-84c3cc7a9da4-kube-api-access-zfmv4" (OuterVolumeSpecName: "kube-api-access-zfmv4") pod "aec381df-ac02-4e32-bc98-84c3cc7a9da4" (UID: "aec381df-ac02-4e32-bc98-84c3cc7a9da4"). InnerVolumeSpecName "kube-api-access-zfmv4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:14:01.227874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:01.227850 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aec381df-ac02-4e32-bc98-84c3cc7a9da4-cert" (OuterVolumeSpecName: "cert") pod "aec381df-ac02-4e32-bc98-84c3cc7a9da4" (UID: "aec381df-ac02-4e32-bc98-84c3cc7a9da4"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:14:01.326734 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:01.326692 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zfmv4\" (UniqueName: \"kubernetes.io/projected/aec381df-ac02-4e32-bc98-84c3cc7a9da4-kube-api-access-zfmv4\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:14:01.326734 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:01.326733 2564 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aec381df-ac02-4e32-bc98-84c3cc7a9da4-cert\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:14:02.157157 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:02.157120 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5b589d76d4-pz7p6" Apr 22 18:14:02.178758 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:02.178733 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"] Apr 22 18:14:02.183262 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:02.183237 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-5b589d76d4-pz7p6"] Apr 22 18:14:02.553327 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:14:02.553297 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec381df-ac02-4e32-bc98-84c3cc7a9da4" path="/var/lib/kubelet/pods/aec381df-ac02-4e32-bc98-84c3cc7a9da4/volumes" Apr 22 18:15:00.611425 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:00.611397 2564 scope.go:117] "RemoveContainer" containerID="c8a5f3404e510e660a81c5df14a78963845249cb4e8b00818445c36956ccabb5" Apr 22 18:15:09.192557 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:09.192514 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5"] Apr 22 18:15:09.193049 
ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:09.192941 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" podUID="2e39b96e-6f6d-44c5-a02c-56800519dcef" containerName="main" containerID="cri-o://ffc6259af2d45f7745c8f3c52ecfe75da50c95672e18faf73471b9ae8a1ea50c" gracePeriod=30 Apr 22 18:15:09.193049 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:09.193016 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" podUID="2e39b96e-6f6d-44c5-a02c-56800519dcef" containerName="tokenizer" containerID="cri-o://3de860d1127f3f8f4ae1ce1e034b5bd17ba358ccb20835939bd1bd8769cb808f" gracePeriod=30 Apr 22 18:15:09.405265 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:09.405232 2564 generic.go:358] "Generic (PLEG): container finished" podID="2e39b96e-6f6d-44c5-a02c-56800519dcef" containerID="ffc6259af2d45f7745c8f3c52ecfe75da50c95672e18faf73471b9ae8a1ea50c" exitCode=0 Apr 22 18:15:09.405445 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:09.405318 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" event={"ID":"2e39b96e-6f6d-44c5-a02c-56800519dcef","Type":"ContainerDied","Data":"ffc6259af2d45f7745c8f3c52ecfe75da50c95672e18faf73471b9ae8a1ea50c"} Apr 22 18:15:10.340135 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.340113 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:15:10.411052 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.411021 2564 generic.go:358] "Generic (PLEG): container finished" podID="2e39b96e-6f6d-44c5-a02c-56800519dcef" containerID="3de860d1127f3f8f4ae1ce1e034b5bd17ba358ccb20835939bd1bd8769cb808f" exitCode=0 Apr 22 18:15:10.411187 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.411100 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" Apr 22 18:15:10.411187 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.411106 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" event={"ID":"2e39b96e-6f6d-44c5-a02c-56800519dcef","Type":"ContainerDied","Data":"3de860d1127f3f8f4ae1ce1e034b5bd17ba358ccb20835939bd1bd8769cb808f"} Apr 22 18:15:10.411187 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.411140 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5" event={"ID":"2e39b96e-6f6d-44c5-a02c-56800519dcef","Type":"ContainerDied","Data":"6ea5f6087d590eba7a85a3a0894843989a27510903eaa0c8e3de09ac4ba12d09"} Apr 22 18:15:10.411187 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.411157 2564 scope.go:117] "RemoveContainer" containerID="3de860d1127f3f8f4ae1ce1e034b5bd17ba358ccb20835939bd1bd8769cb808f" Apr 22 18:15:10.419278 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.419262 2564 scope.go:117] "RemoveContainer" containerID="ffc6259af2d45f7745c8f3c52ecfe75da50c95672e18faf73471b9ae8a1ea50c" Apr 22 18:15:10.426337 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.426319 2564 scope.go:117] "RemoveContainer" containerID="4b824ef1e8ceac34b3f79807197eb4db56e7af12ccc3c4c51737ed24a7a20756" Apr 22 
18:15:10.433212 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.433197 2564 scope.go:117] "RemoveContainer" containerID="3de860d1127f3f8f4ae1ce1e034b5bd17ba358ccb20835939bd1bd8769cb808f" Apr 22 18:15:10.433447 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:15:10.433429 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3de860d1127f3f8f4ae1ce1e034b5bd17ba358ccb20835939bd1bd8769cb808f\": container with ID starting with 3de860d1127f3f8f4ae1ce1e034b5bd17ba358ccb20835939bd1bd8769cb808f not found: ID does not exist" containerID="3de860d1127f3f8f4ae1ce1e034b5bd17ba358ccb20835939bd1bd8769cb808f" Apr 22 18:15:10.433490 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.433457 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de860d1127f3f8f4ae1ce1e034b5bd17ba358ccb20835939bd1bd8769cb808f"} err="failed to get container status \"3de860d1127f3f8f4ae1ce1e034b5bd17ba358ccb20835939bd1bd8769cb808f\": rpc error: code = NotFound desc = could not find container \"3de860d1127f3f8f4ae1ce1e034b5bd17ba358ccb20835939bd1bd8769cb808f\": container with ID starting with 3de860d1127f3f8f4ae1ce1e034b5bd17ba358ccb20835939bd1bd8769cb808f not found: ID does not exist" Apr 22 18:15:10.433490 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.433474 2564 scope.go:117] "RemoveContainer" containerID="ffc6259af2d45f7745c8f3c52ecfe75da50c95672e18faf73471b9ae8a1ea50c" Apr 22 18:15:10.433780 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:15:10.433763 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc6259af2d45f7745c8f3c52ecfe75da50c95672e18faf73471b9ae8a1ea50c\": container with ID starting with ffc6259af2d45f7745c8f3c52ecfe75da50c95672e18faf73471b9ae8a1ea50c not found: ID does not exist" containerID="ffc6259af2d45f7745c8f3c52ecfe75da50c95672e18faf73471b9ae8a1ea50c" Apr 22 18:15:10.433831 
ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.433787 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc6259af2d45f7745c8f3c52ecfe75da50c95672e18faf73471b9ae8a1ea50c"} err="failed to get container status \"ffc6259af2d45f7745c8f3c52ecfe75da50c95672e18faf73471b9ae8a1ea50c\": rpc error: code = NotFound desc = could not find container \"ffc6259af2d45f7745c8f3c52ecfe75da50c95672e18faf73471b9ae8a1ea50c\": container with ID starting with ffc6259af2d45f7745c8f3c52ecfe75da50c95672e18faf73471b9ae8a1ea50c not found: ID does not exist" Apr 22 18:15:10.433831 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.433802 2564 scope.go:117] "RemoveContainer" containerID="4b824ef1e8ceac34b3f79807197eb4db56e7af12ccc3c4c51737ed24a7a20756" Apr 22 18:15:10.434038 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:15:10.434019 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b824ef1e8ceac34b3f79807197eb4db56e7af12ccc3c4c51737ed24a7a20756\": container with ID starting with 4b824ef1e8ceac34b3f79807197eb4db56e7af12ccc3c4c51737ed24a7a20756 not found: ID does not exist" containerID="4b824ef1e8ceac34b3f79807197eb4db56e7af12ccc3c4c51737ed24a7a20756" Apr 22 18:15:10.434091 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.434046 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b824ef1e8ceac34b3f79807197eb4db56e7af12ccc3c4c51737ed24a7a20756"} err="failed to get container status \"4b824ef1e8ceac34b3f79807197eb4db56e7af12ccc3c4c51737ed24a7a20756\": rpc error: code = NotFound desc = could not find container \"4b824ef1e8ceac34b3f79807197eb4db56e7af12ccc3c4c51737ed24a7a20756\": container with ID starting with 4b824ef1e8ceac34b3f79807197eb4db56e7af12ccc3c4c51737ed24a7a20756 not found: ID does not exist" Apr 22 18:15:10.482042 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.482019 2564 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-cache\") pod \"2e39b96e-6f6d-44c5-a02c-56800519dcef\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " Apr 22 18:15:10.482170 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.482068 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzrl4\" (UniqueName: \"kubernetes.io/projected/2e39b96e-6f6d-44c5-a02c-56800519dcef-kube-api-access-qzrl4\") pod \"2e39b96e-6f6d-44c5-a02c-56800519dcef\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " Apr 22 18:15:10.482170 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.482106 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e39b96e-6f6d-44c5-a02c-56800519dcef-tls-certs\") pod \"2e39b96e-6f6d-44c5-a02c-56800519dcef\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " Apr 22 18:15:10.482170 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.482131 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-uds\") pod \"2e39b96e-6f6d-44c5-a02c-56800519dcef\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " Apr 22 18:15:10.482345 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.482186 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-tmp\") pod \"2e39b96e-6f6d-44c5-a02c-56800519dcef\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " Apr 22 18:15:10.482345 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.482238 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-kserve-provision-location\") pod \"2e39b96e-6f6d-44c5-a02c-56800519dcef\" (UID: \"2e39b96e-6f6d-44c5-a02c-56800519dcef\") " Apr 22 18:15:10.482345 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.482240 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2e39b96e-6f6d-44c5-a02c-56800519dcef" (UID: "2e39b96e-6f6d-44c5-a02c-56800519dcef"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:15:10.482489 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.482468 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2e39b96e-6f6d-44c5-a02c-56800519dcef" (UID: "2e39b96e-6f6d-44c5-a02c-56800519dcef"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:15:10.482561 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.482534 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-cache\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:15:10.482612 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.482551 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2e39b96e-6f6d-44c5-a02c-56800519dcef" (UID: "2e39b96e-6f6d-44c5-a02c-56800519dcef"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:15:10.483096 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.483071 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2e39b96e-6f6d-44c5-a02c-56800519dcef" (UID: "2e39b96e-6f6d-44c5-a02c-56800519dcef"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:15:10.490605 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.484304 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e39b96e-6f6d-44c5-a02c-56800519dcef-kube-api-access-qzrl4" (OuterVolumeSpecName: "kube-api-access-qzrl4") pod "2e39b96e-6f6d-44c5-a02c-56800519dcef" (UID: "2e39b96e-6f6d-44c5-a02c-56800519dcef"). InnerVolumeSpecName "kube-api-access-qzrl4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:15:10.490605 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.484339 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e39b96e-6f6d-44c5-a02c-56800519dcef-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2e39b96e-6f6d-44c5-a02c-56800519dcef" (UID: "2e39b96e-6f6d-44c5-a02c-56800519dcef"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:15:10.583294 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.583261 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-kserve-provision-location\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:15:10.583294 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.583292 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qzrl4\" (UniqueName: \"kubernetes.io/projected/2e39b96e-6f6d-44c5-a02c-56800519dcef-kube-api-access-qzrl4\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:15:10.583420 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.583302 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e39b96e-6f6d-44c5-a02c-56800519dcef-tls-certs\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:15:10.583420 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.583311 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-uds\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:15:10.583420 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.583322 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e39b96e-6f6d-44c5-a02c-56800519dcef-tokenizer-tmp\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:15:10.731930 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.731861 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5"] Apr 22 18:15:10.733991 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:10.733972 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69b4664558-kz2p5"] Apr 22 18:15:12.554143 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:12.554110 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e39b96e-6f6d-44c5-a02c-56800519dcef" path="/var/lib/kubelet/pods/2e39b96e-6f6d-44c5-a02c-56800519dcef/volumes" Apr 22 18:15:27.189998 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.189965 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"] Apr 22 18:15:27.190390 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.190340 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e39b96e-6f6d-44c5-a02c-56800519dcef" containerName="main" Apr 22 18:15:27.190390 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.190352 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e39b96e-6f6d-44c5-a02c-56800519dcef" containerName="main" Apr 22 18:15:27.190390 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.190361 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aec381df-ac02-4e32-bc98-84c3cc7a9da4" containerName="manager" Apr 22 18:15:27.190390 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.190368 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec381df-ac02-4e32-bc98-84c3cc7a9da4" containerName="manager" Apr 22 18:15:27.190390 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.190378 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e39b96e-6f6d-44c5-a02c-56800519dcef" containerName="storage-initializer" Apr 22 18:15:27.190390 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.190386 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e39b96e-6f6d-44c5-a02c-56800519dcef" containerName="storage-initializer" Apr 22 18:15:27.190390 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.190393 2564 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e39b96e-6f6d-44c5-a02c-56800519dcef" containerName="tokenizer" Apr 22 18:15:27.190633 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.190398 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e39b96e-6f6d-44c5-a02c-56800519dcef" containerName="tokenizer" Apr 22 18:15:27.190633 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.190458 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e39b96e-6f6d-44c5-a02c-56800519dcef" containerName="main" Apr 22 18:15:27.190633 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.190468 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e39b96e-6f6d-44c5-a02c-56800519dcef" containerName="tokenizer" Apr 22 18:15:27.190633 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.190475 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="aec381df-ac02-4e32-bc98-84c3cc7a9da4" containerName="manager" Apr 22 18:15:27.193696 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.193660 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq" Apr 22 18:15:27.197494 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.197473 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-pqdr7\"" Apr 22 18:15:27.197638 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.197534 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\"" Apr 22 18:15:27.197638 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.197546 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 22 18:15:27.204426 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.204406 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"] Apr 22 18:15:27.322819 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.322778 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq" Apr 22 18:15:27.322819 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.322832 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d02ec8fe-7398-40a1-8689-4431965cf264-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.323080 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.322942 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.323080 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.322977 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.323080 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.323045 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.323080 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.323073 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4lmw\" (UniqueName: \"kubernetes.io/projected/d02ec8fe-7398-40a1-8689-4431965cf264-kube-api-access-n4lmw\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.424270 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.424235 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.424270 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.424271 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.424487 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.424304 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.424487 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.424442 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4lmw\" (UniqueName: \"kubernetes.io/projected/d02ec8fe-7398-40a1-8689-4431965cf264-kube-api-access-n4lmw\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.424588 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.424559 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.424648 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.424600 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d02ec8fe-7398-40a1-8689-4431965cf264-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.424727 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.424658 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.424727 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.424706 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.424831 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.424748 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.424898 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.424876 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.427219 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.427198 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d02ec8fe-7398-40a1-8689-4431965cf264-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.437102 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.437074 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4lmw\" (UniqueName: \"kubernetes.io/projected/d02ec8fe-7398-40a1-8689-4431965cf264-kube-api-access-n4lmw\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.503746 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.503695 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:27.841007 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:27.840970 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"]
Apr 22 18:15:27.842891 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:15:27.842864 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd02ec8fe_7398_40a1_8689_4431965cf264.slice/crio-23f7cb45edd8c9cf621a0de8b91963c880b000c163bb0099b3faa0a797187724 WatchSource:0}: Error finding container 23f7cb45edd8c9cf621a0de8b91963c880b000c163bb0099b3faa0a797187724: Status 404 returned error can't find the container with id 23f7cb45edd8c9cf621a0de8b91963c880b000c163bb0099b3faa0a797187724
Apr 22 18:15:28.480368 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:28.480334 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq" event={"ID":"d02ec8fe-7398-40a1-8689-4431965cf264","Type":"ContainerStarted","Data":"2fc2964b65a6e25844e5b36cbd4ce794de6f573c1f4e2fd11825f1c330f5e0e1"}
Apr 22 18:15:28.480368 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:28.480373 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq" event={"ID":"d02ec8fe-7398-40a1-8689-4431965cf264","Type":"ContainerStarted","Data":"23f7cb45edd8c9cf621a0de8b91963c880b000c163bb0099b3faa0a797187724"}
Apr 22 18:15:29.485010 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:29.484965 2564 generic.go:358] "Generic (PLEG): container finished" podID="d02ec8fe-7398-40a1-8689-4431965cf264" containerID="2fc2964b65a6e25844e5b36cbd4ce794de6f573c1f4e2fd11825f1c330f5e0e1" exitCode=0
Apr 22 18:15:29.485496 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:29.485061 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq" event={"ID":"d02ec8fe-7398-40a1-8689-4431965cf264","Type":"ContainerDied","Data":"2fc2964b65a6e25844e5b36cbd4ce794de6f573c1f4e2fd11825f1c330f5e0e1"}
Apr 22 18:15:30.490869 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:30.490834 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq" event={"ID":"d02ec8fe-7398-40a1-8689-4431965cf264","Type":"ContainerStarted","Data":"246d13a90c00cef3070abb34ac9f71fc14622e4fd40ae72f686c4ff3b8e7d288"}
Apr 22 18:15:30.490869 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:30.490870 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq" event={"ID":"d02ec8fe-7398-40a1-8689-4431965cf264","Type":"ContainerStarted","Data":"08e6f96a94b3e9b6362274cc48442357a6955ffd234f63f5d067afd41d8c0090"}
Apr 22 18:15:30.491279 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:30.491037 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:30.516750 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:30.516698 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq" podStartSLOduration=3.516685157 podStartE2EDuration="3.516685157s" podCreationTimestamp="2026-04-22 18:15:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:15:30.513868089 +0000 UTC m=+1350.572771287" watchObservedRunningTime="2026-04-22 18:15:30.516685157 +0000 UTC m=+1350.575588295"
Apr 22 18:15:37.504511 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:37.504475 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:37.505017 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:37.504557 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:37.507025 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:37.507004 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:37.517050 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:37.517034 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:15:59.532038 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:15:59.532007 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:18:00.575897 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:00.575868 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log"
Apr 22 18:18:00.580822 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:00.580801 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log"
Apr 22 18:18:14.974408 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:14.974312 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"]
Apr 22 18:18:14.974961 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:14.974752 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq" podUID="d02ec8fe-7398-40a1-8689-4431965cf264" containerName="main" containerID="cri-o://08e6f96a94b3e9b6362274cc48442357a6955ffd234f63f5d067afd41d8c0090" gracePeriod=30
Apr 22 18:18:14.974961 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:14.974820 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq" podUID="d02ec8fe-7398-40a1-8689-4431965cf264" containerName="tokenizer" containerID="cri-o://246d13a90c00cef3070abb34ac9f71fc14622e4fd40ae72f686c4ff3b8e7d288" gracePeriod=30
Apr 22 18:18:15.118877 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:15.118846 2564 generic.go:358] "Generic (PLEG): container finished" podID="d02ec8fe-7398-40a1-8689-4431965cf264" containerID="08e6f96a94b3e9b6362274cc48442357a6955ffd234f63f5d067afd41d8c0090" exitCode=0
Apr 22 18:18:15.119036 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:15.118901 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq" event={"ID":"d02ec8fe-7398-40a1-8689-4431965cf264","Type":"ContainerDied","Data":"08e6f96a94b3e9b6362274cc48442357a6955ffd234f63f5d067afd41d8c0090"}
Apr 22 18:18:16.328539 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.328511 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:18:16.427430 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.427383 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-cache\") pod \"d02ec8fe-7398-40a1-8689-4431965cf264\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") "
Apr 22 18:18:16.427430 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.427444 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-uds\") pod \"d02ec8fe-7398-40a1-8689-4431965cf264\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") "
Apr 22 18:18:16.427731 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.427462 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-tmp\") pod \"d02ec8fe-7398-40a1-8689-4431965cf264\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") "
Apr 22 18:18:16.427731 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.427498 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-kserve-provision-location\") pod \"d02ec8fe-7398-40a1-8689-4431965cf264\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") "
Apr 22 18:18:16.427731 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.427553 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d02ec8fe-7398-40a1-8689-4431965cf264-tls-certs\") pod \"d02ec8fe-7398-40a1-8689-4431965cf264\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") "
Apr 22 18:18:16.427731 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.427574 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4lmw\" (UniqueName: \"kubernetes.io/projected/d02ec8fe-7398-40a1-8689-4431965cf264-kube-api-access-n4lmw\") pod \"d02ec8fe-7398-40a1-8689-4431965cf264\" (UID: \"d02ec8fe-7398-40a1-8689-4431965cf264\") "
Apr 22 18:18:16.427731 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.427687 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d02ec8fe-7398-40a1-8689-4431965cf264" (UID: "d02ec8fe-7398-40a1-8689-4431965cf264"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:18:16.427731 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.427705 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d02ec8fe-7398-40a1-8689-4431965cf264" (UID: "d02ec8fe-7398-40a1-8689-4431965cf264"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:18:16.428069 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.427805 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d02ec8fe-7398-40a1-8689-4431965cf264" (UID: "d02ec8fe-7398-40a1-8689-4431965cf264"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:18:16.428069 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.427913 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-uds\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:18:16.428069 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.427934 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-tmp\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:18:16.428069 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.427951 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-tokenizer-cache\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:18:16.428201 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.428187 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d02ec8fe-7398-40a1-8689-4431965cf264" (UID: "d02ec8fe-7398-40a1-8689-4431965cf264"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:18:16.429732 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.429708 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02ec8fe-7398-40a1-8689-4431965cf264-kube-api-access-n4lmw" (OuterVolumeSpecName: "kube-api-access-n4lmw") pod "d02ec8fe-7398-40a1-8689-4431965cf264" (UID: "d02ec8fe-7398-40a1-8689-4431965cf264"). InnerVolumeSpecName "kube-api-access-n4lmw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:18:16.429794 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.429724 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02ec8fe-7398-40a1-8689-4431965cf264-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d02ec8fe-7398-40a1-8689-4431965cf264" (UID: "d02ec8fe-7398-40a1-8689-4431965cf264"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:18:16.529151 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.529111 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d02ec8fe-7398-40a1-8689-4431965cf264-kserve-provision-location\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:18:16.529151 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.529144 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d02ec8fe-7398-40a1-8689-4431965cf264-tls-certs\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:18:16.529151 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:16.529154 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n4lmw\" (UniqueName: \"kubernetes.io/projected/d02ec8fe-7398-40a1-8689-4431965cf264-kube-api-access-n4lmw\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\""
Apr 22 18:18:17.128718 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.128658 2564 generic.go:358] "Generic (PLEG): container finished" podID="d02ec8fe-7398-40a1-8689-4431965cf264" containerID="246d13a90c00cef3070abb34ac9f71fc14622e4fd40ae72f686c4ff3b8e7d288" exitCode=0
Apr 22 18:18:17.128919 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.128726 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq" event={"ID":"d02ec8fe-7398-40a1-8689-4431965cf264","Type":"ContainerDied","Data":"246d13a90c00cef3070abb34ac9f71fc14622e4fd40ae72f686c4ff3b8e7d288"}
Apr 22 18:18:17.128919 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.128751 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"
Apr 22 18:18:17.128919 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.128769 2564 scope.go:117] "RemoveContainer" containerID="246d13a90c00cef3070abb34ac9f71fc14622e4fd40ae72f686c4ff3b8e7d288"
Apr 22 18:18:17.128919 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.128759 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq" event={"ID":"d02ec8fe-7398-40a1-8689-4431965cf264","Type":"ContainerDied","Data":"23f7cb45edd8c9cf621a0de8b91963c880b000c163bb0099b3faa0a797187724"}
Apr 22 18:18:17.137375 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.137356 2564 scope.go:117] "RemoveContainer" containerID="08e6f96a94b3e9b6362274cc48442357a6955ffd234f63f5d067afd41d8c0090"
Apr 22 18:18:17.145123 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.145103 2564 scope.go:117] "RemoveContainer" containerID="2fc2964b65a6e25844e5b36cbd4ce794de6f573c1f4e2fd11825f1c330f5e0e1"
Apr 22 18:18:17.151622 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.151589 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"]
Apr 22 18:18:17.154749 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.154730 2564 scope.go:117] "RemoveContainer" containerID="246d13a90c00cef3070abb34ac9f71fc14622e4fd40ae72f686c4ff3b8e7d288"
Apr 22 18:18:17.155029 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:18:17.155010 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246d13a90c00cef3070abb34ac9f71fc14622e4fd40ae72f686c4ff3b8e7d288\": container with ID starting with 246d13a90c00cef3070abb34ac9f71fc14622e4fd40ae72f686c4ff3b8e7d288 not found: ID does not exist" containerID="246d13a90c00cef3070abb34ac9f71fc14622e4fd40ae72f686c4ff3b8e7d288"
Apr 22 18:18:17.155104 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.155044 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246d13a90c00cef3070abb34ac9f71fc14622e4fd40ae72f686c4ff3b8e7d288"} err="failed to get container status \"246d13a90c00cef3070abb34ac9f71fc14622e4fd40ae72f686c4ff3b8e7d288\": rpc error: code = NotFound desc = could not find container \"246d13a90c00cef3070abb34ac9f71fc14622e4fd40ae72f686c4ff3b8e7d288\": container with ID starting with 246d13a90c00cef3070abb34ac9f71fc14622e4fd40ae72f686c4ff3b8e7d288 not found: ID does not exist"
Apr 22 18:18:17.155104 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.155072 2564 scope.go:117] "RemoveContainer" containerID="08e6f96a94b3e9b6362274cc48442357a6955ffd234f63f5d067afd41d8c0090"
Apr 22 18:18:17.155309 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:18:17.155292 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08e6f96a94b3e9b6362274cc48442357a6955ffd234f63f5d067afd41d8c0090\": container with ID starting with 08e6f96a94b3e9b6362274cc48442357a6955ffd234f63f5d067afd41d8c0090 not found: ID does not exist" containerID="08e6f96a94b3e9b6362274cc48442357a6955ffd234f63f5d067afd41d8c0090"
Apr 22 18:18:17.155349 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.155307 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheqk9fq"]
Apr 22 18:18:17.155349 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.155315 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08e6f96a94b3e9b6362274cc48442357a6955ffd234f63f5d067afd41d8c0090"} err="failed to get container status \"08e6f96a94b3e9b6362274cc48442357a6955ffd234f63f5d067afd41d8c0090\": rpc error: code = NotFound desc = could not find container \"08e6f96a94b3e9b6362274cc48442357a6955ffd234f63f5d067afd41d8c0090\": container with ID starting with 08e6f96a94b3e9b6362274cc48442357a6955ffd234f63f5d067afd41d8c0090 not found: ID does not exist"
Apr 22 18:18:17.155349 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.155331 2564 scope.go:117] "RemoveContainer" containerID="2fc2964b65a6e25844e5b36cbd4ce794de6f573c1f4e2fd11825f1c330f5e0e1"
Apr 22 18:18:17.155567 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:18:17.155551 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc2964b65a6e25844e5b36cbd4ce794de6f573c1f4e2fd11825f1c330f5e0e1\": container with ID starting with 2fc2964b65a6e25844e5b36cbd4ce794de6f573c1f4e2fd11825f1c330f5e0e1 not found: ID does not exist" containerID="2fc2964b65a6e25844e5b36cbd4ce794de6f573c1f4e2fd11825f1c330f5e0e1"
Apr 22 18:18:17.155610 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:17.155571 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc2964b65a6e25844e5b36cbd4ce794de6f573c1f4e2fd11825f1c330f5e0e1"} err="failed to get container status \"2fc2964b65a6e25844e5b36cbd4ce794de6f573c1f4e2fd11825f1c330f5e0e1\": rpc error: code = NotFound desc = could not find container \"2fc2964b65a6e25844e5b36cbd4ce794de6f573c1f4e2fd11825f1c330f5e0e1\": container with ID starting with 2fc2964b65a6e25844e5b36cbd4ce794de6f573c1f4e2fd11825f1c330f5e0e1 not found: ID does not exist"
Apr 22 18:18:18.553969 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:18.553938 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02ec8fe-7398-40a1-8689-4431965cf264" path="/var/lib/kubelet/pods/d02ec8fe-7398-40a1-8689-4431965cf264/volumes"
Apr 22 18:18:31.871289 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.871259 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"]
Apr 22 18:18:31.873810 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.871680 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d02ec8fe-7398-40a1-8689-4431965cf264" containerName="storage-initializer"
Apr 22 18:18:31.873810 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.871694 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02ec8fe-7398-40a1-8689-4431965cf264" containerName="storage-initializer"
Apr 22 18:18:31.873810 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.871712 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d02ec8fe-7398-40a1-8689-4431965cf264" containerName="main"
Apr 22 18:18:31.873810 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.871717 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02ec8fe-7398-40a1-8689-4431965cf264" containerName="main"
Apr 22 18:18:31.873810 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.871723 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d02ec8fe-7398-40a1-8689-4431965cf264" containerName="tokenizer"
Apr 22 18:18:31.873810 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.871728 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02ec8fe-7398-40a1-8689-4431965cf264" containerName="tokenizer"
Apr 22 18:18:31.873810 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.871793 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="d02ec8fe-7398-40a1-8689-4431965cf264" containerName="tokenizer"
Apr 22 18:18:31.873810 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.871804 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="d02ec8fe-7398-40a1-8689-4431965cf264" containerName="main"
Apr 22 18:18:31.874866 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.874846 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:31.878384 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.878363 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\""
Apr 22 18:18:31.878503 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.878429 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 22 18:18:31.878503 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.878431 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-xxmd4\""
Apr 22 18:18:31.884436 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.884411 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"]
Apr 22 18:18:31.965018 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.964977 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:31.965018 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.965023 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:31.965273 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.965121 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f43b209a-c264-46e2-ac59-3ebad3374032-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:31.965273 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.965169 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:31.965273 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.965220 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:31.965273 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:31.965259 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2bxd\" (UniqueName: \"kubernetes.io/projected/f43b209a-c264-46e2-ac59-3ebad3374032-kube-api-access-h2bxd\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:32.066330 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.066286 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:32.066523 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.066347 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:32.066523 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.066416 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f43b209a-c264-46e2-ac59-3ebad3374032-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:32.066648 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.066573 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:32.066732 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.066644 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:32.066732 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.066724 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2bxd\" (UniqueName: \"kubernetes.io/projected/f43b209a-c264-46e2-ac59-3ebad3374032-kube-api-access-h2bxd\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:32.066841 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.066780 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:32.066899 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.066839 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:32.067030 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.067008 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:32.067148 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.067124 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:32.069451 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.069432 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f43b209a-c264-46e2-ac59-3ebad3374032-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"
Apr 22 18:18:32.073977 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.073958 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2bxd\" (UniqueName:
\"kubernetes.io/projected/f43b209a-c264-46e2-ac59-3ebad3374032-kube-api-access-h2bxd\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" Apr 22 18:18:32.185441 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.185404 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" Apr 22 18:18:32.317165 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.317137 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"] Apr 22 18:18:32.318378 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:18:32.318342 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf43b209a_c264_46e2_ac59_3ebad3374032.slice/crio-94ba9bc31864336182e08900143a6cce0dff9cb97239d88dc9fc4ae02478f911 WatchSource:0}: Error finding container 94ba9bc31864336182e08900143a6cce0dff9cb97239d88dc9fc4ae02478f911: Status 404 returned error can't find the container with id 94ba9bc31864336182e08900143a6cce0dff9cb97239d88dc9fc4ae02478f911 Apr 22 18:18:32.320350 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:32.320331 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:18:33.194220 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:33.194179 2564 generic.go:358] "Generic (PLEG): container finished" podID="f43b209a-c264-46e2-ac59-3ebad3374032" containerID="ee5e873b534b0503c36a6761f50d30654aa95db0981b5b5335ec9970c73207e3" exitCode=0 Apr 22 18:18:33.194603 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:33.194263 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" event={"ID":"f43b209a-c264-46e2-ac59-3ebad3374032","Type":"ContainerDied","Data":"ee5e873b534b0503c36a6761f50d30654aa95db0981b5b5335ec9970c73207e3"} Apr 22 18:18:33.194603 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:33.194303 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" event={"ID":"f43b209a-c264-46e2-ac59-3ebad3374032","Type":"ContainerStarted","Data":"94ba9bc31864336182e08900143a6cce0dff9cb97239d88dc9fc4ae02478f911"} Apr 22 18:18:34.200738 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:34.200699 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" event={"ID":"f43b209a-c264-46e2-ac59-3ebad3374032","Type":"ContainerStarted","Data":"325f782abf9bdcbf01eefd733ae9fca101aba1e8947dd7cd87589e79cab3c8d1"} Apr 22 18:18:34.200738 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:34.200736 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" event={"ID":"f43b209a-c264-46e2-ac59-3ebad3374032","Type":"ContainerStarted","Data":"b0f5dd89817e97e4a5fce64064341ae3b2b1ff755a51ed3b6d16d33d66cd3900"} Apr 22 18:18:34.201308 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:34.200805 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" Apr 22 18:18:34.226941 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:34.226894 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" podStartSLOduration=3.226879185 podStartE2EDuration="3.226879185s" podCreationTimestamp="2026-04-22 18:18:31 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:18:34.22334691 +0000 UTC m=+1534.282250080" watchObservedRunningTime="2026-04-22 18:18:34.226879185 +0000 UTC m=+1534.285782334" Apr 22 18:18:42.186095 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:42.186059 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" Apr 22 18:18:42.186095 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:42.186103 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" Apr 22 18:18:42.188715 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:42.188691 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" Apr 22 18:18:42.236126 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:18:42.236100 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" Apr 22 18:19:01.025224 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.025187 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb"] Apr 22 18:19:01.029997 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.029972 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.032850 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.032825 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 22 18:19:01.032989 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.032877 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-7rp6q\"" Apr 22 18:19:01.048284 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.048259 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb"] Apr 22 18:19:01.122032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.121995 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.122032 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.122033 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr4p9\" (UniqueName: \"kubernetes.io/projected/72a1797e-7fd7-438e-9ad8-ea42f537a31c-kube-api-access-dr4p9\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.122249 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.122058 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.122249 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.122116 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.122249 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.122168 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.122249 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.122186 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.222826 ip-10-0-128-219 kubenswrapper[2564]: I0422 
18:19:01.222789 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.223020 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.222854 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.223020 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.222881 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.223020 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.222935 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.223260 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.223068 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dr4p9\" (UniqueName: \"kubernetes.io/projected/72a1797e-7fd7-438e-9ad8-ea42f537a31c-kube-api-access-dr4p9\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.223260 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.223119 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.223371 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.223298 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.223371 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.223316 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.223371 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.223342 2564 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.223517 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.223499 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.225502 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.225481 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.243604 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.243573 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr4p9\" (UniqueName: \"kubernetes.io/projected/72a1797e-7fd7-438e-9ad8-ea42f537a31c-kube-api-access-dr4p9\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.340247 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.340156 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:01.481176 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:01.481144 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb"] Apr 22 18:19:01.482838 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:19:01.482772 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72a1797e_7fd7_438e_9ad8_ea42f537a31c.slice/crio-8067cc5dc2f24830c4572f6fb0729efca8fcd3c77b72bac88d7df6467f94f925 WatchSource:0}: Error finding container 8067cc5dc2f24830c4572f6fb0729efca8fcd3c77b72bac88d7df6467f94f925: Status 404 returned error can't find the container with id 8067cc5dc2f24830c4572f6fb0729efca8fcd3c77b72bac88d7df6467f94f925 Apr 22 18:19:02.315766 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:02.315732 2564 generic.go:358] "Generic (PLEG): container finished" podID="72a1797e-7fd7-438e-9ad8-ea42f537a31c" containerID="37df59200e1ba110d63f24f8b44728740f9038a6f08006b4e60b443bd7328807" exitCode=0 Apr 22 18:19:02.316157 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:02.315808 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" event={"ID":"72a1797e-7fd7-438e-9ad8-ea42f537a31c","Type":"ContainerDied","Data":"37df59200e1ba110d63f24f8b44728740f9038a6f08006b4e60b443bd7328807"} Apr 22 18:19:02.316157 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:02.315838 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" event={"ID":"72a1797e-7fd7-438e-9ad8-ea42f537a31c","Type":"ContainerStarted","Data":"8067cc5dc2f24830c4572f6fb0729efca8fcd3c77b72bac88d7df6467f94f925"} Apr 22 18:19:03.240119 ip-10-0-128-219 kubenswrapper[2564]: I0422 
18:19:03.240087 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" Apr 22 18:19:03.322551 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:03.322509 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" event={"ID":"72a1797e-7fd7-438e-9ad8-ea42f537a31c","Type":"ContainerStarted","Data":"e33441d1236750ecc3d8bf342aa035f0d01f10786990fb8a201cfa9013d7b109"} Apr 22 18:19:03.322954 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:03.322557 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" event={"ID":"72a1797e-7fd7-438e-9ad8-ea42f537a31c","Type":"ContainerStarted","Data":"d1cf571f45397ae5c030d6df5922bf6ac6c4a97301ccc8cf062c215d11cb5d4d"} Apr 22 18:19:03.322954 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:03.322593 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:03.341835 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:03.341575 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" podStartSLOduration=2.341556923 podStartE2EDuration="2.341556923s" podCreationTimestamp="2026-04-22 18:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:19:03.340976143 +0000 UTC m=+1563.399879277" watchObservedRunningTime="2026-04-22 18:19:03.341556923 +0000 UTC m=+1563.400460066" Apr 22 18:19:11.341031 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:11.340997 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:11.341477 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:11.341041 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:11.343837 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:11.343816 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:11.354679 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:11.354644 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:32.359620 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:32.359587 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:33.431528 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:33.431487 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb"] Apr 22 18:19:33.432020 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:33.431907 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" podUID="72a1797e-7fd7-438e-9ad8-ea42f537a31c" containerName="main" containerID="cri-o://d1cf571f45397ae5c030d6df5922bf6ac6c4a97301ccc8cf062c215d11cb5d4d" gracePeriod=30 Apr 22 18:19:33.432099 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:33.432028 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" 
podUID="72a1797e-7fd7-438e-9ad8-ea42f537a31c" containerName="tokenizer" containerID="cri-o://e33441d1236750ecc3d8bf342aa035f0d01f10786990fb8a201cfa9013d7b109" gracePeriod=30 Apr 22 18:19:34.442280 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.442246 2564 generic.go:358] "Generic (PLEG): container finished" podID="72a1797e-7fd7-438e-9ad8-ea42f537a31c" containerID="d1cf571f45397ae5c030d6df5922bf6ac6c4a97301ccc8cf062c215d11cb5d4d" exitCode=0 Apr 22 18:19:34.442654 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.442299 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" event={"ID":"72a1797e-7fd7-438e-9ad8-ea42f537a31c","Type":"ContainerDied","Data":"d1cf571f45397ae5c030d6df5922bf6ac6c4a97301ccc8cf062c215d11cb5d4d"} Apr 22 18:19:34.786989 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.786964 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:34.818340 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.818309 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-cache\") pod \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " Apr 22 18:19:34.818528 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.818383 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-uds\") pod \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " Apr 22 18:19:34.818528 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.818420 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-tmp\") pod \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " Apr 22 18:19:34.818691 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.818565 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tls-certs\") pod \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " Apr 22 18:19:34.818691 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.818570 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "72a1797e-7fd7-438e-9ad8-ea42f537a31c" (UID: "72a1797e-7fd7-438e-9ad8-ea42f537a31c"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:19:34.818691 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.818592 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-kserve-provision-location\") pod \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " Apr 22 18:19:34.818691 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.818632 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr4p9\" (UniqueName: \"kubernetes.io/projected/72a1797e-7fd7-438e-9ad8-ea42f537a31c-kube-api-access-dr4p9\") pod \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\" (UID: \"72a1797e-7fd7-438e-9ad8-ea42f537a31c\") " Apr 22 18:19:34.818691 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.818678 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "72a1797e-7fd7-438e-9ad8-ea42f537a31c" (UID: "72a1797e-7fd7-438e-9ad8-ea42f537a31c"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:19:34.818943 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.818770 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "72a1797e-7fd7-438e-9ad8-ea42f537a31c" (UID: "72a1797e-7fd7-438e-9ad8-ea42f537a31c"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:19:34.818985 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.818943 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-cache\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:19:34.818985 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.818965 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-uds\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:19:34.818985 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.818981 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tokenizer-tmp\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:19:34.819323 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.819305 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"72a1797e-7fd7-438e-9ad8-ea42f537a31c" (UID: "72a1797e-7fd7-438e-9ad8-ea42f537a31c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:19:34.820730 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.820702 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a1797e-7fd7-438e-9ad8-ea42f537a31c-kube-api-access-dr4p9" (OuterVolumeSpecName: "kube-api-access-dr4p9") pod "72a1797e-7fd7-438e-9ad8-ea42f537a31c" (UID: "72a1797e-7fd7-438e-9ad8-ea42f537a31c"). InnerVolumeSpecName "kube-api-access-dr4p9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:19:34.821024 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.821006 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "72a1797e-7fd7-438e-9ad8-ea42f537a31c" (UID: "72a1797e-7fd7-438e-9ad8-ea42f537a31c"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:19:34.919722 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.919652 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72a1797e-7fd7-438e-9ad8-ea42f537a31c-kserve-provision-location\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:19:34.919722 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.919719 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dr4p9\" (UniqueName: \"kubernetes.io/projected/72a1797e-7fd7-438e-9ad8-ea42f537a31c-kube-api-access-dr4p9\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:19:34.919930 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:34.919736 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72a1797e-7fd7-438e-9ad8-ea42f537a31c-tls-certs\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:19:35.448303 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.448270 2564 generic.go:358] "Generic (PLEG): container finished" podID="72a1797e-7fd7-438e-9ad8-ea42f537a31c" containerID="e33441d1236750ecc3d8bf342aa035f0d01f10786990fb8a201cfa9013d7b109" exitCode=0 Apr 22 18:19:35.448743 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.448310 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" event={"ID":"72a1797e-7fd7-438e-9ad8-ea42f537a31c","Type":"ContainerDied","Data":"e33441d1236750ecc3d8bf342aa035f0d01f10786990fb8a201cfa9013d7b109"} Apr 22 18:19:35.448743 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.448339 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" 
event={"ID":"72a1797e-7fd7-438e-9ad8-ea42f537a31c","Type":"ContainerDied","Data":"8067cc5dc2f24830c4572f6fb0729efca8fcd3c77b72bac88d7df6467f94f925"} Apr 22 18:19:35.448743 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.448344 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb" Apr 22 18:19:35.448743 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.448358 2564 scope.go:117] "RemoveContainer" containerID="e33441d1236750ecc3d8bf342aa035f0d01f10786990fb8a201cfa9013d7b109" Apr 22 18:19:35.458247 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.458225 2564 scope.go:117] "RemoveContainer" containerID="d1cf571f45397ae5c030d6df5922bf6ac6c4a97301ccc8cf062c215d11cb5d4d" Apr 22 18:19:35.466356 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.466334 2564 scope.go:117] "RemoveContainer" containerID="37df59200e1ba110d63f24f8b44728740f9038a6f08006b4e60b443bd7328807" Apr 22 18:19:35.472787 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.472761 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb"] Apr 22 18:19:35.476016 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.475993 2564 scope.go:117] "RemoveContainer" containerID="e33441d1236750ecc3d8bf342aa035f0d01f10786990fb8a201cfa9013d7b109" Apr 22 18:19:35.476281 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:19:35.476263 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e33441d1236750ecc3d8bf342aa035f0d01f10786990fb8a201cfa9013d7b109\": container with ID starting with e33441d1236750ecc3d8bf342aa035f0d01f10786990fb8a201cfa9013d7b109 not found: ID does not exist" containerID="e33441d1236750ecc3d8bf342aa035f0d01f10786990fb8a201cfa9013d7b109" Apr 22 18:19:35.476368 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.476288 2564 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33441d1236750ecc3d8bf342aa035f0d01f10786990fb8a201cfa9013d7b109"} err="failed to get container status \"e33441d1236750ecc3d8bf342aa035f0d01f10786990fb8a201cfa9013d7b109\": rpc error: code = NotFound desc = could not find container \"e33441d1236750ecc3d8bf342aa035f0d01f10786990fb8a201cfa9013d7b109\": container with ID starting with e33441d1236750ecc3d8bf342aa035f0d01f10786990fb8a201cfa9013d7b109 not found: ID does not exist" Apr 22 18:19:35.476368 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.476306 2564 scope.go:117] "RemoveContainer" containerID="d1cf571f45397ae5c030d6df5922bf6ac6c4a97301ccc8cf062c215d11cb5d4d" Apr 22 18:19:35.476499 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:19:35.476483 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1cf571f45397ae5c030d6df5922bf6ac6c4a97301ccc8cf062c215d11cb5d4d\": container with ID starting with d1cf571f45397ae5c030d6df5922bf6ac6c4a97301ccc8cf062c215d11cb5d4d not found: ID does not exist" containerID="d1cf571f45397ae5c030d6df5922bf6ac6c4a97301ccc8cf062c215d11cb5d4d" Apr 22 18:19:35.476552 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.476503 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1cf571f45397ae5c030d6df5922bf6ac6c4a97301ccc8cf062c215d11cb5d4d"} err="failed to get container status \"d1cf571f45397ae5c030d6df5922bf6ac6c4a97301ccc8cf062c215d11cb5d4d\": rpc error: code = NotFound desc = could not find container \"d1cf571f45397ae5c030d6df5922bf6ac6c4a97301ccc8cf062c215d11cb5d4d\": container with ID starting with d1cf571f45397ae5c030d6df5922bf6ac6c4a97301ccc8cf062c215d11cb5d4d not found: ID does not exist" Apr 22 18:19:35.476552 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.476517 2564 scope.go:117] "RemoveContainer" 
containerID="37df59200e1ba110d63f24f8b44728740f9038a6f08006b4e60b443bd7328807" Apr 22 18:19:35.476777 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:19:35.476755 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37df59200e1ba110d63f24f8b44728740f9038a6f08006b4e60b443bd7328807\": container with ID starting with 37df59200e1ba110d63f24f8b44728740f9038a6f08006b4e60b443bd7328807 not found: ID does not exist" containerID="37df59200e1ba110d63f24f8b44728740f9038a6f08006b4e60b443bd7328807" Apr 22 18:19:35.476866 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.476777 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37df59200e1ba110d63f24f8b44728740f9038a6f08006b4e60b443bd7328807"} err="failed to get container status \"37df59200e1ba110d63f24f8b44728740f9038a6f08006b4e60b443bd7328807\": rpc error: code = NotFound desc = could not find container \"37df59200e1ba110d63f24f8b44728740f9038a6f08006b4e60b443bd7328807\": container with ID starting with 37df59200e1ba110d63f24f8b44728740f9038a6f08006b4e60b443bd7328807 not found: ID does not exist" Apr 22 18:19:35.477061 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:35.477043 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7ff9ft4jzb"] Apr 22 18:19:36.553806 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:19:36.553767 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a1797e-7fd7-438e-9ad8-ea42f537a31c" path="/var/lib/kubelet/pods/72a1797e-7fd7-438e-9ad8-ea42f537a31c/volumes" Apr 22 18:21:14.571590 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:14.571554 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"] Apr 22 18:21:14.572142 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:14.571892 2564 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" podUID="f43b209a-c264-46e2-ac59-3ebad3374032" containerName="main" containerID="cri-o://b0f5dd89817e97e4a5fce64064341ae3b2b1ff755a51ed3b6d16d33d66cd3900" gracePeriod=30 Apr 22 18:21:14.572142 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:14.571943 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" podUID="f43b209a-c264-46e2-ac59-3ebad3374032" containerName="tokenizer" containerID="cri-o://325f782abf9bdcbf01eefd733ae9fca101aba1e8947dd7cd87589e79cab3c8d1" gracePeriod=30 Apr 22 18:21:14.811105 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:14.811071 2564 generic.go:358] "Generic (PLEG): container finished" podID="f43b209a-c264-46e2-ac59-3ebad3374032" containerID="b0f5dd89817e97e4a5fce64064341ae3b2b1ff755a51ed3b6d16d33d66cd3900" exitCode=0 Apr 22 18:21:14.811274 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:14.811148 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" event={"ID":"f43b209a-c264-46e2-ac59-3ebad3374032","Type":"ContainerDied","Data":"b0f5dd89817e97e4a5fce64064341ae3b2b1ff755a51ed3b6d16d33d66cd3900"} Apr 22 18:21:15.818403 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.818375 2564 generic.go:358] "Generic (PLEG): container finished" podID="f43b209a-c264-46e2-ac59-3ebad3374032" containerID="325f782abf9bdcbf01eefd733ae9fca101aba1e8947dd7cd87589e79cab3c8d1" exitCode=0 Apr 22 18:21:15.818733 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.818448 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" 
event={"ID":"f43b209a-c264-46e2-ac59-3ebad3374032","Type":"ContainerDied","Data":"325f782abf9bdcbf01eefd733ae9fca101aba1e8947dd7cd87589e79cab3c8d1"} Apr 22 18:21:15.818733 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.818493 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" event={"ID":"f43b209a-c264-46e2-ac59-3ebad3374032","Type":"ContainerDied","Data":"94ba9bc31864336182e08900143a6cce0dff9cb97239d88dc9fc4ae02478f911"} Apr 22 18:21:15.818733 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.818509 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94ba9bc31864336182e08900143a6cce0dff9cb97239d88dc9fc4ae02478f911" Apr 22 18:21:15.829625 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.829608 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" Apr 22 18:21:15.915624 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.915538 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-uds\") pod \"f43b209a-c264-46e2-ac59-3ebad3374032\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " Apr 22 18:21:15.915624 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.915578 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2bxd\" (UniqueName: \"kubernetes.io/projected/f43b209a-c264-46e2-ac59-3ebad3374032-kube-api-access-h2bxd\") pod \"f43b209a-c264-46e2-ac59-3ebad3374032\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " Apr 22 18:21:15.915624 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.915602 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-cache\") pod \"f43b209a-c264-46e2-ac59-3ebad3374032\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " Apr 22 18:21:15.915624 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.915621 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-kserve-provision-location\") pod \"f43b209a-c264-46e2-ac59-3ebad3374032\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " Apr 22 18:21:15.915979 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.915734 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-tmp\") pod \"f43b209a-c264-46e2-ac59-3ebad3374032\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " Apr 22 18:21:15.915979 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.915764 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f43b209a-c264-46e2-ac59-3ebad3374032-tls-certs\") pod \"f43b209a-c264-46e2-ac59-3ebad3374032\" (UID: \"f43b209a-c264-46e2-ac59-3ebad3374032\") " Apr 22 18:21:15.915979 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.915812 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f43b209a-c264-46e2-ac59-3ebad3374032" (UID: "f43b209a-c264-46e2-ac59-3ebad3374032"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:21:15.915979 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.915847 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f43b209a-c264-46e2-ac59-3ebad3374032" (UID: "f43b209a-c264-46e2-ac59-3ebad3374032"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:21:15.916200 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.916025 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-uds\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:21:15.916200 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.916042 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-cache\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:21:15.916200 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.916095 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f43b209a-c264-46e2-ac59-3ebad3374032" (UID: "f43b209a-c264-46e2-ac59-3ebad3374032"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:21:15.916498 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.916467 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f43b209a-c264-46e2-ac59-3ebad3374032" (UID: "f43b209a-c264-46e2-ac59-3ebad3374032"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:21:15.917887 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.917864 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43b209a-c264-46e2-ac59-3ebad3374032-kube-api-access-h2bxd" (OuterVolumeSpecName: "kube-api-access-h2bxd") pod "f43b209a-c264-46e2-ac59-3ebad3374032" (UID: "f43b209a-c264-46e2-ac59-3ebad3374032"). InnerVolumeSpecName "kube-api-access-h2bxd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:21:15.918033 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:15.917929 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43b209a-c264-46e2-ac59-3ebad3374032-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f43b209a-c264-46e2-ac59-3ebad3374032" (UID: "f43b209a-c264-46e2-ac59-3ebad3374032"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:21:16.017073 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:16.017041 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-tokenizer-tmp\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:21:16.017073 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:16.017068 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f43b209a-c264-46e2-ac59-3ebad3374032-tls-certs\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:21:16.017073 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:16.017078 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h2bxd\" (UniqueName: \"kubernetes.io/projected/f43b209a-c264-46e2-ac59-3ebad3374032-kube-api-access-h2bxd\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:21:16.017304 
ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:16.017089 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f43b209a-c264-46e2-ac59-3ebad3374032-kserve-provision-location\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:21:16.821786 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:16.821690 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8" Apr 22 18:21:16.850389 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:16.850360 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"] Apr 22 18:21:16.861122 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:16.861097 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-549c5nzsg8"] Apr 22 18:21:18.554649 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:18.554612 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f43b209a-c264-46e2-ac59-3ebad3374032" path="/var/lib/kubelet/pods/f43b209a-c264-46e2-ac59-3ebad3374032/volumes" Apr 22 18:21:40.680355 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680323 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl"] Apr 22 18:21:40.680874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680653 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f43b209a-c264-46e2-ac59-3ebad3374032" containerName="storage-initializer" Apr 22 18:21:40.680874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680680 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43b209a-c264-46e2-ac59-3ebad3374032" containerName="storage-initializer" Apr 22 18:21:40.680874 
ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680698 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72a1797e-7fd7-438e-9ad8-ea42f537a31c" containerName="storage-initializer" Apr 22 18:21:40.680874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680706 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a1797e-7fd7-438e-9ad8-ea42f537a31c" containerName="storage-initializer" Apr 22 18:21:40.680874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680721 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f43b209a-c264-46e2-ac59-3ebad3374032" containerName="tokenizer" Apr 22 18:21:40.680874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680727 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43b209a-c264-46e2-ac59-3ebad3374032" containerName="tokenizer" Apr 22 18:21:40.680874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680741 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72a1797e-7fd7-438e-9ad8-ea42f537a31c" containerName="tokenizer" Apr 22 18:21:40.680874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680747 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a1797e-7fd7-438e-9ad8-ea42f537a31c" containerName="tokenizer" Apr 22 18:21:40.680874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680756 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f43b209a-c264-46e2-ac59-3ebad3374032" containerName="main" Apr 22 18:21:40.680874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680761 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43b209a-c264-46e2-ac59-3ebad3374032" containerName="main" Apr 22 18:21:40.680874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680771 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72a1797e-7fd7-438e-9ad8-ea42f537a31c" containerName="main" Apr 22 18:21:40.680874 
ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680778 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a1797e-7fd7-438e-9ad8-ea42f537a31c" containerName="main" Apr 22 18:21:40.680874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680835 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="72a1797e-7fd7-438e-9ad8-ea42f537a31c" containerName="main" Apr 22 18:21:40.680874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680843 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="72a1797e-7fd7-438e-9ad8-ea42f537a31c" containerName="tokenizer" Apr 22 18:21:40.680874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680850 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="f43b209a-c264-46e2-ac59-3ebad3374032" containerName="tokenizer" Apr 22 18:21:40.680874 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.680864 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="f43b209a-c264-46e2-ac59-3ebad3374032" containerName="main" Apr 22 18:21:40.687246 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.687148 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.691008 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.690989 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-hswrg\"" Apr 22 18:21:40.691143 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.691005 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\"" Apr 22 18:21:40.691143 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.691056 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 22 18:21:40.697211 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.697184 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl"] Apr 22 18:21:40.812062 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.812027 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.812216 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.812076 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.812216 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.812153 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.812216 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.812181 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7rjn\" (UniqueName: \"kubernetes.io/projected/bda3bb2b-e64d-4eea-a698-201fb5ae9143-kube-api-access-k7rjn\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.812349 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.812230 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.812349 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.812254 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" 
(UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.913656 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.913626 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.913863 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.913683 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.913863 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.913733 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.913863 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.913776 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: 
\"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.913863 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.913822 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.913863 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.913845 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7rjn\" (UniqueName: \"kubernetes.io/projected/bda3bb2b-e64d-4eea-a698-201fb5ae9143-kube-api-access-k7rjn\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.914142 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.914076 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.914215 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.914191 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.914262 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.914221 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.914262 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.914197 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.916266 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.916247 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.922444 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.922421 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7rjn\" (UniqueName: \"kubernetes.io/projected/bda3bb2b-e64d-4eea-a698-201fb5ae9143-kube-api-access-k7rjn\") pod \"router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:40.998368 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:40.998306 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:41.128303 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:41.128280 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl"] Apr 22 18:21:41.129852 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:21:41.129824 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda3bb2b_e64d_4eea_a698_201fb5ae9143.slice/crio-269d95132a94be59d971d9319b67e880f8d2df0176b3fb8deee24ca41efd3968 WatchSource:0}: Error finding container 269d95132a94be59d971d9319b67e880f8d2df0176b3fb8deee24ca41efd3968: Status 404 returned error can't find the container with id 269d95132a94be59d971d9319b67e880f8d2df0176b3fb8deee24ca41efd3968 Apr 22 18:21:41.915963 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:41.915923 2564 generic.go:358] "Generic (PLEG): container finished" podID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" containerID="162847e292dd4e4acd33431e7634a56ced5f34231045c32661ea531be21340f0" exitCode=0 Apr 22 18:21:41.916370 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:41.916003 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" event={"ID":"bda3bb2b-e64d-4eea-a698-201fb5ae9143","Type":"ContainerDied","Data":"162847e292dd4e4acd33431e7634a56ced5f34231045c32661ea531be21340f0"} Apr 22 18:21:41.916370 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:41.916039 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" 
event={"ID":"bda3bb2b-e64d-4eea-a698-201fb5ae9143","Type":"ContainerStarted","Data":"269d95132a94be59d971d9319b67e880f8d2df0176b3fb8deee24ca41efd3968"} Apr 22 18:21:42.921486 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:42.921454 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" event={"ID":"bda3bb2b-e64d-4eea-a698-201fb5ae9143","Type":"ContainerStarted","Data":"520495a8f84d69b07be7b091cd88bc2febaf6e5bd5354da593ae33bbae3ee2fb"} Apr 22 18:21:42.921486 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:42.921489 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" event={"ID":"bda3bb2b-e64d-4eea-a698-201fb5ae9143","Type":"ContainerStarted","Data":"06611293121db5ea88e4d4b0ca056c9c1d4402884cc685bdf3734e233d0bd94a"} Apr 22 18:21:42.921934 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:42.921601 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:42.943911 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:42.943859 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" podStartSLOduration=2.943844703 podStartE2EDuration="2.943844703s" podCreationTimestamp="2026-04-22 18:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:21:42.943485578 +0000 UTC m=+1723.002388719" watchObservedRunningTime="2026-04-22 18:21:42.943844703 +0000 UTC m=+1723.002747844" Apr 22 18:21:50.998831 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:50.998797 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:50.998831 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:50.998841 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:51.001388 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:51.001364 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:21:51.963385 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:21:51.963345 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:22:12.963253 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:22:12.963223 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:23:00.604513 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:23:00.604483 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log" Apr 22 18:23:00.610523 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:23:00.610504 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log" Apr 22 18:24:12.572079 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:12.571996 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl"] Apr 22 18:24:12.572655 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:12.572371 2564 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" podUID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" containerName="main" containerID="cri-o://06611293121db5ea88e4d4b0ca056c9c1d4402884cc685bdf3734e233d0bd94a" gracePeriod=30 Apr 22 18:24:12.572655 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:12.572431 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" podUID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" containerName="tokenizer" containerID="cri-o://520495a8f84d69b07be7b091cd88bc2febaf6e5bd5354da593ae33bbae3ee2fb" gracePeriod=30 Apr 22 18:24:12.962198 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:24:12.962162 2564 logging.go:55] [core] [Channel #541 SubChannel #542]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.56:9003", ServerName: "10.133.0.56:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.56:9003: connect: connection refused" Apr 22 18:24:13.507547 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:13.507512 2564 generic.go:358] "Generic (PLEG): container finished" podID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" containerID="06611293121db5ea88e4d4b0ca056c9c1d4402884cc685bdf3734e233d0bd94a" exitCode=0 Apr 22 18:24:13.507766 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:13.507594 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" event={"ID":"bda3bb2b-e64d-4eea-a698-201fb5ae9143","Type":"ContainerDied","Data":"06611293121db5ea88e4d4b0ca056c9c1d4402884cc685bdf3734e233d0bd94a"} Apr 22 18:24:13.920997 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:13.920968 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:24:13.961829 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:13.961792 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" podUID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.56:9003\" within 1s: context deadline exceeded" Apr 22 18:24:14.001087 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.001059 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tls-certs\") pod \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " Apr 22 18:24:14.001237 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.001100 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-cache\") pod \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " Apr 22 18:24:14.001237 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.001142 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-kserve-provision-location\") pod \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " Apr 22 18:24:14.001237 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.001161 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-uds\") pod \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\" (UID: 
\"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " Apr 22 18:24:14.001237 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.001182 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7rjn\" (UniqueName: \"kubernetes.io/projected/bda3bb2b-e64d-4eea-a698-201fb5ae9143-kube-api-access-k7rjn\") pod \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " Apr 22 18:24:14.001237 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.001203 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-tmp\") pod \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\" (UID: \"bda3bb2b-e64d-4eea-a698-201fb5ae9143\") " Apr 22 18:24:14.001514 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.001340 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "bda3bb2b-e64d-4eea-a698-201fb5ae9143" (UID: "bda3bb2b-e64d-4eea-a698-201fb5ae9143"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:24:14.001514 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.001426 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "bda3bb2b-e64d-4eea-a698-201fb5ae9143" (UID: "bda3bb2b-e64d-4eea-a698-201fb5ae9143"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:24:14.001622 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.001553 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-cache\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:24:14.001622 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.001574 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-uds\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:24:14.001733 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.001617 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "bda3bb2b-e64d-4eea-a698-201fb5ae9143" (UID: "bda3bb2b-e64d-4eea-a698-201fb5ae9143"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:24:14.001939 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.001919 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bda3bb2b-e64d-4eea-a698-201fb5ae9143" (UID: "bda3bb2b-e64d-4eea-a698-201fb5ae9143"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:24:14.003430 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.003401 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "bda3bb2b-e64d-4eea-a698-201fb5ae9143" (UID: "bda3bb2b-e64d-4eea-a698-201fb5ae9143"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:24:14.003543 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.003524 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda3bb2b-e64d-4eea-a698-201fb5ae9143-kube-api-access-k7rjn" (OuterVolumeSpecName: "kube-api-access-k7rjn") pod "bda3bb2b-e64d-4eea-a698-201fb5ae9143" (UID: "bda3bb2b-e64d-4eea-a698-201fb5ae9143"). InnerVolumeSpecName "kube-api-access-k7rjn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:24:14.035864 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.035829 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gwnwk/must-gather-sxsxm"] Apr 22 18:24:14.036227 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.036210 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" containerName="main" Apr 22 18:24:14.036308 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.036230 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" containerName="main" Apr 22 18:24:14.036308 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.036253 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" containerName="tokenizer" Apr 22 18:24:14.036308 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.036261 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" containerName="tokenizer" Apr 22 18:24:14.036308 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.036279 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" containerName="storage-initializer" Apr 22 18:24:14.036308 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.036289 2564 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" containerName="storage-initializer" Apr 22 18:24:14.036560 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.036372 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" containerName="main" Apr 22 18:24:14.036560 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.036388 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" containerName="tokenizer" Apr 22 18:24:14.040773 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.040718 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gwnwk/must-gather-sxsxm" Apr 22 18:24:14.043481 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.043460 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gwnwk\"/\"default-dockercfg-nvq2m\"" Apr 22 18:24:14.043591 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.043465 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gwnwk\"/\"openshift-service-ca.crt\"" Apr 22 18:24:14.043649 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.043612 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gwnwk\"/\"kube-root-ca.crt\"" Apr 22 18:24:14.049572 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.049553 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gwnwk/must-gather-sxsxm"] Apr 22 18:24:14.102562 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.102537 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfg4f\" (UniqueName: \"kubernetes.io/projected/aea5eb57-6900-4813-8c7a-01280b14c407-kube-api-access-nfg4f\") pod \"must-gather-sxsxm\" (UID: \"aea5eb57-6900-4813-8c7a-01280b14c407\") " 
pod="openshift-must-gather-gwnwk/must-gather-sxsxm" Apr 22 18:24:14.102722 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.102582 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aea5eb57-6900-4813-8c7a-01280b14c407-must-gather-output\") pod \"must-gather-sxsxm\" (UID: \"aea5eb57-6900-4813-8c7a-01280b14c407\") " pod="openshift-must-gather-gwnwk/must-gather-sxsxm" Apr 22 18:24:14.102805 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.102750 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-kserve-provision-location\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:24:14.102805 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.102773 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k7rjn\" (UniqueName: \"kubernetes.io/projected/bda3bb2b-e64d-4eea-a698-201fb5ae9143-kube-api-access-k7rjn\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:24:14.102805 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.102790 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tokenizer-tmp\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:24:14.102912 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.102806 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3bb2b-e64d-4eea-a698-201fb5ae9143-tls-certs\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:24:14.203605 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.203565 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/aea5eb57-6900-4813-8c7a-01280b14c407-must-gather-output\") pod \"must-gather-sxsxm\" (UID: \"aea5eb57-6900-4813-8c7a-01280b14c407\") " pod="openshift-must-gather-gwnwk/must-gather-sxsxm" Apr 22 18:24:14.203783 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.203696 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfg4f\" (UniqueName: \"kubernetes.io/projected/aea5eb57-6900-4813-8c7a-01280b14c407-kube-api-access-nfg4f\") pod \"must-gather-sxsxm\" (UID: \"aea5eb57-6900-4813-8c7a-01280b14c407\") " pod="openshift-must-gather-gwnwk/must-gather-sxsxm" Apr 22 18:24:14.203891 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.203873 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aea5eb57-6900-4813-8c7a-01280b14c407-must-gather-output\") pod \"must-gather-sxsxm\" (UID: \"aea5eb57-6900-4813-8c7a-01280b14c407\") " pod="openshift-must-gather-gwnwk/must-gather-sxsxm" Apr 22 18:24:14.211704 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.211661 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfg4f\" (UniqueName: \"kubernetes.io/projected/aea5eb57-6900-4813-8c7a-01280b14c407-kube-api-access-nfg4f\") pod \"must-gather-sxsxm\" (UID: \"aea5eb57-6900-4813-8c7a-01280b14c407\") " pod="openshift-must-gather-gwnwk/must-gather-sxsxm" Apr 22 18:24:14.360280 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.360212 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gwnwk/must-gather-sxsxm" Apr 22 18:24:14.488662 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.488638 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gwnwk/must-gather-sxsxm"] Apr 22 18:24:14.489933 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:24:14.489908 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaea5eb57_6900_4813_8c7a_01280b14c407.slice/crio-536a2c363d24aa19a0005fa58668b51a820fe7235fac21ac3e105ba3fb20a5b1 WatchSource:0}: Error finding container 536a2c363d24aa19a0005fa58668b51a820fe7235fac21ac3e105ba3fb20a5b1: Status 404 returned error can't find the container with id 536a2c363d24aa19a0005fa58668b51a820fe7235fac21ac3e105ba3fb20a5b1 Apr 22 18:24:14.491635 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.491620 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:24:14.512365 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.512337 2564 generic.go:358] "Generic (PLEG): container finished" podID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" containerID="520495a8f84d69b07be7b091cd88bc2febaf6e5bd5354da593ae33bbae3ee2fb" exitCode=0 Apr 22 18:24:14.512473 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.512413 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" Apr 22 18:24:14.512527 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.512414 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" event={"ID":"bda3bb2b-e64d-4eea-a698-201fb5ae9143","Type":"ContainerDied","Data":"520495a8f84d69b07be7b091cd88bc2febaf6e5bd5354da593ae33bbae3ee2fb"} Apr 22 18:24:14.512527 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.512511 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl" event={"ID":"bda3bb2b-e64d-4eea-a698-201fb5ae9143","Type":"ContainerDied","Data":"269d95132a94be59d971d9319b67e880f8d2df0176b3fb8deee24ca41efd3968"} Apr 22 18:24:14.512613 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.512530 2564 scope.go:117] "RemoveContainer" containerID="520495a8f84d69b07be7b091cd88bc2febaf6e5bd5354da593ae33bbae3ee2fb" Apr 22 18:24:14.513455 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.513430 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gwnwk/must-gather-sxsxm" event={"ID":"aea5eb57-6900-4813-8c7a-01280b14c407","Type":"ContainerStarted","Data":"536a2c363d24aa19a0005fa58668b51a820fe7235fac21ac3e105ba3fb20a5b1"} Apr 22 18:24:14.521374 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.521358 2564 scope.go:117] "RemoveContainer" containerID="06611293121db5ea88e4d4b0ca056c9c1d4402884cc685bdf3734e233d0bd94a" Apr 22 18:24:14.528533 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.528515 2564 scope.go:117] "RemoveContainer" containerID="162847e292dd4e4acd33431e7634a56ced5f34231045c32661ea531be21340f0" Apr 22 18:24:14.535767 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.535744 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl"] Apr 22 18:24:14.536306 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.536285 2564 scope.go:117] "RemoveContainer" containerID="520495a8f84d69b07be7b091cd88bc2febaf6e5bd5354da593ae33bbae3ee2fb" Apr 22 18:24:14.536559 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:24:14.536535 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"520495a8f84d69b07be7b091cd88bc2febaf6e5bd5354da593ae33bbae3ee2fb\": container with ID starting with 520495a8f84d69b07be7b091cd88bc2febaf6e5bd5354da593ae33bbae3ee2fb not found: ID does not exist" containerID="520495a8f84d69b07be7b091cd88bc2febaf6e5bd5354da593ae33bbae3ee2fb" Apr 22 18:24:14.536710 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.536572 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"520495a8f84d69b07be7b091cd88bc2febaf6e5bd5354da593ae33bbae3ee2fb"} err="failed to get container status \"520495a8f84d69b07be7b091cd88bc2febaf6e5bd5354da593ae33bbae3ee2fb\": rpc error: code = NotFound desc = could not find container \"520495a8f84d69b07be7b091cd88bc2febaf6e5bd5354da593ae33bbae3ee2fb\": container with ID starting with 520495a8f84d69b07be7b091cd88bc2febaf6e5bd5354da593ae33bbae3ee2fb not found: ID does not exist" Apr 22 18:24:14.536710 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.536594 2564 scope.go:117] "RemoveContainer" containerID="06611293121db5ea88e4d4b0ca056c9c1d4402884cc685bdf3734e233d0bd94a" Apr 22 18:24:14.536937 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:24:14.536906 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06611293121db5ea88e4d4b0ca056c9c1d4402884cc685bdf3734e233d0bd94a\": container with ID starting with 06611293121db5ea88e4d4b0ca056c9c1d4402884cc685bdf3734e233d0bd94a not found: ID does not exist" 
containerID="06611293121db5ea88e4d4b0ca056c9c1d4402884cc685bdf3734e233d0bd94a" Apr 22 18:24:14.537221 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.536946 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06611293121db5ea88e4d4b0ca056c9c1d4402884cc685bdf3734e233d0bd94a"} err="failed to get container status \"06611293121db5ea88e4d4b0ca056c9c1d4402884cc685bdf3734e233d0bd94a\": rpc error: code = NotFound desc = could not find container \"06611293121db5ea88e4d4b0ca056c9c1d4402884cc685bdf3734e233d0bd94a\": container with ID starting with 06611293121db5ea88e4d4b0ca056c9c1d4402884cc685bdf3734e233d0bd94a not found: ID does not exist" Apr 22 18:24:14.537221 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.536969 2564 scope.go:117] "RemoveContainer" containerID="162847e292dd4e4acd33431e7634a56ced5f34231045c32661ea531be21340f0" Apr 22 18:24:14.537375 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:24:14.537242 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"162847e292dd4e4acd33431e7634a56ced5f34231045c32661ea531be21340f0\": container with ID starting with 162847e292dd4e4acd33431e7634a56ced5f34231045c32661ea531be21340f0 not found: ID does not exist" containerID="162847e292dd4e4acd33431e7634a56ced5f34231045c32661ea531be21340f0" Apr 22 18:24:14.537375 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.537273 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"162847e292dd4e4acd33431e7634a56ced5f34231045c32661ea531be21340f0"} err="failed to get container status \"162847e292dd4e4acd33431e7634a56ced5f34231045c32661ea531be21340f0\": rpc error: code = NotFound desc = could not find container \"162847e292dd4e4acd33431e7634a56ced5f34231045c32661ea531be21340f0\": container with ID starting with 162847e292dd4e4acd33431e7634a56ced5f34231045c32661ea531be21340f0 not found: ID does not exist" Apr 22 
18:24:14.539382 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.539362 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-84c7db649wm7hl"] Apr 22 18:24:14.554376 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:14.554351 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bda3bb2b-e64d-4eea-a698-201fb5ae9143" path="/var/lib/kubelet/pods/bda3bb2b-e64d-4eea-a698-201fb5ae9143/volumes" Apr 22 18:24:18.539278 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:18.539204 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gwnwk/must-gather-sxsxm" event={"ID":"aea5eb57-6900-4813-8c7a-01280b14c407","Type":"ContainerStarted","Data":"cc4c230bf30ca17db8b1300223f8870dc9d0ddf1a9385321a98fe850b488c2b7"} Apr 22 18:24:18.539278 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:18.539255 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gwnwk/must-gather-sxsxm" event={"ID":"aea5eb57-6900-4813-8c7a-01280b14c407","Type":"ContainerStarted","Data":"3e3f268311317b0cec5f23945c79621e0ca84b36284b5b47946c5321a92525cd"} Apr 22 18:24:18.555746 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:18.555697 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gwnwk/must-gather-sxsxm" podStartSLOduration=0.76483944 podStartE2EDuration="4.555662919s" podCreationTimestamp="2026-04-22 18:24:14 +0000 UTC" firstStartedPulling="2026-04-22 18:24:14.491750581 +0000 UTC m=+1874.550653701" lastFinishedPulling="2026-04-22 18:24:18.282574057 +0000 UTC m=+1878.341477180" observedRunningTime="2026-04-22 18:24:18.553518298 +0000 UTC m=+1878.612421440" watchObservedRunningTime="2026-04-22 18:24:18.555662919 +0000 UTC m=+1878.614566063" Apr 22 18:24:27.829166 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:27.829135 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-bnxcd_576b68b8-c039-4804-b52d-0677069a2ec0/istio-proxy/0.log" Apr 22 18:24:28.925900 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:28.925868 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-bnxcd_576b68b8-c039-4804-b52d-0677069a2ec0/istio-proxy/0.log" Apr 22 18:24:29.981703 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:29.981652 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-bnxcd_576b68b8-c039-4804-b52d-0677069a2ec0/istio-proxy/0.log" Apr 22 18:24:30.986407 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:30.986369 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-bnxcd_576b68b8-c039-4804-b52d-0677069a2ec0/istio-proxy/0.log" Apr 22 18:24:31.990332 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:31.990300 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-bnxcd_576b68b8-c039-4804-b52d-0677069a2ec0/istio-proxy/0.log" Apr 22 18:24:32.968711 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:32.968662 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-bnxcd_576b68b8-c039-4804-b52d-0677069a2ec0/istio-proxy/0.log" Apr 22 18:24:33.983783 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:33.983752 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-bnxcd_576b68b8-c039-4804-b52d-0677069a2ec0/istio-proxy/0.log" Apr 22 18:24:35.026100 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:35.026065 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-bnxcd_576b68b8-c039-4804-b52d-0677069a2ec0/istio-proxy/0.log" Apr 22 18:24:36.136586 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:36.136539 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-bnxcd_576b68b8-c039-4804-b52d-0677069a2ec0/istio-proxy/0.log" Apr 22 18:24:37.240072 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:37.240042 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-bnxcd_576b68b8-c039-4804-b52d-0677069a2ec0/istio-proxy/0.log" Apr 22 18:24:38.373788 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:38.373751 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-bnxcd_576b68b8-c039-4804-b52d-0677069a2ec0/istio-proxy/0.log" Apr 22 18:24:39.450295 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:39.450264 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-bnxcd_576b68b8-c039-4804-b52d-0677069a2ec0/istio-proxy/0.log" Apr 22 18:24:40.571887 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:40.571856 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-bnxcd_576b68b8-c039-4804-b52d-0677069a2ec0/istio-proxy/0.log" Apr 22 18:24:41.635180 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:41.635146 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-bnxcd_576b68b8-c039-4804-b52d-0677069a2ec0/istio-proxy/0.log" Apr 22 18:24:42.767880 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:42.767819 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-8gm7x_a955e293-a029-4466-b738-2f13eb571d4b/discovery/0.log" Apr 22 18:24:42.788491 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:42.788453 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc_cf03a492-f3f9-4b00-b992-f7ad5d885fea/istio-proxy/0.log" Apr 22 18:24:43.656461 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:43.656430 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-8gm7x_a955e293-a029-4466-b738-2f13eb571d4b/discovery/0.log" Apr 22 18:24:43.675009 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:43.674973 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc_cf03a492-f3f9-4b00-b992-f7ad5d885fea/istio-proxy/0.log" Apr 22 18:24:44.616351 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:44.616316 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-jvl99_f4c790fc-d1a0-4a75-86b4-64d57b85b767/manager/0.log" Apr 22 18:24:44.644782 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:44.644747 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-wt7bn_2c6f1aaa-3f4a-4629-af01-23439a1be786/manager/0.log" Apr 22 18:24:45.657221 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:45.657133 2564 generic.go:358] "Generic (PLEG): container finished" podID="aea5eb57-6900-4813-8c7a-01280b14c407" containerID="3e3f268311317b0cec5f23945c79621e0ca84b36284b5b47946c5321a92525cd" exitCode=0 Apr 22 18:24:45.657221 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:45.657207 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gwnwk/must-gather-sxsxm" 
event={"ID":"aea5eb57-6900-4813-8c7a-01280b14c407","Type":"ContainerDied","Data":"3e3f268311317b0cec5f23945c79621e0ca84b36284b5b47946c5321a92525cd"} Apr 22 18:24:45.657692 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:45.657555 2564 scope.go:117] "RemoveContainer" containerID="3e3f268311317b0cec5f23945c79621e0ca84b36284b5b47946c5321a92525cd" Apr 22 18:24:46.342997 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:46.342962 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gwnwk_must-gather-sxsxm_aea5eb57-6900-4813-8c7a-01280b14c407/gather/0.log" Apr 22 18:24:49.818400 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:49.818372 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9w5wv_a3db8c3e-749a-4bb4-b86c-667f4524c8fa/global-pull-secret-syncer/0.log" Apr 22 18:24:49.916598 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:49.916545 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2gj6v_5b6cdf0f-46d3-4ba2-8a30-8314baac3007/konnectivity-agent/0.log" Apr 22 18:24:50.000517 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:50.000491 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-219.ec2.internal_e8d95ccda1ded644c4c87a2ac89e475b/haproxy/0.log" Apr 22 18:24:51.850858 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:51.850814 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gwnwk/must-gather-sxsxm"] Apr 22 18:24:51.851454 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:51.851115 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-gwnwk/must-gather-sxsxm" podUID="aea5eb57-6900-4813-8c7a-01280b14c407" containerName="copy" containerID="cri-o://cc4c230bf30ca17db8b1300223f8870dc9d0ddf1a9385321a98fe850b488c2b7" gracePeriod=2 Apr 22 18:24:51.853289 ip-10-0-128-219 kubenswrapper[2564]: I0422 
18:24:51.853252 2564 status_manager.go:895] "Failed to get status for pod" podUID="aea5eb57-6900-4813-8c7a-01280b14c407" pod="openshift-must-gather-gwnwk/must-gather-sxsxm" err="pods \"must-gather-sxsxm\" is forbidden: User \"system:node:ip-10-0-128-219.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gwnwk\": no relationship found between node 'ip-10-0-128-219.ec2.internal' and this object" Apr 22 18:24:51.853530 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:51.853515 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gwnwk/must-gather-sxsxm"] Apr 22 18:24:52.089812 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.089783 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gwnwk_must-gather-sxsxm_aea5eb57-6900-4813-8c7a-01280b14c407/copy/0.log" Apr 22 18:24:52.090175 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.090160 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gwnwk/must-gather-sxsxm" Apr 22 18:24:52.092397 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.092368 2564 status_manager.go:895] "Failed to get status for pod" podUID="aea5eb57-6900-4813-8c7a-01280b14c407" pod="openshift-must-gather-gwnwk/must-gather-sxsxm" err="pods \"must-gather-sxsxm\" is forbidden: User \"system:node:ip-10-0-128-219.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gwnwk\": no relationship found between node 'ip-10-0-128-219.ec2.internal' and this object" Apr 22 18:24:52.139833 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.139742 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aea5eb57-6900-4813-8c7a-01280b14c407-must-gather-output\") pod \"aea5eb57-6900-4813-8c7a-01280b14c407\" (UID: \"aea5eb57-6900-4813-8c7a-01280b14c407\") " Apr 22 18:24:52.139833 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.139793 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfg4f\" (UniqueName: \"kubernetes.io/projected/aea5eb57-6900-4813-8c7a-01280b14c407-kube-api-access-nfg4f\") pod \"aea5eb57-6900-4813-8c7a-01280b14c407\" (UID: \"aea5eb57-6900-4813-8c7a-01280b14c407\") " Apr 22 18:24:52.142018 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.141983 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea5eb57-6900-4813-8c7a-01280b14c407-kube-api-access-nfg4f" (OuterVolumeSpecName: "kube-api-access-nfg4f") pod "aea5eb57-6900-4813-8c7a-01280b14c407" (UID: "aea5eb57-6900-4813-8c7a-01280b14c407"). InnerVolumeSpecName "kube-api-access-nfg4f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:24:52.145873 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.145847 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aea5eb57-6900-4813-8c7a-01280b14c407-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "aea5eb57-6900-4813-8c7a-01280b14c407" (UID: "aea5eb57-6900-4813-8c7a-01280b14c407"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:24:52.241076 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.241034 2564 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aea5eb57-6900-4813-8c7a-01280b14c407-must-gather-output\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:24:52.241076 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.241069 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nfg4f\" (UniqueName: \"kubernetes.io/projected/aea5eb57-6900-4813-8c7a-01280b14c407-kube-api-access-nfg4f\") on node \"ip-10-0-128-219.ec2.internal\" DevicePath \"\"" Apr 22 18:24:52.554200 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.554160 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea5eb57-6900-4813-8c7a-01280b14c407" path="/var/lib/kubelet/pods/aea5eb57-6900-4813-8c7a-01280b14c407/volumes" Apr 22 18:24:52.686693 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.686640 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gwnwk_must-gather-sxsxm_aea5eb57-6900-4813-8c7a-01280b14c407/copy/0.log" Apr 22 18:24:52.686973 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.686949 2564 generic.go:358] "Generic (PLEG): container finished" podID="aea5eb57-6900-4813-8c7a-01280b14c407" containerID="cc4c230bf30ca17db8b1300223f8870dc9d0ddf1a9385321a98fe850b488c2b7" exitCode=143 Apr 22 18:24:52.687037 
ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.687000 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gwnwk/must-gather-sxsxm" Apr 22 18:24:52.687089 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.687076 2564 scope.go:117] "RemoveContainer" containerID="cc4c230bf30ca17db8b1300223f8870dc9d0ddf1a9385321a98fe850b488c2b7" Apr 22 18:24:52.695152 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.695131 2564 scope.go:117] "RemoveContainer" containerID="3e3f268311317b0cec5f23945c79621e0ca84b36284b5b47946c5321a92525cd" Apr 22 18:24:52.709906 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.709887 2564 scope.go:117] "RemoveContainer" containerID="cc4c230bf30ca17db8b1300223f8870dc9d0ddf1a9385321a98fe850b488c2b7" Apr 22 18:24:52.710199 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:24:52.710178 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc4c230bf30ca17db8b1300223f8870dc9d0ddf1a9385321a98fe850b488c2b7\": container with ID starting with cc4c230bf30ca17db8b1300223f8870dc9d0ddf1a9385321a98fe850b488c2b7 not found: ID does not exist" containerID="cc4c230bf30ca17db8b1300223f8870dc9d0ddf1a9385321a98fe850b488c2b7" Apr 22 18:24:52.710306 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.710211 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc4c230bf30ca17db8b1300223f8870dc9d0ddf1a9385321a98fe850b488c2b7"} err="failed to get container status \"cc4c230bf30ca17db8b1300223f8870dc9d0ddf1a9385321a98fe850b488c2b7\": rpc error: code = NotFound desc = could not find container \"cc4c230bf30ca17db8b1300223f8870dc9d0ddf1a9385321a98fe850b488c2b7\": container with ID starting with cc4c230bf30ca17db8b1300223f8870dc9d0ddf1a9385321a98fe850b488c2b7 not found: ID does not exist" Apr 22 18:24:52.710306 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.710240 2564 scope.go:117] 
"RemoveContainer" containerID="3e3f268311317b0cec5f23945c79621e0ca84b36284b5b47946c5321a92525cd" Apr 22 18:24:52.710454 ip-10-0-128-219 kubenswrapper[2564]: E0422 18:24:52.710432 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e3f268311317b0cec5f23945c79621e0ca84b36284b5b47946c5321a92525cd\": container with ID starting with 3e3f268311317b0cec5f23945c79621e0ca84b36284b5b47946c5321a92525cd not found: ID does not exist" containerID="3e3f268311317b0cec5f23945c79621e0ca84b36284b5b47946c5321a92525cd" Apr 22 18:24:52.710495 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:52.710461 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e3f268311317b0cec5f23945c79621e0ca84b36284b5b47946c5321a92525cd"} err="failed to get container status \"3e3f268311317b0cec5f23945c79621e0ca84b36284b5b47946c5321a92525cd\": rpc error: code = NotFound desc = could not find container \"3e3f268311317b0cec5f23945c79621e0ca84b36284b5b47946c5321a92525cd\": container with ID starting with 3e3f268311317b0cec5f23945c79621e0ca84b36284b5b47946c5321a92525cd not found: ID does not exist" Apr 22 18:24:54.137597 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:54.137557 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-jvl99_f4c790fc-d1a0-4a75-86b4-64d57b85b767/manager/0.log" Apr 22 18:24:54.191257 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:54.191222 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-wt7bn_2c6f1aaa-3f4a-4629-af01-23439a1be786/manager/0.log" Apr 22 18:24:55.652188 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:55.652156 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-g4nn6_886952ac-1b2d-4421-a069-8ec990d20254/node-exporter/0.log" Apr 22 18:24:55.687623 
ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:55.687589 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-g4nn6_886952ac-1b2d-4421-a069-8ec990d20254/kube-rbac-proxy/0.log" Apr 22 18:24:55.713691 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:55.713646 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-g4nn6_886952ac-1b2d-4421-a069-8ec990d20254/init-textfile/0.log" Apr 22 18:24:56.057122 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:56.057086 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gn6mt_d1bc1717-c469-4566-bd31-fca0ae08a007/prometheus-operator/0.log" Apr 22 18:24:56.083810 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:56.083785 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gn6mt_d1bc1717-c469-4566-bd31-fca0ae08a007/kube-rbac-proxy/0.log" Apr 22 18:24:56.110888 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:56.110854 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-7nmdt_4ed46985-5f70-4cda-a833-af43d0d97e60/prometheus-operator-admission-webhook/0.log" Apr 22 18:24:57.507929 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:57.507899 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-gbrpj_65900b80-a3e4-4f4d-bb90-dc5cac183f53/networking-console-plugin/0.log" Apr 22 18:24:58.444721 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.444688 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55b9d5cdc8-6nn7g_157343db-7d66-43bf-b8d8-3d8e74b5c1fc/console/0.log" Apr 22 18:24:58.477744 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.477714 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_downloads-6bcc868b7-rb77d_61b29203-84c4-4fd8-a19e-9a8647316762/download-server/0.log" Apr 22 18:24:58.714191 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.714118 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n"] Apr 22 18:24:58.714600 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.714464 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aea5eb57-6900-4813-8c7a-01280b14c407" containerName="copy" Apr 22 18:24:58.714600 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.714476 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea5eb57-6900-4813-8c7a-01280b14c407" containerName="copy" Apr 22 18:24:58.714600 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.714495 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aea5eb57-6900-4813-8c7a-01280b14c407" containerName="gather" Apr 22 18:24:58.714600 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.714502 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea5eb57-6900-4813-8c7a-01280b14c407" containerName="gather" Apr 22 18:24:58.714600 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.714584 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="aea5eb57-6900-4813-8c7a-01280b14c407" containerName="gather" Apr 22 18:24:58.714600 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.714595 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="aea5eb57-6900-4813-8c7a-01280b14c407" containerName="copy" Apr 22 18:24:58.721410 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.721391 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.723942 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.723919 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kx68p\"/\"kube-root-ca.crt\"" Apr 22 18:24:58.724895 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.724876 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kx68p\"/\"default-dockercfg-c5qsr\"" Apr 22 18:24:58.724973 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.724912 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kx68p\"/\"openshift-service-ca.crt\"" Apr 22 18:24:58.726692 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.726655 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n"] Apr 22 18:24:58.794878 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.794851 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a23deeb7-69ec-433b-8fc3-306d23b6c244-proc\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.794996 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.794892 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a23deeb7-69ec-433b-8fc3-306d23b6c244-sys\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.794996 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.794908 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-6j79c\" (UniqueName: \"kubernetes.io/projected/a23deeb7-69ec-433b-8fc3-306d23b6c244-kube-api-access-6j79c\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.794996 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.794935 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a23deeb7-69ec-433b-8fc3-306d23b6c244-podres\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.794996 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.794957 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a23deeb7-69ec-433b-8fc3-306d23b6c244-lib-modules\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.896275 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.896240 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a23deeb7-69ec-433b-8fc3-306d23b6c244-lib-modules\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.896436 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.896340 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a23deeb7-69ec-433b-8fc3-306d23b6c244-proc\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " 
pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.896436 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.896383 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a23deeb7-69ec-433b-8fc3-306d23b6c244-sys\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.896436 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.896407 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a23deeb7-69ec-433b-8fc3-306d23b6c244-lib-modules\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.896436 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.896407 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j79c\" (UniqueName: \"kubernetes.io/projected/a23deeb7-69ec-433b-8fc3-306d23b6c244-kube-api-access-6j79c\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.896643 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.896451 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a23deeb7-69ec-433b-8fc3-306d23b6c244-proc\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.896643 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.896458 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/a23deeb7-69ec-433b-8fc3-306d23b6c244-podres\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.896643 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.896458 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a23deeb7-69ec-433b-8fc3-306d23b6c244-sys\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.896643 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.896544 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a23deeb7-69ec-433b-8fc3-306d23b6c244-podres\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:58.904961 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:58.904940 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j79c\" (UniqueName: \"kubernetes.io/projected/a23deeb7-69ec-433b-8fc3-306d23b6c244-kube-api-access-6j79c\") pod \"perf-node-gather-daemonset-mrc9n\" (UID: \"a23deeb7-69ec-433b-8fc3-306d23b6c244\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:59.032058 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:59.031978 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:59.152251 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:59.152225 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n"] Apr 22 18:24:59.153469 ip-10-0-128-219 kubenswrapper[2564]: W0422 18:24:59.153448 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda23deeb7_69ec_433b_8fc3_306d23b6c244.slice/crio-3050c59bf736f54ecffe1825eb5d52f7ac190ab575164b1dd74f0e5519ba0844 WatchSource:0}: Error finding container 3050c59bf736f54ecffe1825eb5d52f7ac190ab575164b1dd74f0e5519ba0844: Status 404 returned error can't find the container with id 3050c59bf736f54ecffe1825eb5d52f7ac190ab575164b1dd74f0e5519ba0844 Apr 22 18:24:59.677994 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:59.677970 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8825w_3a00fffd-ba82-45c7-b379-68e21fd2f1f1/dns/0.log" Apr 22 18:24:59.697004 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:59.696985 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8825w_3a00fffd-ba82-45c7-b379-68e21fd2f1f1/kube-rbac-proxy/0.log" Apr 22 18:24:59.716502 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:59.716476 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" event={"ID":"a23deeb7-69ec-433b-8fc3-306d23b6c244","Type":"ContainerStarted","Data":"d50ea2613e7fedb3c22f97f24916eac6571e50c771374d62888b9d8eb91b7c5a"} Apr 22 18:24:59.716502 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:59.716504 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" event={"ID":"a23deeb7-69ec-433b-8fc3-306d23b6c244","Type":"ContainerStarted","Data":"3050c59bf736f54ecffe1825eb5d52f7ac190ab575164b1dd74f0e5519ba0844"} Apr 
22 18:24:59.716911 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:59.716605 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" Apr 22 18:24:59.798339 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:24:59.798317 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h4787_8432b695-5ba0-4b5f-bf6e-aea43e93c1a0/dns-node-resolver/0.log" Apr 22 18:25:00.264808 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:00.264777 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6fbdf466c4-c4xwm_a31eed0e-7ff0-4e55-9e16-7fd5c607e632/registry/0.log" Apr 22 18:25:00.320946 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:00.320919 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zs8rm_dfd9fa90-7a02-4429-a3c2-c939fa96e48e/node-ca/0.log" Apr 22 18:25:00.697857 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:00.697813 2564 scope.go:117] "RemoveContainer" containerID="b0f5dd89817e97e4a5fce64064341ae3b2b1ff755a51ed3b6d16d33d66cd3900" Apr 22 18:25:00.705853 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:00.705837 2564 scope.go:117] "RemoveContainer" containerID="325f782abf9bdcbf01eefd733ae9fca101aba1e8947dd7cd87589e79cab3c8d1" Apr 22 18:25:00.713051 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:00.713034 2564 scope.go:117] "RemoveContainer" containerID="ee5e873b534b0503c36a6761f50d30654aa95db0981b5b5335ec9970c73207e3" Apr 22 18:25:01.163888 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:01.163811 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-8gm7x_a955e293-a029-4466-b738-2f13eb571d4b/discovery/0.log" Apr 22 18:25:01.186370 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:01.186337 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-wl5mc_cf03a492-f3f9-4b00-b992-f7ad5d885fea/istio-proxy/0.log"
Apr 22 18:25:01.660391 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:01.660364 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vzgzr_3400960b-c044-44c8-b84c-550071e3f93e/serve-healthcheck-canary/0.log"
Apr 22 18:25:02.172887 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:02.172861 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-clh2q_61075974-e581-432e-8332-a5b8e03775a9/kube-rbac-proxy/0.log"
Apr 22 18:25:02.193800 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:02.193765 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-clh2q_61075974-e581-432e-8332-a5b8e03775a9/exporter/0.log"
Apr 22 18:25:02.211984 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:02.211956 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-clh2q_61075974-e581-432e-8332-a5b8e03775a9/extractor/0.log"
Apr 22 18:25:05.400796 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:05.400765 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-2kmcl_97e86232-7d05-4c7d-8cda-94d1656d1324/server/0.log"
Apr 22 18:25:05.673932 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:05.673909 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-jd9vw_d0dd739a-946e-4fc7-b3a3-0cc2a0d5387a/seaweedfs/0.log"
Apr 22 18:25:05.730610 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:05.730585 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n"
Apr 22 18:25:05.753628 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:05.753586 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-mrc9n" podStartSLOduration=7.753572511 podStartE2EDuration="7.753572511s" podCreationTimestamp="2026-04-22 18:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:24:59.733487279 +0000 UTC m=+1919.792390423" watchObservedRunningTime="2026-04-22 18:25:05.753572511 +0000 UTC m=+1925.812475652"
Apr 22 18:25:10.143899 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:10.143868 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-hkbk9_de69c135-af81-4f57-8071-29c9454db61d/migrator/0.log"
Apr 22 18:25:10.161125 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:10.161104 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-hkbk9_de69c135-af81-4f57-8071-29c9454db61d/graceful-termination/0.log"
Apr 22 18:25:11.404359 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:11.404333 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7wtdq_221a3ce4-df39-49c4-9142-6acb37f99613/kube-multus/0.log"
Apr 22 18:25:11.440224 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:11.440201 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kz84r_a3fc6e2c-71c7-4da3-b348-d4a5b505f72a/kube-multus-additional-cni-plugins/0.log"
Apr 22 18:25:11.469641 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:11.469615 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kz84r_a3fc6e2c-71c7-4da3-b348-d4a5b505f72a/egress-router-binary-copy/0.log"
Apr 22 18:25:11.498966 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:11.498946 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kz84r_a3fc6e2c-71c7-4da3-b348-d4a5b505f72a/cni-plugins/0.log"
Apr 22 18:25:11.542216 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:11.542181 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kz84r_a3fc6e2c-71c7-4da3-b348-d4a5b505f72a/bond-cni-plugin/0.log"
Apr 22 18:25:11.584037 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:11.584015 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kz84r_a3fc6e2c-71c7-4da3-b348-d4a5b505f72a/routeoverride-cni/0.log"
Apr 22 18:25:11.621908 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:11.621889 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kz84r_a3fc6e2c-71c7-4da3-b348-d4a5b505f72a/whereabouts-cni-bincopy/0.log"
Apr 22 18:25:11.670058 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:11.670037 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kz84r_a3fc6e2c-71c7-4da3-b348-d4a5b505f72a/whereabouts-cni/0.log"
Apr 22 18:25:12.059460 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:12.059376 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-t6kpj_1273b1fd-25f6-4315-a692-c599fb3e48b7/network-metrics-daemon/0.log"
Apr 22 18:25:12.093245 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:12.093223 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-t6kpj_1273b1fd-25f6-4315-a692-c599fb3e48b7/kube-rbac-proxy/0.log"
Apr 22 18:25:13.027119 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:13.027094 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-controller/0.log"
Apr 22 18:25:13.042366 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:13.042333 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/0.log"
Apr 22 18:25:13.058879 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:13.058854 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovn-acl-logging/1.log"
Apr 22 18:25:13.077616 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:13.077589 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/kube-rbac-proxy-node/0.log"
Apr 22 18:25:13.095762 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:13.095741 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 18:25:13.112829 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:13.112800 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/northd/0.log"
Apr 22 18:25:13.131685 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:13.131648 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/nbdb/0.log"
Apr 22 18:25:13.153603 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:13.153581 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/sbdb/0.log"
Apr 22 18:25:13.324065 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:13.323976 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9vgh6_9d8dc6eb-7c99-4548-8b6d-fe9f31000478/ovnkube-controller/0.log"
Apr 22 18:25:15.015283 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:15.015256 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-kbkmw_3c596acd-7332-4aab-afbb-73b8773fb825/network-check-target-container/0.log"
Apr 22 18:25:15.960650 ip-10-0-128-219 kubenswrapper[2564]: I0422 18:25:15.960624 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-8znkx_ca34e7e4-d295-4bc4-adff-31d08074df10/iptables-alerter/0.log"