Apr 16 19:53:55.676289 ip-10-0-138-142 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:53:56.117723 ip-10-0-138-142 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:56.117723 ip-10-0-138-142 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:53:56.117723 ip-10-0-138-142 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:56.117723 ip-10-0-138-142 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:53:56.117723 ip-10-0-138-142 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:56.120336 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.120203    2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:53:56.123256 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.123240    2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:56.123295 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.123257    2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:56.123295 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.123262    2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:56.123295 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.123266    2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:56.123295 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.123269    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:56.123295 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.123272    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:56.123295 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.123275    2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:56.123295 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.123278    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:56.127074 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127065    2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:56.127074 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127074    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127077    2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127081    2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127084    2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127088    2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127091    2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127095    2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127097    2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127100    2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127105    2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127108    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127111    2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127119    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127122    2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127125    2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127128    2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127131    2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127133    2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127136    2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:56.127140 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127139    2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127141    2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127144    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127147    2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127150    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127153    2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127155    2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127158    2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127160    2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127163    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127165    2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127169    2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127172    2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127174    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127177    2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127180    2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127182    2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127185    2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127187    2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127190    2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:56.127587 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127192    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127195    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127197    2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127200    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127202    2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127205    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127208    2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127211    2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127214    2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127218    2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127220    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127222    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127225    2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127227    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127231    2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127234    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127237    2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127240    2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127243    2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127245    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:56.128103 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127254    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127257    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127260    2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127262    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127265    2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127267    2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127270    2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127272    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127275    2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127277    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127280    2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127283    2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127285    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127288    2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127290    2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127292    2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127295    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127297    2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127724    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127729    2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:56.128615 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127732    2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127735    2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127737    2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127740    2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127743    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127746    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127749    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127752    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127755    2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127757    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127760    2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127763    2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127766    2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127769    2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127771    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127774    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127776    2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127779    2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127781    2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:56.129114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127784    2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127787    2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127789    2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127791    2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127794    2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127796    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127799    2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127802    2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127804    2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127806    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127809    2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127812    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127815    2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127817    2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127820    2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127822    2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127825    2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127827    2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127830    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127833    2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:56.129576 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127836    2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127838    2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127841    2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127843    2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127846    2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127848    2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127851    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127854    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127856    2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127859    2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127861    2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127864    2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127868    2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127872    2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127876    2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127880    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127883    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127885    2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127888    2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:56.130090 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127890    2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127893    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127895    2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127898    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127901    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127904    2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127906    2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127909    2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127912    2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127914    2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127917    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127919    2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127922    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127925    2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127927    2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127929    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127933    2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127935    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127938    2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127940    2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:56.130549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127945    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127948    2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127950    2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127953    2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127956    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.127958    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129223    2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129235    2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129242    2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129247    2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129252    2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129255    2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129260    2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129264    2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129267    2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129270    2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129274    2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129277    2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129281    2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129284    2569 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129287    2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129290    2569 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129294    2569 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:53:56.131049 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129297    2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129300    2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129305    2569 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129308    2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129311    2569 flags.go:64] FLAG: --config-dir=""
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129314    2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129318    2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129322    2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129326    2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129329    2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129333    2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129336    2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129339    2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129342    2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129345    2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129348    2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129353    2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129360    2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129363    2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129366    2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129369    2569 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129372    2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129376    2569 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129380    2569 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129383    2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:53:56.131601 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129386    2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129389    2569 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129393    2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129397    2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129400    2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129403    2569 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129406    2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129409    2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129412    2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129415    2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129418    2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129421    2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129424    2569 flags.go:64] FLAG: --feature-gates=""
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129428    2569 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129430    2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129434    2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129453    2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129457    2569 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129461    2569 flags.go:64] FLAG: --help="false"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129464    2569 flags.go:64] FLAG: --hostname-override="ip-10-0-138-142.ec2.internal"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129467    2569 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129470    2569 flags.go:64]
FLAG: --http-check-frequency="20s" Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129473 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129477 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:53:56.132232 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129482 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129485 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129488 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129490 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129493 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129496 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129507 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129511 2569 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129515 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129518 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129521 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:53:56.132812 ip-10-0-138-142 
kubenswrapper[2569]: I0416 19:53:56.129524 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129527 2569 flags.go:64] FLAG: --lock-file="" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129529 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129532 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129536 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129542 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129544 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129547 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129550 2569 flags.go:64] FLAG: --logging-format="text" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129553 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129556 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129559 2569 flags.go:64] FLAG: --manifest-url="" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129562 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129566 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:53:56.132812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129569 2569 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129574 2569 flags.go:64] FLAG: --max-pods="110" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129578 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129581 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129584 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129587 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129590 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129594 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129597 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129604 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129607 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129610 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129614 2569 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129616 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:53:56.133463 ip-10-0-138-142 
kubenswrapper[2569]: I0416 19:53:56.129622 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129625 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129628 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129631 2569 flags.go:64] FLAG: --port="10250" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129634 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129637 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f91268c4793f4ed7" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129640 2569 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129643 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129646 2569 flags.go:64] FLAG: --register-node="true" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129649 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:53:56.133463 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129651 2569 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129655 2569 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129658 2569 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129661 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129664 2569 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 
19:53:56.129668 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129671 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129674 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129677 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129680 2569 flags.go:64] FLAG: --runonce="false" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129683 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129686 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129689 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129692 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129695 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129698 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129701 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129704 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129707 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129710 2569 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129713 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129716 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129719 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129722 2569 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129724 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:53:56.134129 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129730 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129733 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129736 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129740 2569 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129743 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129746 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129748 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129751 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129754 2569 flags.go:64] FLAG: --v="2" Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: I0416 
19:53:56.129759 2569 flags.go:64] FLAG: --version="false" Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129763 2569 flags.go:64] FLAG: --vmodule="" Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129768 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.129771 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129869 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129875 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129878 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129881 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129884 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129887 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129890 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129893 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:53:56.134728 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129895 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:56.134728 ip-10-0-138-142 
kubenswrapper[2569]: W0416 19:53:56.129898 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129901 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129904 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129906 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129909 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129911 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129914 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129916 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129919 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129921 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129924 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129927 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129929 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 
19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129932 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129936 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129940 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129943 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129946 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129948 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129951 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:53:56.135278 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129953 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129956 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129958 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129962 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129965 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:53:56.135858 
ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129967 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129970 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129972 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129975 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129978 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129980 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129983 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129985 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129988 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129990 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129993 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129996 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.129998 2569 feature_gate.go:328] unrecognized feature gate: 
ShortCertRotation Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130000 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130003 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:53:56.135858 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130006 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130020 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130023 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130026 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130028 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130031 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130033 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130036 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130039 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130041 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 
19:53:56.130044 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130046 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130049 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130052 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130054 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130058 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130061 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130063 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130066 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130068 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:53:56.136783 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130071 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130074 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130077 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 
19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130083 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130086 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130089 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130092 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130095 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130097 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130100 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130103 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130105 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130108 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130111 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130113 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 
19:53:56.130116 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:56.137722 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.130119 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.130124 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.137918 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.137939 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138048 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138057 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138062 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138068 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138073 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138078 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138083 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138087 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138091 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138095 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138099 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:56.138454 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138103 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138108 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138112 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138117 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138121 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138125 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138129 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138134 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138146 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138152 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138156 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138160 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138164 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138169 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138173 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138177 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138181 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138185 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138189 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138193 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:56.139141 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138207 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138212 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138216 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138221 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138225 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138229 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138233 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138237 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138241 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138245 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138249 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138254 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138258 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138262 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138266 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138270 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138275 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138278 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138282 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138286 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:56.139702 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138290 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138294 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138301 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138308 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138313 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138318 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138322 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138326 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138330 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138334 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138339 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138343 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138348 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138360 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138365 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138371 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138377 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138382 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138386 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:56.140222 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138390 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138394 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138398 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138402 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138406 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138410 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138414 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138419 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138423 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138427 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138431 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138435 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138440 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138444 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138448 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138452 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:56.140755 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.138459 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138656 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138664 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138669 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138674 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138679 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138683 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138688 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138693 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138698 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138702 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138715 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138720 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138724 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138728 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138732 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138736 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138741 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138745 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138749 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:56.141444 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138753 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138757 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138761 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138765 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138769 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138773 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138777 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138781 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138785 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138789 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138793 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138797 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138801 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138805 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138809 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138813 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138818 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138823 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138827 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:56.142311 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138831 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138836 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138840 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138845 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138849 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138860 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138864 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138868 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138873 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138877 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138881 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138885 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138891 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138897 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138902 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138906 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138910 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138914 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138918 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:56.142848 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138922 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138926 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138930 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138934 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138938 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138945 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138951 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138955 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138960 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138964 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138969 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138973 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138977 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138981 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138985 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138989 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138993 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.138997 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.139001 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.139034 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:56.143340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.139039 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:56.143823 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.139043 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:56.143823 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.139047 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:56.143823 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.139051 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:56.143823 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.139055 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:56.143823 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.139059 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:56.143823 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.139063 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:56.143823 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.139067 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:56.143823 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:56.139071 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:56.143823 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.139079 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:56.143823 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.140240 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 19:53:56.144903 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.144887 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 19:53:56.146037 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.146023 2569 server.go:1019] "Starting client certificate rotation"
Apr 16 19:53:56.146142 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.146121 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:53:56.147028 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.147006 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:53:56.173637 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.173611 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:53:56.181384 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.181357 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:53:56.196495 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.196476 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 16 19:53:56.202469 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.202452 2569 log.go:25] "Validated CRI v1 image API"
Apr 16 19:53:56.203629 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.203613 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 19:53:56.206671 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.206652 2569 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 851a99cd-0dd3-4e7d-bb13-722f2ed055df:/dev/nvme0n1p4 ed272d45-63b0-4c5c-a27e-e4190b80fec5:/dev/nvme0n1p3]
Apr 16 19:53:56.206730 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.206671 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 19:53:56.212321 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.212306 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:53:56.212746 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.212644 2569 manager.go:217] Machine: {Timestamp:2026-04-16 19:53:56.210733781 +0000 UTC m=+0.411443850 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3138620 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29e7a8cfd7df8b1953ec87c9c3c262 SystemUUID:ec29e7a8-cfd7-df8b-1953-ec87c9c3c262 BootID:20508c15-5fbb-4002-8072-9282d4ce136f Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d2:c3:a6:4f:3d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d2:c3:a6:4f:3d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9a:be:16:a3:8c:a1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 19:53:56.212746 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.212742 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 19:53:56.212835 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.212828 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 19:53:56.213879 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.213858 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 19:53:56.214040 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.213881 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-142.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 19:53:56.214088 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.214049 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 19:53:56.214088 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.214058 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 19:53:56.214088 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.214071 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:53:56.215000 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.214990 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:53:56.215759 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.215750 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:53:56.215867 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.215858 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 19:53:56.218664 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.218654 2569 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 19:53:56.218701 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.218673 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 19:53:56.218701 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.218685 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 19:53:56.218701 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.218694 2569 kubelet.go:397] "Adding apiserver pod source"
Apr 16 19:53:56.218803 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.218703 2569 apiserver.go:42] "Waiting for node sync
before watching apiserver pods" Apr 16 19:53:56.220386 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.220374 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:53:56.220423 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.220398 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:53:56.223804 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.223789 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 19:53:56.226810 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.226390 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:53:56.228944 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.228928 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:53:56.229002 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.228947 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:53:56.229002 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.228954 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:53:56.229002 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.228959 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:53:56.229002 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.228965 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:53:56.229002 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.228971 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 19:53:56.229002 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.228976 2569 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 16 19:53:56.229002 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.228983 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:53:56.229002 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.228989 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:53:56.229002 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.228996 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:53:56.229253 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.229026 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 19:53:56.229253 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.229036 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:53:56.229789 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.229775 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:53:56.229789 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.229787 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 19:53:56.230415 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.230390 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:53:56.230467 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.230429 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:53:56.231821 ip-10-0-138-142 kubenswrapper[2569]: 
I0416 19:53:56.231804 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:53:56.233184 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.233172 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 19:53:56.233239 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.233210 2569 server.go:1295] "Started kubelet" Apr 16 19:53:56.233318 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.233278 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 19:53:56.233404 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.233368 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:53:56.233456 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.233424 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:53:56.233969 ip-10-0-138-142 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 19:53:56.234529 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.234480 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 19:53:56.235284 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.235267 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 16 19:53:56.241524 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.241507 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w7k76" Apr 16 19:53:56.242138 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.242118 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 19:53:56.242596 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.242579 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 19:53:56.243434 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.243411 2569 factory.go:55] Registering systemd factory Apr 16 19:53:56.243434 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.243431 2569 factory.go:223] Registration of the systemd container factory successfully Apr 16 19:53:56.243583 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.243567 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 19:53:56.243751 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.243727 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 16 19:53:56.243818 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.243691 2569 factory.go:153] Registering CRI-O factory Apr 16 19:53:56.243818 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.243766 2569 factory.go:223] Registration of the crio container factory successfully Apr 16 19:53:56.243905 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.243819 2569 factory.go:221] Registration of the containerd 
container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 19:53:56.243905 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.243839 2569 factory.go:103] Registering Raw factory Apr 16 19:53:56.243905 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.243852 2569 manager.go:1196] Started watching for new ooms in manager Apr 16 19:53:56.244066 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.243598 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 19:53:56.244066 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.243736 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 19:53:56.244158 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.244076 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 16 19:53:56.244158 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.244086 2569 reconciler.go:26] "Reconciler: start to sync state" Apr 16 19:53:56.244384 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.244372 2569 manager.go:319] Starting recovery of all containers Apr 16 19:53:56.244885 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.244864 2569 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 19:53:56.247996 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.247968 2569 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 19:53:56.248095 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.248066 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 19:53:56.249150 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.248195 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-142.ec2.internal.18a6ee68a39c62ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-142.ec2.internal,UID:ip-10-0-138-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-142.ec2.internal,},FirstTimestamp:2026-04-16 19:53:56.233183917 +0000 UTC m=+0.433893986,LastTimestamp:2026-04-16 19:53:56.233183917 +0000 UTC m=+0.433893986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-142.ec2.internal,}" Apr 16 19:53:56.250877 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.250850 2569 
csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w7k76" Apr 16 19:53:56.254407 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.254273 2569 manager.go:324] Recovery completed Apr 16 19:53:56.259469 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.259457 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:56.262238 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.262219 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:56.262298 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.262261 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:56.262298 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.262278 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:56.262743 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.262726 2569 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 19:53:56.262743 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.262743 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 19:53:56.262862 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.262760 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:53:56.264422 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.264358 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-142.ec2.internal.18a6ee68a557c1a2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-142.ec2.internal,UID:ip-10-0-138-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-142.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-142.ec2.internal,},FirstTimestamp:2026-04-16 19:53:56.262240674 +0000 UTC m=+0.462950762,LastTimestamp:2026-04-16 19:53:56.262240674 +0000 UTC m=+0.462950762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-142.ec2.internal,}" Apr 16 19:53:56.265366 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.265352 2569 policy_none.go:49] "None policy: Start" Apr 16 19:53:56.265431 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.265371 2569 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 19:53:56.265846 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.265834 2569 state_mem.go:35] "Initializing new in-memory state store" Apr 16 19:53:56.303548 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.303534 2569 manager.go:341] "Starting Device Plugin manager" Apr 16 19:53:56.316578 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.303569 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 19:53:56.316578 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.303581 2569 server.go:85] "Starting device plugin registration server" Apr 16 19:53:56.316578 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.303803 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 19:53:56.316578 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.303816 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 19:53:56.316578 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.303936 
2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 19:53:56.316578 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.304069 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 19:53:56.316578 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.304081 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 19:53:56.316578 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.304459 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 19:53:56.316578 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.304492 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-142.ec2.internal\" not found" Apr 16 19:53:56.365299 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.365262 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 19:53:56.366468 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.366448 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 19:53:56.366468 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.366470 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 19:53:56.366604 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.366486 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 19:53:56.366604 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.366494 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 19:53:56.366604 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.366522 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 19:53:56.369455 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.369409 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:56.403919 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.403896 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:56.404732 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.404714 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:56.404820 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.404743 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:56.404820 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.404753 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:56.404820 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.404776 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-142.ec2.internal" Apr 16 19:53:56.412713 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.412698 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-142.ec2.internal" Apr 16 19:53:56.412764 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.412721 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-142.ec2.internal\": node \"ip-10-0-138-142.ec2.internal\" not found" Apr 16 
19:53:56.428634 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.428614 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 16 19:53:56.466749 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.466719 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal"] Apr 16 19:53:56.466829 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.466790 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:56.467834 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.467810 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:56.467883 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.467837 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:56.467941 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.467883 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:56.469249 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.469236 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:56.469394 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.469375 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 16 19:53:56.469430 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.469411 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:56.471357 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.471335 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:56.471357 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.471352 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:56.471458 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.471361 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:56.471458 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.471373 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:56.471458 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.471386 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:56.471458 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.471374 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:56.472640 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.472625 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" Apr 16 19:53:56.472727 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.472650 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:56.473352 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.473335 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:56.473442 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.473372 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:56.473442 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.473391 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:56.509732 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.509708 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-142.ec2.internal\" not found" node="ip-10-0-138-142.ec2.internal" Apr 16 19:53:56.513914 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.513897 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-142.ec2.internal\" not found" node="ip-10-0-138-142.ec2.internal" Apr 16 19:53:56.528733 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.528715 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 16 19:53:56.629204 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.629121 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 16 19:53:56.645477 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.645453 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1427eed4f2cd472349fa20b6f1cf215c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal\" (UID: \"1427eed4f2cd472349fa20b6f1cf215c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 16 19:53:56.645583 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.645481 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1427eed4f2cd472349fa20b6f1cf215c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal\" (UID: \"1427eed4f2cd472349fa20b6f1cf215c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 16 19:53:56.645583 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.645507 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/027943e939a2d76cdb600f777d89968b-config\") pod \"kube-apiserver-proxy-ip-10-0-138-142.ec2.internal\" (UID: \"027943e939a2d76cdb600f777d89968b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" Apr 16 19:53:56.729898 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.729863 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 16 19:53:56.746262 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.746237 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1427eed4f2cd472349fa20b6f1cf215c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal\" (UID: \"1427eed4f2cd472349fa20b6f1cf215c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 16 
19:53:56.746366 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.746269 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1427eed4f2cd472349fa20b6f1cf215c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal\" (UID: \"1427eed4f2cd472349fa20b6f1cf215c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 16 19:53:56.746366 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.746295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/027943e939a2d76cdb600f777d89968b-config\") pod \"kube-apiserver-proxy-ip-10-0-138-142.ec2.internal\" (UID: \"027943e939a2d76cdb600f777d89968b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" Apr 16 19:53:56.746366 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.746325 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1427eed4f2cd472349fa20b6f1cf215c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal\" (UID: \"1427eed4f2cd472349fa20b6f1cf215c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 16 19:53:56.746366 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.746340 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1427eed4f2cd472349fa20b6f1cf215c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal\" (UID: \"1427eed4f2cd472349fa20b6f1cf215c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 16 19:53:56.746505 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.746326 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/027943e939a2d76cdb600f777d89968b-config\") pod \"kube-apiserver-proxy-ip-10-0-138-142.ec2.internal\" (UID: \"027943e939a2d76cdb600f777d89968b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal"
Apr 16 19:53:56.811397 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.811360 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal"
Apr 16 19:53:56.817109 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:56.817092 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal"
Apr 16 19:53:56.830812 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.830790 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found"
Apr 16 19:53:56.931396 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:56.931317 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found"
Apr 16 19:53:57.031867 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:57.031837 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found"
Apr 16 19:53:57.132350 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:57.132319 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found"
Apr 16 19:53:57.146724 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.146701 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 19:53:57.146868 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.146847 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:53:57.158817 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.158795 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:57.233272 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:57.233248 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found"
Apr 16 19:53:57.242299 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.242275 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 19:53:57.253229 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.253203 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:48:56 +0000 UTC" deadline="2027-10-25 00:58:23.847094985 +0000 UTC"
Apr 16 19:53:57.253229 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.253227 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13349h4m26.593871222s"
Apr 16 19:53:57.257729 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.257711 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:53:57.292205 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.292176 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-npqpc"
Apr 16 19:53:57.303122 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.303102 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-npqpc"
Apr 16 19:53:57.305454 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:57.305431 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod027943e939a2d76cdb600f777d89968b.slice/crio-9ca13a89ea6c7f707c65165e096ae20732209a88325025ae74cb15b9e0c6a999 WatchSource:0}: Error finding container 9ca13a89ea6c7f707c65165e096ae20732209a88325025ae74cb15b9e0c6a999: Status 404 returned error can't find the container with id 9ca13a89ea6c7f707c65165e096ae20732209a88325025ae74cb15b9e0c6a999
Apr 16 19:53:57.305976 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:57.305938 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1427eed4f2cd472349fa20b6f1cf215c.slice/crio-8c6ab8da3cbff4b35636947f0da96cfaee18a38d40c6ea14adb5e87391ec4c67 WatchSource:0}: Error finding container 8c6ab8da3cbff4b35636947f0da96cfaee18a38d40c6ea14adb5e87391ec4c67: Status 404 returned error can't find the container with id 8c6ab8da3cbff4b35636947f0da96cfaee18a38d40c6ea14adb5e87391ec4c67
Apr 16 19:53:57.312093 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.312080 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:53:57.333451 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:57.333431 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found"
Apr 16 19:53:57.369631 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.369591 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" event={"ID":"1427eed4f2cd472349fa20b6f1cf215c","Type":"ContainerStarted","Data":"8c6ab8da3cbff4b35636947f0da96cfaee18a38d40c6ea14adb5e87391ec4c67"}
Apr 16 19:53:57.370462 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.370444 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" event={"ID":"027943e939a2d76cdb600f777d89968b","Type":"ContainerStarted","Data":"9ca13a89ea6c7f707c65165e096ae20732209a88325025ae74cb15b9e0c6a999"}
Apr 16 19:53:57.433568 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:57.433546 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found"
Apr 16 19:53:57.534124 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:57.534048 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found"
Apr 16 19:53:57.634609 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:57.634572 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found"
Apr 16 19:53:57.655326 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.655300 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:57.686868 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.686838 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:57.743535 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.743499 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal"
Apr 16 19:53:57.756422 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.756396 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:53:57.757454 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.757433 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal"
Apr 16 19:53:57.767145 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:57.767117 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:53:58.220259 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.220228 2569 apiserver.go:52] "Watching apiserver"
Apr 16 19:53:58.231875 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.231845 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 19:53:58.234779 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.234744 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w","openshift-image-registry/node-ca-b88gw","openshift-multus/network-metrics-daemon-bkkfh","openshift-network-operator/iptables-alerter-flbrp","openshift-ovn-kubernetes/ovnkube-node-qt9lg","kube-system/konnectivity-agent-lb4zl","kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal","openshift-cluster-node-tuning-operator/tuned-jhjgp","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal","openshift-multus/multus-additional-cni-plugins-ftsjl","openshift-multus/multus-l22xw","openshift-network-diagnostics/network-check-target-dk2hk"]
Apr 16 19:53:58.237668 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.237644 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.238093 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.237889 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b88gw"
Apr 16 19:53:58.238957 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.238934 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh"
Apr 16 19:53:58.239097 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:58.239071 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2"
Apr 16 19:53:58.240330 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.240131 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 19:53:58.240330 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.240143 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 19:53:58.240330 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.240138 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 19:53:58.240330 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.240264 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6fbqq\""
Apr 16 19:53:58.240564 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.240449 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 19:53:58.240698 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.240679 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rklwx\""
Apr 16 19:53:58.240765 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.240688 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 19:53:58.240765 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.240751 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 19:53:58.242554 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.242519 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-flbrp"
Apr 16 19:53:58.242659 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.242644 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.243904 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.243888 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lb4zl"
Apr 16 19:53:58.244878 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.244841 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 19:53:58.244978 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.244942 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 19:53:58.245278 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.245120 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 19:53:58.245278 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.245139 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:53:58.245278 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.245214 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 19:53:58.245278 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.245275 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nqwng\""
Apr 16 19:53:58.245480 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.245401 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 19:53:58.245480 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.245447 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.245805 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.245784 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 19:53:58.246576 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.246557 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 19:53:58.246725 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.246677 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 19:53:58.246725 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.246694 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-s7ssk\""
Apr 16 19:53:58.246725 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.246557 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 19:53:58.246923 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.246748 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 19:53:58.247835 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.247540 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jm84x\""
Apr 16 19:53:58.247835 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.247551 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:53:58.247835 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.247567 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 19:53:58.247835 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.247803 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-k2m7p\""
Apr 16 19:53:58.249064 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.248988 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ftsjl"
Apr 16 19:53:58.249329 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.249310 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.252480 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.251200 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk"
Apr 16 19:53:58.252480 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:58.251306 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf"
Apr 16 19:53:58.252480 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.251371 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 19:53:58.252480 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.251576 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 19:53:58.252480 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.251635 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 19:53:58.252480 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.251745 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 19:53:58.252480 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.251936 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 19:53:58.252480 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.252000 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 19:53:58.253826 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.253796 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-ts624\""
Apr 16 19:53:58.253913 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.253804 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qnz9r\""
Apr 16 19:53:58.254053 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254024 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-sysctl-conf\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.254117 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254077 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-var-lib-kubelet\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.254153 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254107 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-registration-dir\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.254562 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254520 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs\") pod \"network-metrics-daemon-bkkfh\" (UID: \"ae8d7aa1-7c00-44df-9570-4435defaddc2\") " pod="openshift-multus/network-metrics-daemon-bkkfh"
Apr 16 19:53:58.254644 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254576 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9067034-59b4-4deb-b07b-6f07d382142d-tmp\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.254644 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254602 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b29cde9b-13b6-47a2-bdbe-68511210fa54-iptables-alerter-script\") pod \"iptables-alerter-flbrp\" (UID: \"b29cde9b-13b6-47a2-bdbe-68511210fa54\") " pod="openshift-network-operator/iptables-alerter-flbrp"
Apr 16 19:53:58.254748 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254648 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-run-systemd\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.254748 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254689 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-etc-openvswitch\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.254748 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254724 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-run-openvswitch\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.254893 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254749 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-run-ovn-kubernetes\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.254893 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254787 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-sysctl-d\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.254893 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254813 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-lib-modules\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.254893 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254850 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b29cde9b-13b6-47a2-bdbe-68511210fa54-host-slash\") pod \"iptables-alerter-flbrp\" (UID: \"b29cde9b-13b6-47a2-bdbe-68511210fa54\") " pod="openshift-network-operator/iptables-alerter-flbrp"
Apr 16 19:53:58.255107 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254896 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05bff05d-2c89-41be-b7ea-03dd408b9294-ovnkube-config\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.255107 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254932 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/267f5c95-39db-40d0-a78a-839da0347dfc-konnectivity-ca\") pod \"konnectivity-agent-lb4zl\" (UID: \"267f5c95-39db-40d0-a78a-839da0347dfc\") " pod="kube-system/konnectivity-agent-lb4zl"
Apr 16 19:53:58.255107 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254973 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe1490d8-1e7e-4156-9765-d2fec8a38446-host\") pod \"node-ca-b88gw\" (UID: \"fe1490d8-1e7e-4156-9765-d2fec8a38446\") " pod="openshift-image-registry/node-ca-b88gw"
Apr 16 19:53:58.255107 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.254999 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b9067034-59b4-4deb-b07b-6f07d382142d-etc-tuned\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.255107 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255069 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-socket-dir\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.255107 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255093 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-cni-netd\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.255364 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255133 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/05bff05d-2c89-41be-b7ea-03dd408b9294-ovnkube-script-lib\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.255364 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255164 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxdzw\" (UniqueName: \"kubernetes.io/projected/05bff05d-2c89-41be-b7ea-03dd408b9294-kube-api-access-pxdzw\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.255364 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255187 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/267f5c95-39db-40d0-a78a-839da0347dfc-agent-certs\") pod \"konnectivity-agent-lb4zl\" (UID: \"267f5c95-39db-40d0-a78a-839da0347dfc\") " pod="kube-system/konnectivity-agent-lb4zl"
Apr 16 19:53:58.255364 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255209 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-run\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.255364 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255229 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-sys\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.255364 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255250 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.255364 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255273 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-device-dir\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.255364 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255356 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtdqd\" (UniqueName: \"kubernetes.io/projected/b29cde9b-13b6-47a2-bdbe-68511210fa54-kube-api-access-rtdqd\") pod \"iptables-alerter-flbrp\" (UID: \"b29cde9b-13b6-47a2-bdbe-68511210fa54\") " pod="openshift-network-operator/iptables-alerter-flbrp"
Apr 16 19:53:58.255702 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255405 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-systemd-units\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.255702 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255433 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-run-netns\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.255702 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255458 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-sysconfig\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.255702 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255480 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-var-lib-openvswitch\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.255702 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255502 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-node-log\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.255702 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255527 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-log-socket\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.255702 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255549 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-cni-bin\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.255702 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255579 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.255702 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255607 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05bff05d-2c89-41be-b7ea-03dd408b9294-env-overrides\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.255702 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255629 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05bff05d-2c89-41be-b7ea-03dd408b9294-ovn-node-metrics-cert\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.255702 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255651 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-systemd\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.255702 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255675 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6ghw\" (UniqueName: \"kubernetes.io/projected/fe1490d8-1e7e-4156-9765-d2fec8a38446-kube-api-access-m6ghw\") pod \"node-ca-b88gw\" (UID: \"fe1490d8-1e7e-4156-9765-d2fec8a38446\") " pod="openshift-image-registry/node-ca-b88gw"
Apr 16 19:53:58.256181 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255697 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-host\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.256181 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255736 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxbw2\" (UniqueName: \"kubernetes.io/projected/b9067034-59b4-4deb-b07b-6f07d382142d-kube-api-access-mxbw2\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.256181 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255761 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-sys-fs\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.256181 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255795 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm8cg\" (UniqueName: \"kubernetes.io/projected/db66438e-828b-4415-9a27-8eb8615c2db5-kube-api-access-tm8cg\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.256181 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255820 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-run-ovn\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.256181 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255841 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-kubernetes\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.256181 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255864 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-kubelet\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.256181 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255888 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-slash\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.256181 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255936 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-etc-selinux\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.256181 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255968 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fe1490d8-1e7e-4156-9765-d2fec8a38446-serviceca\") pod \"node-ca-b88gw\" (UID: \"fe1490d8-1e7e-4156-9765-d2fec8a38446\") " pod="openshift-image-registry/node-ca-b88gw"
Apr 16 19:53:58.256181 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.255995 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv9h5\" (UniqueName: \"kubernetes.io/projected/ae8d7aa1-7c00-44df-9570-4435defaddc2-kube-api-access-lv9h5\") pod \"network-metrics-daemon-bkkfh\" (UID: \"ae8d7aa1-7c00-44df-9570-4435defaddc2\") " pod="openshift-multus/network-metrics-daemon-bkkfh"
Apr 16 19:53:58.256181 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.256040 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-modprobe-d\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.304228 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.304198 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:57 +0000 UTC" deadline="2027-12-26 02:46:51.780436678 +0000 UTC"
Apr 16 19:53:58.304228 ip-10-0-138-142 kubenswrapper[2569]: I0416
19:53:58.304221 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14838h52m53.47621737s" Apr 16 19:53:58.345121 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.345098 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 19:53:58.356701 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.356678 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-run-netns\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.356831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.356713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05bff05d-2c89-41be-b7ea-03dd408b9294-ovn-node-metrics-cert\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.356831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.356745 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7czr\" (UniqueName: \"kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr\") pod \"network-check-target-dk2hk\" (UID: \"33559bf7-25ac-4de7-a712-253f87279cbf\") " pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:53:58.356831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.356772 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-run-k8s-cni-cncf-io\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " 
pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.356831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.356795 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-run-netns\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.356831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.356799 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-sysconfig\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" Apr 16 19:53:58.357078 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.356856 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-sysconfig\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" Apr 16 19:53:58.357078 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.356854 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-log-socket\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.357078 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.356900 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-log-socket\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.357078 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.356940 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-cni-bin\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.357078 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.356977 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-cni-bin\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.357078 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.356977 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0053bd34-312c-4064-8485-b10a2b3b16d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.357078 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357031 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6ghw\" (UniqueName: \"kubernetes.io/projected/fe1490d8-1e7e-4156-9765-d2fec8a38446-kube-api-access-m6ghw\") pod \"node-ca-b88gw\" (UID: \"fe1490d8-1e7e-4156-9765-d2fec8a38446\") " pod="openshift-image-registry/node-ca-b88gw" Apr 16 19:53:58.357078 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357056 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxbw2\" (UniqueName: \"kubernetes.io/projected/b9067034-59b4-4deb-b07b-6f07d382142d-kube-api-access-mxbw2\") pod 
\"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" Apr 16 19:53:58.357078 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357080 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-sys-fs\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357106 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0053bd34-312c-4064-8485-b10a2b3b16d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357136 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-kubernetes\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357129 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357172 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-kubelet\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357197 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-cnibin\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357222 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-hostroot\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357246 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-etc-kubernetes\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357273 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/05bff05d-2c89-41be-b7ea-03dd408b9294-ovnkube-script-lib\") pod 
\"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357279 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-kubernetes\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357299 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxdzw\" (UniqueName: \"kubernetes.io/projected/05bff05d-2c89-41be-b7ea-03dd408b9294-kube-api-access-pxdzw\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357325 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/267f5c95-39db-40d0-a78a-839da0347dfc-agent-certs\") pod \"konnectivity-agent-lb4zl\" (UID: \"267f5c95-39db-40d0-a78a-839da0347dfc\") " pod="kube-system/konnectivity-agent-lb4zl" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357333 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-sys-fs\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fe1490d8-1e7e-4156-9765-d2fec8a38446-serviceca\") 
pod \"node-ca-b88gw\" (UID: \"fe1490d8-1e7e-4156-9765-d2fec8a38446\") " pod="openshift-image-registry/node-ca-b88gw" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357394 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-sysctl-conf\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357427 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-var-lib-kubelet\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357449 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-registration-dir\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357474 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-run-systemd\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.357490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357498 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-etc-openvswitch\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357507 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-kubelet\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357522 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-run-ovn-kubernetes\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357567 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0053bd34-312c-4064-8485-b10a2b3b16d7-cnibin\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357594 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0053bd34-312c-4064-8485-b10a2b3b16d7-os-release\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357622 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snns7\" (UniqueName: \"kubernetes.io/projected/0053bd34-312c-4064-8485-b10a2b3b16d7-kube-api-access-snns7\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357628 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-etc-openvswitch\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357645 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-os-release\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357594 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-sysctl-conf\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357664 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-run-systemd\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.358217 ip-10-0-138-142 
kubenswrapper[2569]: I0416 19:53:58.357703 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-run-multus-certs\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357729 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-sysctl-d\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357752 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b29cde9b-13b6-47a2-bdbe-68511210fa54-host-slash\") pod \"iptables-alerter-flbrp\" (UID: \"b29cde9b-13b6-47a2-bdbe-68511210fa54\") " pod="openshift-network-operator/iptables-alerter-flbrp" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357775 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0053bd34-312c-4064-8485-b10a2b3b16d7-system-cni-dir\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357795 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b9067034-59b4-4deb-b07b-6f07d382142d-etc-tuned\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357817 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-sys\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357845 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-systemd-units\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.358217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357851 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fe1490d8-1e7e-4156-9765-d2fec8a38446-serviceca\") pod \"node-ca-b88gw\" (UID: \"fe1490d8-1e7e-4156-9765-d2fec8a38446\") " pod="openshift-image-registry/node-ca-b88gw" Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357894 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-var-lib-kubelet\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357937 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-registration-dir\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w" Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357946 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/05bff05d-2c89-41be-b7ea-03dd408b9294-ovnkube-script-lib\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.357950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-run-ovn-kubernetes\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358007 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-sys\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358048 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05bff05d-2c89-41be-b7ea-03dd408b9294-env-overrides\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358062 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b29cde9b-13b6-47a2-bdbe-68511210fa54-host-slash\") pod \"iptables-alerter-flbrp\" (UID: \"b29cde9b-13b6-47a2-bdbe-68511210fa54\") " 
pod="openshift-network-operator/iptables-alerter-flbrp" Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358111 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-var-lib-openvswitch\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358138 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-node-log\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358178 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358222 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0053bd34-312c-4064-8485-b10a2b3b16d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358251 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-systemd\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358277 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-host\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358337 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tm8cg\" (UniqueName: \"kubernetes.io/projected/db66438e-828b-4415-9a27-8eb8615c2db5-kube-api-access-tm8cg\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358364 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-run-ovn\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358391 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-var-lib-cni-bin\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.359116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358430 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-multus-conf-dir\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358432 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-systemd-units\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358449 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05bff05d-2c89-41be-b7ea-03dd408b9294-env-overrides\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358474 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-slash\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358484 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-host\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358526 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-var-lib-cni-multus\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358556 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vrdf\" (UniqueName: \"kubernetes.io/projected/3609ef37-ae6e-4910-8c8f-420611d9ef42-kube-api-access-5vrdf\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358562 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358592 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-etc-selinux\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358613 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-systemd\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358634 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-node-log\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358646 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-slash\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358531 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-run-ovn\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358667 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-var-lib-openvswitch\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358708 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lv9h5\" (UniqueName: \"kubernetes.io/projected/ae8d7aa1-7c00-44df-9570-4435defaddc2-kube-api-access-lv9h5\") pod \"network-metrics-daemon-bkkfh\" (UID: \"ae8d7aa1-7c00-44df-9570-4435defaddc2\") " pod="openshift-multus/network-metrics-daemon-bkkfh"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358736 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-etc-selinux\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358760 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-modprobe-d\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.359928 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358833 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-sysctl-d\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358864 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-system-cni-dir\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358917 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs\") pod \"network-metrics-daemon-bkkfh\" (UID: \"ae8d7aa1-7c00-44df-9570-4435defaddc2\") " pod="openshift-multus/network-metrics-daemon-bkkfh"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358929 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-etc-modprobe-d\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.358964 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9067034-59b4-4deb-b07b-6f07d382142d-tmp\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:58.359003 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359003 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b29cde9b-13b6-47a2-bdbe-68511210fa54-iptables-alerter-script\") pod \"iptables-alerter-flbrp\" (UID: \"b29cde9b-13b6-47a2-bdbe-68511210fa54\") " pod="openshift-network-operator/iptables-alerter-flbrp"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359051 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-run-openvswitch\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359102 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-run-openvswitch\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:58.359107 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs podName:ae8d7aa1-7c00-44df-9570-4435defaddc2 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:58.859070762 +0000 UTC m=+3.059780824 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs") pod "network-metrics-daemon-bkkfh" (UID: "ae8d7aa1-7c00-44df-9570-4435defaddc2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359136 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-multus-cni-dir\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359163 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-run-netns\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359189 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-var-lib-kubelet\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359216 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-lib-modules\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359241 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05bff05d-2c89-41be-b7ea-03dd408b9294-ovnkube-config\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359322 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/267f5c95-39db-40d0-a78a-839da0347dfc-konnectivity-ca\") pod \"konnectivity-agent-lb4zl\" (UID: \"267f5c95-39db-40d0-a78a-839da0347dfc\") " pod="kube-system/konnectivity-agent-lb4zl"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359370 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0053bd34-312c-4064-8485-b10a2b3b16d7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl"
Apr 16 19:53:58.360783 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359401 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe1490d8-1e7e-4156-9765-d2fec8a38446-host\") pod \"node-ca-b88gw\" (UID: \"fe1490d8-1e7e-4156-9765-d2fec8a38446\") " pod="openshift-image-registry/node-ca-b88gw"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359428 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-socket-dir\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359456 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-cni-netd\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359499 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b29cde9b-13b6-47a2-bdbe-68511210fa54-iptables-alerter-script\") pod \"iptables-alerter-flbrp\" (UID: \"b29cde9b-13b6-47a2-bdbe-68511210fa54\") " pod="openshift-network-operator/iptables-alerter-flbrp"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359516 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3609ef37-ae6e-4910-8c8f-420611d9ef42-cni-binary-copy\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359540 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-multus-socket-dir-parent\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359564 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3609ef37-ae6e-4910-8c8f-420611d9ef42-multus-daemon-config\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359589 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-run\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359599 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-lib-modules\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359615 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359633 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05bff05d-2c89-41be-b7ea-03dd408b9294-host-cni-netd\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359650 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-device-dir\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359674 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05bff05d-2c89-41be-b7ea-03dd408b9294-ovnkube-config\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359724 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-device-dir\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359769 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe1490d8-1e7e-4156-9765-d2fec8a38446-host\") pod \"node-ca-b88gw\" (UID: \"fe1490d8-1e7e-4156-9765-d2fec8a38446\") " pod="openshift-image-registry/node-ca-b88gw"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359816 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9067034-59b4-4deb-b07b-6f07d382142d-run\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359820 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-socket-dir\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.361575 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359858 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db66438e-828b-4415-9a27-8eb8615c2db5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.362375 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.359949 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtdqd\" (UniqueName: \"kubernetes.io/projected/b29cde9b-13b6-47a2-bdbe-68511210fa54-kube-api-access-rtdqd\") pod \"iptables-alerter-flbrp\" (UID: \"b29cde9b-13b6-47a2-bdbe-68511210fa54\") " pod="openshift-network-operator/iptables-alerter-flbrp"
Apr 16 19:53:58.362375 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.360818 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b9067034-59b4-4deb-b07b-6f07d382142d-etc-tuned\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.362375 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.360911 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/267f5c95-39db-40d0-a78a-839da0347dfc-konnectivity-ca\") pod \"konnectivity-agent-lb4zl\" (UID: \"267f5c95-39db-40d0-a78a-839da0347dfc\") " pod="kube-system/konnectivity-agent-lb4zl"
Apr 16 19:53:58.362375 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.361000 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05bff05d-2c89-41be-b7ea-03dd408b9294-ovn-node-metrics-cert\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.362375 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.361523 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/267f5c95-39db-40d0-a78a-839da0347dfc-agent-certs\") pod \"konnectivity-agent-lb4zl\" (UID: \"267f5c95-39db-40d0-a78a-839da0347dfc\") " pod="kube-system/konnectivity-agent-lb4zl"
Apr 16 19:53:58.362375 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.362200 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9067034-59b4-4deb-b07b-6f07d382142d-tmp\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.373846 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.373822 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxdzw\" (UniqueName: \"kubernetes.io/projected/05bff05d-2c89-41be-b7ea-03dd408b9294-kube-api-access-pxdzw\") pod \"ovnkube-node-qt9lg\" (UID: \"05bff05d-2c89-41be-b7ea-03dd408b9294\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:53:58.373978 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.373892 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6ghw\" (UniqueName: \"kubernetes.io/projected/fe1490d8-1e7e-4156-9765-d2fec8a38446-kube-api-access-m6ghw\") pod \"node-ca-b88gw\" (UID: \"fe1490d8-1e7e-4156-9765-d2fec8a38446\") " pod="openshift-image-registry/node-ca-b88gw"
Apr 16 19:53:58.374102 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.374054 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxbw2\" (UniqueName: \"kubernetes.io/projected/b9067034-59b4-4deb-b07b-6f07d382142d-kube-api-access-mxbw2\") pod \"tuned-jhjgp\" (UID: \"b9067034-59b4-4deb-b07b-6f07d382142d\") " pod="openshift-cluster-node-tuning-operator/tuned-jhjgp"
Apr 16 19:53:58.374198 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.374132 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv9h5\" (UniqueName: \"kubernetes.io/projected/ae8d7aa1-7c00-44df-9570-4435defaddc2-kube-api-access-lv9h5\") pod \"network-metrics-daemon-bkkfh\" (UID: \"ae8d7aa1-7c00-44df-9570-4435defaddc2\") " pod="openshift-multus/network-metrics-daemon-bkkfh"
Apr 16 19:53:58.375234 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.375155 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm8cg\" (UniqueName: \"kubernetes.io/projected/db66438e-828b-4415-9a27-8eb8615c2db5-kube-api-access-tm8cg\") pod \"aws-ebs-csi-driver-node-45p7w\" (UID: \"db66438e-828b-4415-9a27-8eb8615c2db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w"
Apr 16 19:53:58.375714 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.375691 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtdqd\" (UniqueName: \"kubernetes.io/projected/b29cde9b-13b6-47a2-bdbe-68511210fa54-kube-api-access-rtdqd\") pod \"iptables-alerter-flbrp\" (UID: \"b29cde9b-13b6-47a2-bdbe-68511210fa54\") " pod="openshift-network-operator/iptables-alerter-flbrp"
Apr 16 19:53:58.461175 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461143 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0053bd34-312c-4064-8485-b10a2b3b16d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl"
Apr 16 19:53:58.461313 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461188 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-var-lib-cni-bin\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461313 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461207 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-multus-conf-dir\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461313 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461230 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-var-lib-cni-multus\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461313 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461265 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-var-lib-cni-bin\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461313 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461273 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-var-lib-cni-multus\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461313 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461298 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vrdf\" (UniqueName: \"kubernetes.io/projected/3609ef37-ae6e-4910-8c8f-420611d9ef42-kube-api-access-5vrdf\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461313 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461294 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-multus-conf-dir\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461589 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461320 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-system-cni-dir\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461589 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461354 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-multus-cni-dir\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461589 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461382 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-run-netns\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461589 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461409 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-var-lib-kubelet\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461589 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461443 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-run-netns\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461589 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461439 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0053bd34-312c-4064-8485-b10a2b3b16d7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl"
Apr 16 19:53:58.461589 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461482 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3609ef37-ae6e-4910-8c8f-420611d9ef42-cni-binary-copy\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461589 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461507 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-multus-socket-dir-parent\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461589 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461528 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-multus-cni-dir\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461589 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461531 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3609ef37-ae6e-4910-8c8f-420611d9ef42-multus-daemon-config\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.461589 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461590 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7czr\" (UniqueName: \"kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr\") pod \"network-check-target-dk2hk\" (UID: \"33559bf7-25ac-4de7-a712-253f87279cbf\") " pod="openshift-network-diagnostics/network-check-target-dk2hk"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461620 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-run-k8s-cni-cncf-io\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461650 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0053bd34-312c-4064-8485-b10a2b3b16d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461678 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0053bd34-312c-4064-8485-b10a2b3b16d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461705 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-cnibin\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461721 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-run-k8s-cni-cncf-io\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461727 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-hostroot\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461761 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-hostroot\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461761 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0053bd34-312c-4064-8485-b10a2b3b16d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461410 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-system-cni-dir\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461782 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-etc-kubernetes\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461800 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-multus-socket-dir-parent\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461814 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-var-lib-kubelet\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461818 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-etc-kubernetes\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461832 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0053bd34-312c-4064-8485-b10a2b3b16d7-cnibin\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461862 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0053bd34-312c-4064-8485-b10a2b3b16d7-os-release\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl"
Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461867 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0053bd34-312c-4064-8485-b10a2b3b16d7-cnibin\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " 
pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461868 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-cnibin\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.462136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461869 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0053bd34-312c-4064-8485-b10a2b3b16d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.462719 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461904 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0053bd34-312c-4064-8485-b10a2b3b16d7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.462719 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461928 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0053bd34-312c-4064-8485-b10a2b3b16d7-os-release\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.462719 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461905 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snns7\" (UniqueName: \"kubernetes.io/projected/0053bd34-312c-4064-8485-b10a2b3b16d7-kube-api-access-snns7\") 
pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.462719 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461967 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-os-release\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.462719 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.461994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-run-multus-certs\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.462719 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.462040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0053bd34-312c-4064-8485-b10a2b3b16d7-system-cni-dir\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.462719 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.462042 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-os-release\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.462719 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.462058 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/3609ef37-ae6e-4910-8c8f-420611d9ef42-multus-daemon-config\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.462719 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.462062 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3609ef37-ae6e-4910-8c8f-420611d9ef42-host-run-multus-certs\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.462719 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.462120 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0053bd34-312c-4064-8485-b10a2b3b16d7-system-cni-dir\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.462719 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.462273 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0053bd34-312c-4064-8485-b10a2b3b16d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.462719 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.462318 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3609ef37-ae6e-4910-8c8f-420611d9ef42-cni-binary-copy\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.469387 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:58.469360 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:58.469387 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:58.469389 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:58.469544 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:58.469400 2569 projected.go:194] Error preparing data for projected volume kube-api-access-h7czr for pod openshift-network-diagnostics/network-check-target-dk2hk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:58.469544 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:58.469465 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr podName:33559bf7-25ac-4de7-a712-253f87279cbf nodeName:}" failed. No retries permitted until 2026-04-16 19:53:58.969451155 +0000 UTC m=+3.170161211 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h7czr" (UniqueName: "kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr") pod "network-check-target-dk2hk" (UID: "33559bf7-25ac-4de7-a712-253f87279cbf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:58.470742 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.470697 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vrdf\" (UniqueName: \"kubernetes.io/projected/3609ef37-ae6e-4910-8c8f-420611d9ef42-kube-api-access-5vrdf\") pod \"multus-l22xw\" (UID: \"3609ef37-ae6e-4910-8c8f-420611d9ef42\") " pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.471797 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.471773 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snns7\" (UniqueName: \"kubernetes.io/projected/0053bd34-312c-4064-8485-b10a2b3b16d7-kube-api-access-snns7\") pod \"multus-additional-cni-plugins-ftsjl\" (UID: \"0053bd34-312c-4064-8485-b10a2b3b16d7\") " pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.549568 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.549533 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w" Apr 16 19:53:58.556950 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.556929 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b88gw" Apr 16 19:53:58.559400 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.559378 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:58.565531 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.565514 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-flbrp" Apr 16 19:53:58.570199 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.570180 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:53:58.576799 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.576784 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lb4zl" Apr 16 19:53:58.583377 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.583361 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" Apr 16 19:53:58.589954 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.589938 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ftsjl" Apr 16 19:53:58.594498 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.594480 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-l22xw" Apr 16 19:53:58.864394 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:58.864315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs\") pod \"network-metrics-daemon-bkkfh\" (UID: \"ae8d7aa1-7c00-44df-9570-4435defaddc2\") " pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:53:58.864544 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:58.864467 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:58.864586 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:58.864551 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs podName:ae8d7aa1-7c00-44df-9570-4435defaddc2 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:59.864531026 +0000 UTC m=+4.065241086 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs") pod "network-metrics-daemon-bkkfh" (UID: "ae8d7aa1-7c00-44df-9570-4435defaddc2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:58.939780 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:58.939741 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0053bd34_312c_4064_8485_b10a2b3b16d7.slice/crio-6e3cb2a1033b5ffa8f72a417b1b82343175d8878a8118becd11edd5da3c51595 WatchSource:0}: Error finding container 6e3cb2a1033b5ffa8f72a417b1b82343175d8878a8118becd11edd5da3c51595: Status 404 returned error can't find the container with id 6e3cb2a1033b5ffa8f72a417b1b82343175d8878a8118becd11edd5da3c51595 Apr 16 19:53:58.940561 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:58.940529 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe1490d8_1e7e_4156_9765_d2fec8a38446.slice/crio-e711b0591598205f8f506c491a9e9534224b0bfffd4e27b983978e0fab154e06 WatchSource:0}: Error finding container e711b0591598205f8f506c491a9e9534224b0bfffd4e27b983978e0fab154e06: Status 404 returned error can't find the container with id e711b0591598205f8f506c491a9e9534224b0bfffd4e27b983978e0fab154e06 Apr 16 19:53:58.946399 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:58.946376 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb66438e_828b_4415_9a27_8eb8615c2db5.slice/crio-546ae5ab52a34a59c7dad94c05d99064411ec4a405ef187e2cd7c968a8b36837 WatchSource:0}: Error finding container 546ae5ab52a34a59c7dad94c05d99064411ec4a405ef187e2cd7c968a8b36837: Status 404 returned error can't find the container with id 546ae5ab52a34a59c7dad94c05d99064411ec4a405ef187e2cd7c968a8b36837 Apr 16 19:53:58.947853 
ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:58.947828 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05bff05d_2c89_41be_b7ea_03dd408b9294.slice/crio-da4bec1b60c65395807bdc63e309027640efc1fe8d26f3a19369135ee6428631 WatchSource:0}: Error finding container da4bec1b60c65395807bdc63e309027640efc1fe8d26f3a19369135ee6428631: Status 404 returned error can't find the container with id da4bec1b60c65395807bdc63e309027640efc1fe8d26f3a19369135ee6428631 Apr 16 19:53:58.948609 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:58.948587 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3609ef37_ae6e_4910_8c8f_420611d9ef42.slice/crio-d0c6de98eec154855cf61b105417bef6389e072a711d1e1f3d20adbcf9e7354a WatchSource:0}: Error finding container d0c6de98eec154855cf61b105417bef6389e072a711d1e1f3d20adbcf9e7354a: Status 404 returned error can't find the container with id d0c6de98eec154855cf61b105417bef6389e072a711d1e1f3d20adbcf9e7354a Apr 16 19:53:58.949719 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:53:58.949679 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod267f5c95_39db_40d0_a78a_839da0347dfc.slice/crio-2422fa5855f15203ca16fa3b6e9268eb344f376b568f4bff85b7ab4e2f1d10af WatchSource:0}: Error finding container 2422fa5855f15203ca16fa3b6e9268eb344f376b568f4bff85b7ab4e2f1d10af: Status 404 returned error can't find the container with id 2422fa5855f15203ca16fa3b6e9268eb344f376b568f4bff85b7ab4e2f1d10af Apr 16 19:53:59.066115 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:59.065943 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7czr\" (UniqueName: \"kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr\") pod \"network-check-target-dk2hk\" (UID: 
\"33559bf7-25ac-4de7-a712-253f87279cbf\") " pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:53:59.066115 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:59.066100 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:59.066115 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:59.066118 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:59.066305 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:59.066128 2569 projected.go:194] Error preparing data for projected volume kube-api-access-h7czr for pod openshift-network-diagnostics/network-check-target-dk2hk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:59.066305 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:59.066176 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr podName:33559bf7-25ac-4de7-a712-253f87279cbf nodeName:}" failed. No retries permitted until 2026-04-16 19:54:00.066163189 +0000 UTC m=+4.266873251 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h7czr" (UniqueName: "kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr") pod "network-check-target-dk2hk" (UID: "33559bf7-25ac-4de7-a712-253f87279cbf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:59.305112 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:59.304962 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:57 +0000 UTC" deadline="2028-01-01 00:45:32.565896488 +0000 UTC" Apr 16 19:53:59.305112 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:59.305002 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14980h51m33.260898298s" Apr 16 19:53:59.386993 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:59.386260 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" event={"ID":"027943e939a2d76cdb600f777d89968b","Type":"ContainerStarted","Data":"903bd39b47fb73fcc6bd89a9bd90f7ec7a1f5db711da9e3717ce06bd06702855"} Apr 16 19:53:59.391710 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:59.391650 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" event={"ID":"05bff05d-2c89-41be-b7ea-03dd408b9294","Type":"ContainerStarted","Data":"da4bec1b60c65395807bdc63e309027640efc1fe8d26f3a19369135ee6428631"} Apr 16 19:53:59.394974 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:59.394940 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w" event={"ID":"db66438e-828b-4415-9a27-8eb8615c2db5","Type":"ContainerStarted","Data":"546ae5ab52a34a59c7dad94c05d99064411ec4a405ef187e2cd7c968a8b36837"} Apr 16 19:53:59.403903 
ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:59.403864 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b88gw" event={"ID":"fe1490d8-1e7e-4156-9765-d2fec8a38446","Type":"ContainerStarted","Data":"e711b0591598205f8f506c491a9e9534224b0bfffd4e27b983978e0fab154e06"} Apr 16 19:53:59.411245 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:59.411212 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftsjl" event={"ID":"0053bd34-312c-4064-8485-b10a2b3b16d7","Type":"ContainerStarted","Data":"6e3cb2a1033b5ffa8f72a417b1b82343175d8878a8118becd11edd5da3c51595"} Apr 16 19:53:59.417033 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:59.416947 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lb4zl" event={"ID":"267f5c95-39db-40d0-a78a-839da0347dfc","Type":"ContainerStarted","Data":"2422fa5855f15203ca16fa3b6e9268eb344f376b568f4bff85b7ab4e2f1d10af"} Apr 16 19:53:59.426489 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:59.426413 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-l22xw" event={"ID":"3609ef37-ae6e-4910-8c8f-420611d9ef42","Type":"ContainerStarted","Data":"d0c6de98eec154855cf61b105417bef6389e072a711d1e1f3d20adbcf9e7354a"} Apr 16 19:53:59.430183 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:59.430152 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" event={"ID":"b9067034-59b4-4deb-b07b-6f07d382142d","Type":"ContainerStarted","Data":"7832f9c0011636ebd86c3fdadb7308d89f621e174c8a7e3ee3051bafa30b2180"} Apr 16 19:53:59.436303 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:59.436269 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-flbrp" 
event={"ID":"b29cde9b-13b6-47a2-bdbe-68511210fa54","Type":"ContainerStarted","Data":"7d453bc2d5efd27e3429dd04793b75d9f21ba7ccec04373a5ef8f74c134769a4"} Apr 16 19:53:59.883748 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:53:59.882455 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs\") pod \"network-metrics-daemon-bkkfh\" (UID: \"ae8d7aa1-7c00-44df-9570-4435defaddc2\") " pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:53:59.883748 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:59.882609 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:59.883748 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:53:59.882681 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs podName:ae8d7aa1-7c00-44df-9570-4435defaddc2 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:01.882659969 +0000 UTC m=+6.083370031 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs") pod "network-metrics-daemon-bkkfh" (UID: "ae8d7aa1-7c00-44df-9570-4435defaddc2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:00.084977 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:00.084366 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7czr\" (UniqueName: \"kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr\") pod \"network-check-target-dk2hk\" (UID: \"33559bf7-25ac-4de7-a712-253f87279cbf\") " pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:00.084977 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:00.084520 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:00.084977 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:00.084540 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:00.084977 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:00.084552 2569 projected.go:194] Error preparing data for projected volume kube-api-access-h7czr for pod openshift-network-diagnostics/network-check-target-dk2hk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:00.084977 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:00.084610 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr podName:33559bf7-25ac-4de7-a712-253f87279cbf nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:02.084592493 +0000 UTC m=+6.285302552 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-h7czr" (UniqueName: "kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr") pod "network-check-target-dk2hk" (UID: "33559bf7-25ac-4de7-a712-253f87279cbf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:00.370040 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:00.369994 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:00.370460 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:00.370137 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:00.370606 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:00.370589 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:00.370716 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:00.370695 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:00.445681 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:00.445646 2569 generic.go:358] "Generic (PLEG): container finished" podID="1427eed4f2cd472349fa20b6f1cf215c" containerID="39d53ed4ba594444731ff43e3d2c008a40b893f00154ec7e6d0f432990bd1d8b" exitCode=0 Apr 16 19:54:00.446529 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:00.446481 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" event={"ID":"1427eed4f2cd472349fa20b6f1cf215c","Type":"ContainerDied","Data":"39d53ed4ba594444731ff43e3d2c008a40b893f00154ec7e6d0f432990bd1d8b"} Apr 16 19:54:00.461116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:00.461075 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" podStartSLOduration=3.46106163 podStartE2EDuration="3.46106163s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:53:59.422491501 +0000 UTC m=+3.623201579" watchObservedRunningTime="2026-04-16 19:54:00.46106163 +0000 UTC m=+4.661771704" Apr 16 19:54:01.451912 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:01.451871 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" event={"ID":"1427eed4f2cd472349fa20b6f1cf215c","Type":"ContainerStarted","Data":"2b67605232f2b772f769882f2a6c465a8677e491743ffb6968d7a61bc7238be4"} Apr 16 19:54:01.900520 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:01.900434 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs\") pod 
\"network-metrics-daemon-bkkfh\" (UID: \"ae8d7aa1-7c00-44df-9570-4435defaddc2\") " pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:01.900709 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:01.900685 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:01.900780 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:01.900767 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs podName:ae8d7aa1-7c00-44df-9570-4435defaddc2 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:05.900744526 +0000 UTC m=+10.101454597 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs") pod "network-metrics-daemon-bkkfh" (UID: "ae8d7aa1-7c00-44df-9570-4435defaddc2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:02.101794 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:02.101758 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7czr\" (UniqueName: \"kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr\") pod \"network-check-target-dk2hk\" (UID: \"33559bf7-25ac-4de7-a712-253f87279cbf\") " pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:02.101971 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:02.101944 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:02.101971 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:02.101964 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Apr 16 19:54:02.102155 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:02.101977 2569 projected.go:194] Error preparing data for projected volume kube-api-access-h7czr for pod openshift-network-diagnostics/network-check-target-dk2hk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:02.102155 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:02.102056 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr podName:33559bf7-25ac-4de7-a712-253f87279cbf nodeName:}" failed. No retries permitted until 2026-04-16 19:54:06.102038048 +0000 UTC m=+10.302748108 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-h7czr" (UniqueName: "kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr") pod "network-check-target-dk2hk" (UID: "33559bf7-25ac-4de7-a712-253f87279cbf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:02.368427 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:02.367097 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:02.368427 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:02.367235 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:02.368427 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:02.367102 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:02.368427 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:02.367671 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:04.367471 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:04.367427 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:04.367924 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:04.367569 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:04.367924 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:04.367661 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:04.367924 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:04.367808 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:05.936297 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:05.936262 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs\") pod \"network-metrics-daemon-bkkfh\" (UID: \"ae8d7aa1-7c00-44df-9570-4435defaddc2\") " pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:05.936725 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:05.936458 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:05.936725 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:05.936526 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs podName:ae8d7aa1-7c00-44df-9570-4435defaddc2 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:13.936506209 +0000 UTC m=+18.137216278 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs") pod "network-metrics-daemon-bkkfh" (UID: "ae8d7aa1-7c00-44df-9570-4435defaddc2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:06.138241 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:06.138099 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7czr\" (UniqueName: \"kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr\") pod \"network-check-target-dk2hk\" (UID: \"33559bf7-25ac-4de7-a712-253f87279cbf\") " pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:06.138413 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:06.138261 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:06.138413 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:06.138279 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:06.138413 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:06.138291 2569 projected.go:194] Error preparing data for projected volume kube-api-access-h7czr for pod openshift-network-diagnostics/network-check-target-dk2hk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:06.138413 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:06.138352 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr podName:33559bf7-25ac-4de7-a712-253f87279cbf nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:14.138335132 +0000 UTC m=+18.339045189 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-h7czr" (UniqueName: "kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr") pod "network-check-target-dk2hk" (UID: "33559bf7-25ac-4de7-a712-253f87279cbf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:06.367780 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:06.367662 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:06.367947 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:06.367776 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:06.367947 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:06.367842 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:06.368087 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:06.367958 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:08.367404 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:08.367371 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:08.367839 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:08.367385 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:08.367839 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:08.367503 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:08.367839 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:08.367616 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:10.366807 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:10.366773 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:10.367278 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:10.366902 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:10.367278 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:10.366960 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:10.367278 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:10.367138 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:12.367658 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:12.367619 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:12.368136 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:12.367760 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:12.368136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:12.367820 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:12.368136 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:12.367928 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:13.995903 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:13.995869 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs\") pod \"network-metrics-daemon-bkkfh\" (UID: \"ae8d7aa1-7c00-44df-9570-4435defaddc2\") " pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:13.996333 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:13.995994 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:13.996333 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:13.996062 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs podName:ae8d7aa1-7c00-44df-9570-4435defaddc2 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.9960473 +0000 UTC m=+34.196757361 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs") pod "network-metrics-daemon-bkkfh" (UID: "ae8d7aa1-7c00-44df-9570-4435defaddc2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:14.197478 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:14.197440 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7czr\" (UniqueName: \"kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr\") pod \"network-check-target-dk2hk\" (UID: \"33559bf7-25ac-4de7-a712-253f87279cbf\") " pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:14.197613 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:14.197597 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:14.197668 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:14.197620 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:14.197668 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:14.197633 2569 projected.go:194] Error preparing data for projected volume kube-api-access-h7czr for pod openshift-network-diagnostics/network-check-target-dk2hk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:14.197729 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:14.197693 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr podName:33559bf7-25ac-4de7-a712-253f87279cbf nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:30.197675844 +0000 UTC m=+34.398385923 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-h7czr" (UniqueName: "kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr") pod "network-check-target-dk2hk" (UID: "33559bf7-25ac-4de7-a712-253f87279cbf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:14.367708 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:14.367629 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:14.367850 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:14.367639 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:14.367850 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:14.367757 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:14.367927 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:14.367871 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:16.368000 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.367669 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:16.368713 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.367714 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:16.368713 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:16.368097 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:16.368713 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:16.368164 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:16.479922 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.479839 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-l22xw" event={"ID":"3609ef37-ae6e-4910-8c8f-420611d9ef42","Type":"ContainerStarted","Data":"3671f001b3ac3a32db0b999e6a577c177c60aef9d5596de193eafb59ddb4d81c"} Apr 16 19:54:16.481398 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.481369 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" event={"ID":"b9067034-59b4-4deb-b07b-6f07d382142d","Type":"ContainerStarted","Data":"d02b45b887ebe4a789912b2460a06963fbf045dc4164342dbc17625f397ad49d"} Apr 16 19:54:16.486822 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.486799 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 19:54:16.487202 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.487181 2569 generic.go:358] "Generic (PLEG): container finished" podID="05bff05d-2c89-41be-b7ea-03dd408b9294" containerID="df975e3990c24988dfd410b46a72d335f0378379a19735dc77851110b8cf4d82" exitCode=1 Apr 16 19:54:16.487291 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.487251 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" event={"ID":"05bff05d-2c89-41be-b7ea-03dd408b9294","Type":"ContainerStarted","Data":"04bc4651177c9db3b46c9505edf85d8b014ce78e44226a340a027ad15d221b2f"} Apr 16 19:54:16.487291 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.487287 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" event={"ID":"05bff05d-2c89-41be-b7ea-03dd408b9294","Type":"ContainerStarted","Data":"20ac1e3b4366ee5b05f82b2fabe936d6713aff123695892ec6727ab3a18ec2a4"} Apr 16 
19:54:16.487406 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.487300 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" event={"ID":"05bff05d-2c89-41be-b7ea-03dd408b9294","Type":"ContainerStarted","Data":"bcaa304f2d1317dd86819c17affc9d534f0bb61eb52feacae8286c9319a14567"} Apr 16 19:54:16.487406 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.487312 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" event={"ID":"05bff05d-2c89-41be-b7ea-03dd408b9294","Type":"ContainerStarted","Data":"5ecaeb2aed3e4e4e4fb41485fd62c3cf3851398fa068a627ffef47576d2377c0"} Apr 16 19:54:16.487406 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.487324 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" event={"ID":"05bff05d-2c89-41be-b7ea-03dd408b9294","Type":"ContainerDied","Data":"df975e3990c24988dfd410b46a72d335f0378379a19735dc77851110b8cf4d82"} Apr 16 19:54:16.487406 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.487340 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" event={"ID":"05bff05d-2c89-41be-b7ea-03dd408b9294","Type":"ContainerStarted","Data":"d100c5a8d95da2e4ae2e75a8c79a0c27f6d59a2a6b26864b46a2eb61c419bd8d"} Apr 16 19:54:16.488678 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.488654 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w" event={"ID":"db66438e-828b-4415-9a27-8eb8615c2db5","Type":"ContainerStarted","Data":"10b998fac9d2d47f8f847f19edc036925305ec3658e133710e164ae6b2bc4dc7"} Apr 16 19:54:16.489956 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.489927 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b88gw" 
event={"ID":"fe1490d8-1e7e-4156-9765-d2fec8a38446","Type":"ContainerStarted","Data":"9d8d2196126e15646a9f6707b0e8db683f8a7487faed8af6a1d8d98a37b761a6"} Apr 16 19:54:16.491416 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.491391 2569 generic.go:358] "Generic (PLEG): container finished" podID="0053bd34-312c-4064-8485-b10a2b3b16d7" containerID="e62ad1daaa92baa556589e9a6e81e6afb67db38bb6eaad9835ca19613bf99157" exitCode=0 Apr 16 19:54:16.491505 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.491457 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftsjl" event={"ID":"0053bd34-312c-4064-8485-b10a2b3b16d7","Type":"ContainerDied","Data":"e62ad1daaa92baa556589e9a6e81e6afb67db38bb6eaad9835ca19613bf99157"} Apr 16 19:54:16.492758 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.492737 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lb4zl" event={"ID":"267f5c95-39db-40d0-a78a-839da0347dfc","Type":"ContainerStarted","Data":"8c52d0c0a40c6976d385365d16f726c04d18942bbd298d07f77694c6d667ad65"} Apr 16 19:54:16.499578 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.499545 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" podStartSLOduration=19.499518198 podStartE2EDuration="19.499518198s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:01.466153438 +0000 UTC m=+5.666863519" watchObservedRunningTime="2026-04-16 19:54:16.499518198 +0000 UTC m=+20.700228284" Apr 16 19:54:16.499823 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.499790 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-l22xw" podStartSLOduration=3.61437662 podStartE2EDuration="20.499780033s" 
podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.950502679 +0000 UTC m=+3.151212741" lastFinishedPulling="2026-04-16 19:54:15.835906083 +0000 UTC m=+20.036616154" observedRunningTime="2026-04-16 19:54:16.499775555 +0000 UTC m=+20.700485634" watchObservedRunningTime="2026-04-16 19:54:16.499780033 +0000 UTC m=+20.700490112" Apr 16 19:54:16.517127 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.517089 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jhjgp" podStartSLOduration=3.8390704490000003 podStartE2EDuration="20.51707611s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.945646299 +0000 UTC m=+3.146356548" lastFinishedPulling="2026-04-16 19:54:15.623652146 +0000 UTC m=+19.824362209" observedRunningTime="2026-04-16 19:54:16.516692549 +0000 UTC m=+20.717402627" watchObservedRunningTime="2026-04-16 19:54:16.51707611 +0000 UTC m=+20.717786190" Apr 16 19:54:16.547171 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.547136 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-b88gw" podStartSLOduration=4.095134193 podStartE2EDuration="20.547120952s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.941921555 +0000 UTC m=+3.142631625" lastFinishedPulling="2026-04-16 19:54:15.393908314 +0000 UTC m=+19.594618384" observedRunningTime="2026-04-16 19:54:16.546584156 +0000 UTC m=+20.747294235" watchObservedRunningTime="2026-04-16 19:54:16.547120952 +0000 UTC m=+20.747831031" Apr 16 19:54:16.560807 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.559925 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-lb4zl" podStartSLOduration=8.429568424 podStartE2EDuration="20.559910101s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" 
firstStartedPulling="2026-04-16 19:53:58.951682569 +0000 UTC m=+3.152392625" lastFinishedPulling="2026-04-16 19:54:11.08202423 +0000 UTC m=+15.282734302" observedRunningTime="2026-04-16 19:54:16.559332473 +0000 UTC m=+20.760042554" watchObservedRunningTime="2026-04-16 19:54:16.559910101 +0000 UTC m=+20.760620181" Apr 16 19:54:16.699719 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:16.699699 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:54:17.314995 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:17.314835 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:54:16.699714984Z","UUID":"7889d64b-60c0-4d44-98b5-ad7819c3fbe4","Handler":null,"Name":"","Endpoint":""} Apr 16 19:54:17.316961 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:17.316936 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:54:17.317117 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:17.316967 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:54:17.496441 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:17.496400 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-flbrp" event={"ID":"b29cde9b-13b6-47a2-bdbe-68511210fa54","Type":"ContainerStarted","Data":"9ec767d6cac3f5416ba37a6ff6e9781920bb4ff497668ada6f37ed4982af9648"} Apr 16 19:54:17.498227 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:17.498193 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w" 
event={"ID":"db66438e-828b-4415-9a27-8eb8615c2db5","Type":"ContainerStarted","Data":"f062cbb766a246abf2ad4a6442a6b68eaf6a59567b43ac07c9355cf747d83263"} Apr 16 19:54:17.510144 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:17.510104 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-flbrp" podStartSLOduration=4.830008502 podStartE2EDuration="21.510090115s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.943478682 +0000 UTC m=+3.144188752" lastFinishedPulling="2026-04-16 19:54:15.623560298 +0000 UTC m=+19.824270365" observedRunningTime="2026-04-16 19:54:17.509731046 +0000 UTC m=+21.710441128" watchObservedRunningTime="2026-04-16 19:54:17.510090115 +0000 UTC m=+21.710800193" Apr 16 19:54:17.995598 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:17.995558 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-lb4zl" Apr 16 19:54:17.996301 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:17.996286 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-lb4zl" Apr 16 19:54:18.367683 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:18.367604 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:18.367874 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:18.367743 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:18.367874 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:18.367607 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:18.367874 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:18.367846 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:18.503182 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:18.503150 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 19:54:18.503643 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:18.503490 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" event={"ID":"05bff05d-2c89-41be-b7ea-03dd408b9294","Type":"ContainerStarted","Data":"07e049ac78ea844f3f78ce57687850c70cc9876ee39853f6e25d1330e13fa919"} Apr 16 19:54:18.505401 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:18.505336 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w" event={"ID":"db66438e-828b-4415-9a27-8eb8615c2db5","Type":"ContainerStarted","Data":"f2ac0a34a793f325f6a162585fcd6867c3a44a050703780fd696ab939863b3e0"} Apr 16 19:54:18.521819 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:18.521770 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45p7w" 
podStartSLOduration=3.874101484 podStartE2EDuration="22.52175558s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.948418314 +0000 UTC m=+3.149128374" lastFinishedPulling="2026-04-16 19:54:17.596072411 +0000 UTC m=+21.796782470" observedRunningTime="2026-04-16 19:54:18.521344162 +0000 UTC m=+22.722054239" watchObservedRunningTime="2026-04-16 19:54:18.52175558 +0000 UTC m=+22.722465660" Apr 16 19:54:19.507630 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:19.507601 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 19:54:20.367117 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.366934 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:20.367303 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.366934 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:20.367303 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:20.367254 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:20.367420 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:20.367339 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:20.484881 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.484866 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-cz4rh"] Apr 16 19:54:20.508713 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.508692 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cz4rh" Apr 16 19:54:20.510894 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.510875 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-j2vdp\"" Apr 16 19:54:20.510960 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.510939 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 19:54:20.514132 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.513994 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 19:54:20.514561 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.514431 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" event={"ID":"05bff05d-2c89-41be-b7ea-03dd408b9294","Type":"ContainerStarted","Data":"22ab83546f634975373b7dc57bd85ce8b47c1994f55e6eff225beb36a9ce8cbd"} Apr 16 19:54:20.517511 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.517318 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 19:54:20.641685 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.641649 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srgwm\" (UniqueName: 
\"kubernetes.io/projected/3ac8b4fc-9197-47f1-9c7f-794db0590349-kube-api-access-srgwm\") pod \"node-resolver-cz4rh\" (UID: \"3ac8b4fc-9197-47f1-9c7f-794db0590349\") " pod="openshift-dns/node-resolver-cz4rh" Apr 16 19:54:20.641820 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.641700 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3ac8b4fc-9197-47f1-9c7f-794db0590349-hosts-file\") pod \"node-resolver-cz4rh\" (UID: \"3ac8b4fc-9197-47f1-9c7f-794db0590349\") " pod="openshift-dns/node-resolver-cz4rh" Apr 16 19:54:20.641820 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.641800 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ac8b4fc-9197-47f1-9c7f-794db0590349-tmp-dir\") pod \"node-resolver-cz4rh\" (UID: \"3ac8b4fc-9197-47f1-9c7f-794db0590349\") " pod="openshift-dns/node-resolver-cz4rh" Apr 16 19:54:20.742906 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.742882 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ac8b4fc-9197-47f1-9c7f-794db0590349-tmp-dir\") pod \"node-resolver-cz4rh\" (UID: \"3ac8b4fc-9197-47f1-9c7f-794db0590349\") " pod="openshift-dns/node-resolver-cz4rh" Apr 16 19:54:20.743062 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.742920 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srgwm\" (UniqueName: \"kubernetes.io/projected/3ac8b4fc-9197-47f1-9c7f-794db0590349-kube-api-access-srgwm\") pod \"node-resolver-cz4rh\" (UID: \"3ac8b4fc-9197-47f1-9c7f-794db0590349\") " pod="openshift-dns/node-resolver-cz4rh" Apr 16 19:54:20.743062 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.742944 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/3ac8b4fc-9197-47f1-9c7f-794db0590349-hosts-file\") pod \"node-resolver-cz4rh\" (UID: \"3ac8b4fc-9197-47f1-9c7f-794db0590349\") " pod="openshift-dns/node-resolver-cz4rh" Apr 16 19:54:20.743134 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.743061 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3ac8b4fc-9197-47f1-9c7f-794db0590349-hosts-file\") pod \"node-resolver-cz4rh\" (UID: \"3ac8b4fc-9197-47f1-9c7f-794db0590349\") " pod="openshift-dns/node-resolver-cz4rh" Apr 16 19:54:20.743245 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.743229 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ac8b4fc-9197-47f1-9c7f-794db0590349-tmp-dir\") pod \"node-resolver-cz4rh\" (UID: \"3ac8b4fc-9197-47f1-9c7f-794db0590349\") " pod="openshift-dns/node-resolver-cz4rh" Apr 16 19:54:20.751620 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.751598 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srgwm\" (UniqueName: \"kubernetes.io/projected/3ac8b4fc-9197-47f1-9c7f-794db0590349-kube-api-access-srgwm\") pod \"node-resolver-cz4rh\" (UID: \"3ac8b4fc-9197-47f1-9c7f-794db0590349\") " pod="openshift-dns/node-resolver-cz4rh" Apr 16 19:54:20.819599 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:20.819427 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-cz4rh" Apr 16 19:54:21.517800 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:21.517768 2569 generic.go:358] "Generic (PLEG): container finished" podID="0053bd34-312c-4064-8485-b10a2b3b16d7" containerID="3d58dc399d71dbb37559d0523c10b2a9acd28624b3582407edba83bcd1a022bd" exitCode=0 Apr 16 19:54:21.518417 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:21.517834 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftsjl" event={"ID":"0053bd34-312c-4064-8485-b10a2b3b16d7","Type":"ContainerDied","Data":"3d58dc399d71dbb37559d0523c10b2a9acd28624b3582407edba83bcd1a022bd"} Apr 16 19:54:21.519111 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:21.519088 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cz4rh" event={"ID":"3ac8b4fc-9197-47f1-9c7f-794db0590349","Type":"ContainerStarted","Data":"f49f7aad4f06a94718399d7486eca009f6d604f0c00fea63556acf5339915ea0"} Apr 16 19:54:21.519204 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:21.519132 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cz4rh" event={"ID":"3ac8b4fc-9197-47f1-9c7f-794db0590349","Type":"ContainerStarted","Data":"4e952e579cc7c3081e0d3028aa3a524aa4f2eeafa195b017b98d81b50c94aeb6"} Apr 16 19:54:21.519514 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:21.519493 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:54:21.519631 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:21.519617 2569 scope.go:117] "RemoveContainer" containerID="df975e3990c24988dfd410b46a72d335f0378379a19735dc77851110b8cf4d82" Apr 16 19:54:21.534615 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:21.534593 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:54:21.566883 ip-10-0-138-142 
kubenswrapper[2569]: I0416 19:54:21.566845 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cz4rh" podStartSLOduration=1.56683156 podStartE2EDuration="1.56683156s" podCreationTimestamp="2026-04-16 19:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:21.566618886 +0000 UTC m=+25.767328961" watchObservedRunningTime="2026-04-16 19:54:21.56683156 +0000 UTC m=+25.767541637" Apr 16 19:54:22.369840 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.369816 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:22.369994 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.369816 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:22.369994 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:22.369905 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:22.370088 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:22.369999 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:22.523486 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.523466 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 19:54:22.523795 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.523771 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" event={"ID":"05bff05d-2c89-41be-b7ea-03dd408b9294","Type":"ContainerStarted","Data":"8dc32580cb751022c86371e3985f866179d6d75ff570bc97765bfe76ee265a51"} Apr 16 19:54:22.523886 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.523873 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 19:54:22.524164 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.524115 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:54:22.525742 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.525718 2569 generic.go:358] "Generic (PLEG): container finished" podID="0053bd34-312c-4064-8485-b10a2b3b16d7" containerID="5fff5ff07920da38cd98ec1af5f042708093ca2ac9b68b227d276d7b0b4300b1" exitCode=0 Apr 16 19:54:22.525827 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.525757 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftsjl" event={"ID":"0053bd34-312c-4064-8485-b10a2b3b16d7","Type":"ContainerDied","Data":"5fff5ff07920da38cd98ec1af5f042708093ca2ac9b68b227d276d7b0b4300b1"} Apr 16 19:54:22.540173 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.540053 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:54:22.558846 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.558811 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" podStartSLOduration=9.822001133 podStartE2EDuration="26.55880074s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.949886351 +0000 UTC m=+3.150596422" lastFinishedPulling="2026-04-16 19:54:15.686685968 +0000 UTC m=+19.887396029" observedRunningTime="2026-04-16 19:54:22.55762197 +0000 UTC m=+26.758332049" watchObservedRunningTime="2026-04-16 19:54:22.55880074 +0000 UTC m=+26.759510819" Apr 16 19:54:22.636559 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.636539 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-lb4zl" Apr 16 19:54:22.636647 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.636632 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 19:54:22.637082 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.637064 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-lb4zl" Apr 16 19:54:22.676795 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.676770 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dk2hk"] Apr 16 19:54:22.676909 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.676864 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:22.676954 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:22.676930 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:22.684534 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.684515 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bkkfh"] Apr 16 19:54:22.684616 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:22.684600 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:22.684690 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:22.684676 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:23.530170 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:23.530134 2569 generic.go:358] "Generic (PLEG): container finished" podID="0053bd34-312c-4064-8485-b10a2b3b16d7" containerID="257aea165908928b441d1cf3828bf936fcc0dfa4d1239c8d65103343fa693869" exitCode=0 Apr 16 19:54:23.530651 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:23.530182 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftsjl" event={"ID":"0053bd34-312c-4064-8485-b10a2b3b16d7","Type":"ContainerDied","Data":"257aea165908928b441d1cf3828bf936fcc0dfa4d1239c8d65103343fa693869"} Apr 16 19:54:23.530651 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:23.530434 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 19:54:24.367328 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:24.367297 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:24.367328 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:24.367309 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:24.367546 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:24.367417 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:24.367546 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:24.367527 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:24.532154 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:24.532122 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 19:54:25.151545 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:25.151512 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" Apr 16 19:54:25.162229 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:25.162174 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" podUID="05bff05d-2c89-41be-b7ea-03dd408b9294" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 19:54:25.172917 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:25.172881 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" podUID="05bff05d-2c89-41be-b7ea-03dd408b9294" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 19:54:25.545319 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:25.545280 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg" podUID="05bff05d-2c89-41be-b7ea-03dd408b9294" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 19:54:26.368874 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:26.368843 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:26.369057 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:26.368951 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:26.369343 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:26.369197 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:26.369343 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:26.369309 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:28.367364 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.367323 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:28.367921 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:28.367460 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dk2hk" podUID="33559bf7-25ac-4de7-a712-253f87279cbf" Apr 16 19:54:28.367921 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.367518 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:28.367921 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:28.367757 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkkfh" podUID="ae8d7aa1-7c00-44df-9570-4435defaddc2" Apr 16 19:54:28.605695 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.605621 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeReady" Apr 16 19:54:28.605838 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.605772 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 19:54:28.665988 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.665953 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vslwr"] Apr 16 19:54:28.702201 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.702157 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9662b"] Apr 16 19:54:28.702374 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.702338 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:28.704722 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.704677 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lx244\"" Apr 16 19:54:28.704849 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.704805 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 19:54:28.704986 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.704938 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 19:54:28.718177 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.718147 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vslwr"] Apr 16 19:54:28.718292 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.718191 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9662b"] Apr 16 19:54:28.718292 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.718214 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9662b" Apr 16 19:54:28.720402 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.720382 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 19:54:28.720715 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.720697 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 19:54:28.720795 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.720754 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6lbj8\"" Apr 16 19:54:28.720795 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.720793 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 19:54:28.805780 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.805731 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:28.805780 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.805786 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/731a4df4-ce0e-4549-8b0a-37f41083e8a3-config-volume\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:28.806027 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.805846 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc5xc\" (UniqueName: 
\"kubernetes.io/projected/731a4df4-ce0e-4549-8b0a-37f41083e8a3-kube-api-access-gc5xc\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:28.806027 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.805896 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert\") pod \"ingress-canary-9662b\" (UID: \"e4e2129f-d37a-4277-bfd3-5be4dbf86524\") " pod="openshift-ingress-canary/ingress-canary-9662b" Apr 16 19:54:28.806027 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.805973 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp6lw\" (UniqueName: \"kubernetes.io/projected/e4e2129f-d37a-4277-bfd3-5be4dbf86524-kube-api-access-cp6lw\") pod \"ingress-canary-9662b\" (UID: \"e4e2129f-d37a-4277-bfd3-5be4dbf86524\") " pod="openshift-ingress-canary/ingress-canary-9662b" Apr 16 19:54:28.806173 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.806029 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/731a4df4-ce0e-4549-8b0a-37f41083e8a3-tmp-dir\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:28.906554 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.906473 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cp6lw\" (UniqueName: \"kubernetes.io/projected/e4e2129f-d37a-4277-bfd3-5be4dbf86524-kube-api-access-cp6lw\") pod \"ingress-canary-9662b\" (UID: \"e4e2129f-d37a-4277-bfd3-5be4dbf86524\") " pod="openshift-ingress-canary/ingress-canary-9662b" Apr 16 19:54:28.906763 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.906632 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/731a4df4-ce0e-4549-8b0a-37f41083e8a3-tmp-dir\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:28.906763 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.906697 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:28.906763 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.906732 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/731a4df4-ce0e-4549-8b0a-37f41083e8a3-config-volume\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:28.906763 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.906752 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gc5xc\" (UniqueName: \"kubernetes.io/projected/731a4df4-ce0e-4549-8b0a-37f41083e8a3-kube-api-access-gc5xc\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:28.906968 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.906776 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert\") pod \"ingress-canary-9662b\" (UID: \"e4e2129f-d37a-4277-bfd3-5be4dbf86524\") " pod="openshift-ingress-canary/ingress-canary-9662b" Apr 16 19:54:28.906968 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:28.906919 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 16 19:54:28.906968 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:28.906923 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:28.907183 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:28.906969 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert podName:e4e2129f-d37a-4277-bfd3-5be4dbf86524 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.406952065 +0000 UTC m=+33.607662125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert") pod "ingress-canary-9662b" (UID: "e4e2129f-d37a-4277-bfd3-5be4dbf86524") : secret "canary-serving-cert" not found Apr 16 19:54:28.907183 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:28.907000 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls podName:731a4df4-ce0e-4549-8b0a-37f41083e8a3 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.406981388 +0000 UTC m=+33.607691453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls") pod "dns-default-vslwr" (UID: "731a4df4-ce0e-4549-8b0a-37f41083e8a3") : secret "dns-default-metrics-tls" not found Apr 16 19:54:28.907183 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.907039 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/731a4df4-ce0e-4549-8b0a-37f41083e8a3-tmp-dir\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:28.907435 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.907412 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/731a4df4-ce0e-4549-8b0a-37f41083e8a3-config-volume\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:28.918304 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.918174 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc5xc\" (UniqueName: \"kubernetes.io/projected/731a4df4-ce0e-4549-8b0a-37f41083e8a3-kube-api-access-gc5xc\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:28.918413 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:28.918176 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp6lw\" (UniqueName: \"kubernetes.io/projected/e4e2129f-d37a-4277-bfd3-5be4dbf86524-kube-api-access-cp6lw\") pod \"ingress-canary-9662b\" (UID: \"e4e2129f-d37a-4277-bfd3-5be4dbf86524\") " pod="openshift-ingress-canary/ingress-canary-9662b" Apr 16 19:54:29.410087 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:29.410052 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:29.410087 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:29.410093 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert\") pod \"ingress-canary-9662b\" (UID: \"e4e2129f-d37a-4277-bfd3-5be4dbf86524\") " pod="openshift-ingress-canary/ingress-canary-9662b" Apr 16 19:54:29.410482 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:29.410190 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:29.410482 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:29.410196 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:29.410482 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:29.410253 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert podName:e4e2129f-d37a-4277-bfd3-5be4dbf86524 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.410238223 +0000 UTC m=+34.610948279 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert") pod "ingress-canary-9662b" (UID: "e4e2129f-d37a-4277-bfd3-5be4dbf86524") : secret "canary-serving-cert" not found Apr 16 19:54:29.410482 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:29.410265 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls podName:731a4df4-ce0e-4549-8b0a-37f41083e8a3 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:30.410259905 +0000 UTC m=+34.610969961 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls") pod "dns-default-vslwr" (UID: "731a4df4-ce0e-4549-8b0a-37f41083e8a3") : secret "dns-default-metrics-tls" not found Apr 16 19:54:30.013686 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:30.013597 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs\") pod \"network-metrics-daemon-bkkfh\" (UID: \"ae8d7aa1-7c00-44df-9570-4435defaddc2\") " pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:30.013838 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:30.013742 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:30.013838 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:30.013816 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs podName:ae8d7aa1-7c00-44df-9570-4435defaddc2 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:02.013799806 +0000 UTC m=+66.214509873 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs") pod "network-metrics-daemon-bkkfh" (UID: "ae8d7aa1-7c00-44df-9570-4435defaddc2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:30.215344 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:30.215311 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7czr\" (UniqueName: \"kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr\") pod \"network-check-target-dk2hk\" (UID: \"33559bf7-25ac-4de7-a712-253f87279cbf\") " pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:30.215502 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:30.215452 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:30.215502 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:30.215468 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:30.215502 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:30.215478 2569 projected.go:194] Error preparing data for projected volume kube-api-access-h7czr for pod openshift-network-diagnostics/network-check-target-dk2hk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:30.215600 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:30.215535 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr podName:33559bf7-25ac-4de7-a712-253f87279cbf nodeName:}" failed. 
No retries permitted until 2026-04-16 19:55:02.215517881 +0000 UTC m=+66.416227938 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-h7czr" (UniqueName: "kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr") pod "network-check-target-dk2hk" (UID: "33559bf7-25ac-4de7-a712-253f87279cbf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:30.367077 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:30.366972 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:54:30.367077 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:30.366973 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh" Apr 16 19:54:30.373043 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:30.372766 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:54:30.373043 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:30.372802 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-25nwv\"" Apr 16 19:54:30.373043 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:30.372840 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-l2m6x\"" Apr 16 19:54:30.373043 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:30.372770 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:54:30.373043 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:30.372773 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:54:30.415968 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:30.415937 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:30.416320 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:30.415977 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert\") pod \"ingress-canary-9662b\" (UID: \"e4e2129f-d37a-4277-bfd3-5be4dbf86524\") " pod="openshift-ingress-canary/ingress-canary-9662b" Apr 16 19:54:30.416320 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:30.416086 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:30.416320 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:30.416106 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:30.416320 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:30.416136 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert podName:e4e2129f-d37a-4277-bfd3-5be4dbf86524 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:32.416123859 +0000 UTC m=+36.616833915 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert") pod "ingress-canary-9662b" (UID: "e4e2129f-d37a-4277-bfd3-5be4dbf86524") : secret "canary-serving-cert" not found Apr 16 19:54:30.416320 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:30.416158 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls podName:731a4df4-ce0e-4549-8b0a-37f41083e8a3 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:32.416145321 +0000 UTC m=+36.616855377 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls") pod "dns-default-vslwr" (UID: "731a4df4-ce0e-4549-8b0a-37f41083e8a3") : secret "dns-default-metrics-tls" not found Apr 16 19:54:30.544811 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:30.544777 2569 generic.go:358] "Generic (PLEG): container finished" podID="0053bd34-312c-4064-8485-b10a2b3b16d7" containerID="ff5bd7c8f31c0db42496c2d679fa5412c56acc93060753427beec9f0e0a6de23" exitCode=0 Apr 16 19:54:30.544968 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:30.544823 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftsjl" event={"ID":"0053bd34-312c-4064-8485-b10a2b3b16d7","Type":"ContainerDied","Data":"ff5bd7c8f31c0db42496c2d679fa5412c56acc93060753427beec9f0e0a6de23"} Apr 16 19:54:31.549452 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:31.549418 2569 generic.go:358] "Generic (PLEG): container finished" podID="0053bd34-312c-4064-8485-b10a2b3b16d7" containerID="c760c533375594aa629a41ffeacae1b112cd2535019c9d06d755baf0e823c6b1" exitCode=0 Apr 16 19:54:31.549809 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:31.549471 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftsjl" 
event={"ID":"0053bd34-312c-4064-8485-b10a2b3b16d7","Type":"ContainerDied","Data":"c760c533375594aa629a41ffeacae1b112cd2535019c9d06d755baf0e823c6b1"} Apr 16 19:54:32.431914 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:32.431882 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:32.432075 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:32.431924 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert\") pod \"ingress-canary-9662b\" (UID: \"e4e2129f-d37a-4277-bfd3-5be4dbf86524\") " pod="openshift-ingress-canary/ingress-canary-9662b" Apr 16 19:54:32.432075 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:32.432040 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:32.432075 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:32.432065 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:32.432206 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:32.432105 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls podName:731a4df4-ce0e-4549-8b0a-37f41083e8a3 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.432089576 +0000 UTC m=+40.632799636 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls") pod "dns-default-vslwr" (UID: "731a4df4-ce0e-4549-8b0a-37f41083e8a3") : secret "dns-default-metrics-tls" not found Apr 16 19:54:32.432206 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:32.432119 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert podName:e4e2129f-d37a-4277-bfd3-5be4dbf86524 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.432113274 +0000 UTC m=+40.632823331 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert") pod "ingress-canary-9662b" (UID: "e4e2129f-d37a-4277-bfd3-5be4dbf86524") : secret "canary-serving-cert" not found Apr 16 19:54:32.554537 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:32.554507 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftsjl" event={"ID":"0053bd34-312c-4064-8485-b10a2b3b16d7","Type":"ContainerStarted","Data":"a09b633ff7fc1a8bf42396071758cfc07603540c6b1a6ae64f4e923b505df1a1"} Apr 16 19:54:32.583990 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:32.583943 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ftsjl" podStartSLOduration=5.92340939 podStartE2EDuration="36.583927518s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.941722143 +0000 UTC m=+3.142432203" lastFinishedPulling="2026-04-16 19:54:29.602240258 +0000 UTC m=+33.802950331" observedRunningTime="2026-04-16 19:54:32.581726102 +0000 UTC m=+36.782436177" watchObservedRunningTime="2026-04-16 19:54:32.583927518 +0000 UTC m=+36.784637596" Apr 16 19:54:36.458755 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:36.458718 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:36.458755 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:36.458764 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert\") pod \"ingress-canary-9662b\" (UID: \"e4e2129f-d37a-4277-bfd3-5be4dbf86524\") " pod="openshift-ingress-canary/ingress-canary-9662b" Apr 16 19:54:36.459301 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:36.458890 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:36.459301 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:36.458935 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert podName:e4e2129f-d37a-4277-bfd3-5be4dbf86524 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:44.458922289 +0000 UTC m=+48.659632345 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert") pod "ingress-canary-9662b" (UID: "e4e2129f-d37a-4277-bfd3-5be4dbf86524") : secret "canary-serving-cert" not found Apr 16 19:54:36.459301 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:36.458888 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:36.459301 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:36.459058 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls podName:731a4df4-ce0e-4549-8b0a-37f41083e8a3 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:44.459036814 +0000 UTC m=+48.659746883 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls") pod "dns-default-vslwr" (UID: "731a4df4-ce0e-4549-8b0a-37f41083e8a3") : secret "dns-default-metrics-tls" not found Apr 16 19:54:44.512666 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:44.512626 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr" Apr 16 19:54:44.512666 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:44.512668 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert\") pod \"ingress-canary-9662b\" (UID: \"e4e2129f-d37a-4277-bfd3-5be4dbf86524\") " pod="openshift-ingress-canary/ingress-canary-9662b" Apr 16 19:54:44.513175 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:44.512767 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:44.513175 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:44.512769 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:44.513175 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:44.512818 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert podName:e4e2129f-d37a-4277-bfd3-5be4dbf86524 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:00.512802785 +0000 UTC m=+64.713512842 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert") pod "ingress-canary-9662b" (UID: "e4e2129f-d37a-4277-bfd3-5be4dbf86524") : secret "canary-serving-cert" not found Apr 16 19:54:44.513175 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:54:44.512845 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls podName:731a4df4-ce0e-4549-8b0a-37f41083e8a3 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:00.512830737 +0000 UTC m=+64.713540792 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls") pod "dns-default-vslwr" (UID: "731a4df4-ce0e-4549-8b0a-37f41083e8a3") : secret "dns-default-metrics-tls" not found Apr 16 19:54:46.077127 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:46.077091 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-8gcb2"] Apr 16 19:54:46.095760 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:46.095729 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8gcb2" Apr 16 19:54:46.095895 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:46.095771 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-8gcb2"] Apr 16 19:54:46.098861 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:46.098828 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:46.099150 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:46.099131 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 19:54:46.099835 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:46.099814 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-7qclc\"" Apr 16 19:54:46.225519 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:46.225485 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5t6z\" (UniqueName: \"kubernetes.io/projected/37649370-90f5-4614-8ee1-1d712cdc28ee-kube-api-access-x5t6z\") pod \"migrator-74bb7799d9-8gcb2\" (UID: \"37649370-90f5-4614-8ee1-1d712cdc28ee\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8gcb2" Apr 16 19:54:46.326603 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:46.326578 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5t6z\" (UniqueName: \"kubernetes.io/projected/37649370-90f5-4614-8ee1-1d712cdc28ee-kube-api-access-x5t6z\") pod \"migrator-74bb7799d9-8gcb2\" (UID: \"37649370-90f5-4614-8ee1-1d712cdc28ee\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8gcb2" Apr 16 19:54:46.335595 ip-10-0-138-142 
kubenswrapper[2569]: I0416 19:54:46.335543 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5t6z\" (UniqueName: \"kubernetes.io/projected/37649370-90f5-4614-8ee1-1d712cdc28ee-kube-api-access-x5t6z\") pod \"migrator-74bb7799d9-8gcb2\" (UID: \"37649370-90f5-4614-8ee1-1d712cdc28ee\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8gcb2" Apr 16 19:54:46.404963 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:46.404935 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8gcb2" Apr 16 19:54:46.539533 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:46.539506 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-8gcb2"] Apr 16 19:54:46.542579 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:54:46.542553 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37649370_90f5_4614_8ee1_1d712cdc28ee.slice/crio-a81107a42f3e0648a72784fcd5d0ea3b19a1f654041b168ed48f34d4af6142d7 WatchSource:0}: Error finding container a81107a42f3e0648a72784fcd5d0ea3b19a1f654041b168ed48f34d4af6142d7: Status 404 returned error can't find the container with id a81107a42f3e0648a72784fcd5d0ea3b19a1f654041b168ed48f34d4af6142d7 Apr 16 19:54:46.581850 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:46.581813 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8gcb2" event={"ID":"37649370-90f5-4614-8ee1-1d712cdc28ee","Type":"ContainerStarted","Data":"a81107a42f3e0648a72784fcd5d0ea3b19a1f654041b168ed48f34d4af6142d7"} Apr 16 19:54:47.912996 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:47.912968 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cz4rh_3ac8b4fc-9197-47f1-9c7f-794db0590349/dns-node-resolver/0.log" 
Apr 16 19:54:48.469310 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.469278 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ngtdk"] Apr 16 19:54:48.472042 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.472028 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-ngtdk" Apr 16 19:54:48.474480 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.474430 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 19:54:48.474583 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.474548 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 19:54:48.474731 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.474716 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 19:54:48.475451 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.475434 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-tdznq\"" Apr 16 19:54:48.475451 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.475447 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 19:54:48.483068 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.483050 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ngtdk"] Apr 16 19:54:48.587354 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.587317 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8gcb2" 
event={"ID":"37649370-90f5-4614-8ee1-1d712cdc28ee","Type":"ContainerStarted","Data":"1613a025457fcde21bba38902b20dce027497e14adde6e99b2fbba666526cefd"} Apr 16 19:54:48.587354 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.587354 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8gcb2" event={"ID":"37649370-90f5-4614-8ee1-1d712cdc28ee","Type":"ContainerStarted","Data":"32d654deeaecb9a1dc6dd943c201ef6164f652e1f9debe1df33621c8e2916159"} Apr 16 19:54:48.603236 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.603192 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8gcb2" podStartSLOduration=1.54598965 podStartE2EDuration="2.603179413s" podCreationTimestamp="2026-04-16 19:54:46 +0000 UTC" firstStartedPulling="2026-04-16 19:54:46.544335016 +0000 UTC m=+50.745045072" lastFinishedPulling="2026-04-16 19:54:47.601524779 +0000 UTC m=+51.802234835" observedRunningTime="2026-04-16 19:54:48.602206369 +0000 UTC m=+52.802916449" watchObservedRunningTime="2026-04-16 19:54:48.603179413 +0000 UTC m=+52.803889490" Apr 16 19:54:48.640847 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.640819 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/47ca9119-4209-4cc7-9375-294f63eb941d-signing-cabundle\") pod \"service-ca-865cb79987-ngtdk\" (UID: \"47ca9119-4209-4cc7-9375-294f63eb941d\") " pod="openshift-service-ca/service-ca-865cb79987-ngtdk" Apr 16 19:54:48.640929 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.640850 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scc49\" (UniqueName: \"kubernetes.io/projected/47ca9119-4209-4cc7-9375-294f63eb941d-kube-api-access-scc49\") pod \"service-ca-865cb79987-ngtdk\" (UID: 
\"47ca9119-4209-4cc7-9375-294f63eb941d\") " pod="openshift-service-ca/service-ca-865cb79987-ngtdk" Apr 16 19:54:48.640929 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.640883 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/47ca9119-4209-4cc7-9375-294f63eb941d-signing-key\") pod \"service-ca-865cb79987-ngtdk\" (UID: \"47ca9119-4209-4cc7-9375-294f63eb941d\") " pod="openshift-service-ca/service-ca-865cb79987-ngtdk" Apr 16 19:54:48.741844 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.741766 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/47ca9119-4209-4cc7-9375-294f63eb941d-signing-key\") pod \"service-ca-865cb79987-ngtdk\" (UID: \"47ca9119-4209-4cc7-9375-294f63eb941d\") " pod="openshift-service-ca/service-ca-865cb79987-ngtdk" Apr 16 19:54:48.741968 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.741910 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/47ca9119-4209-4cc7-9375-294f63eb941d-signing-cabundle\") pod \"service-ca-865cb79987-ngtdk\" (UID: \"47ca9119-4209-4cc7-9375-294f63eb941d\") " pod="openshift-service-ca/service-ca-865cb79987-ngtdk" Apr 16 19:54:48.741968 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.741938 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scc49\" (UniqueName: \"kubernetes.io/projected/47ca9119-4209-4cc7-9375-294f63eb941d-kube-api-access-scc49\") pod \"service-ca-865cb79987-ngtdk\" (UID: \"47ca9119-4209-4cc7-9375-294f63eb941d\") " pod="openshift-service-ca/service-ca-865cb79987-ngtdk" Apr 16 19:54:48.742641 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.742618 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/47ca9119-4209-4cc7-9375-294f63eb941d-signing-cabundle\") pod \"service-ca-865cb79987-ngtdk\" (UID: \"47ca9119-4209-4cc7-9375-294f63eb941d\") " pod="openshift-service-ca/service-ca-865cb79987-ngtdk"
Apr 16 19:54:48.744335 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.744310 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/47ca9119-4209-4cc7-9375-294f63eb941d-signing-key\") pod \"service-ca-865cb79987-ngtdk\" (UID: \"47ca9119-4209-4cc7-9375-294f63eb941d\") " pod="openshift-service-ca/service-ca-865cb79987-ngtdk"
Apr 16 19:54:48.749584 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.749556 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scc49\" (UniqueName: \"kubernetes.io/projected/47ca9119-4209-4cc7-9375-294f63eb941d-kube-api-access-scc49\") pod \"service-ca-865cb79987-ngtdk\" (UID: \"47ca9119-4209-4cc7-9375-294f63eb941d\") " pod="openshift-service-ca/service-ca-865cb79987-ngtdk"
Apr 16 19:54:48.780384 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.780357 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-ngtdk"
Apr 16 19:54:48.891743 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.891702 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ngtdk"]
Apr 16 19:54:48.894574 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:54:48.894540 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47ca9119_4209_4cc7_9375_294f63eb941d.slice/crio-96a65e5dd552a71f2a0c8493fdc6ec52c7a151c30d2de43f0b01f3c6cae73164 WatchSource:0}: Error finding container 96a65e5dd552a71f2a0c8493fdc6ec52c7a151c30d2de43f0b01f3c6cae73164: Status 404 returned error can't find the container with id 96a65e5dd552a71f2a0c8493fdc6ec52c7a151c30d2de43f0b01f3c6cae73164
Apr 16 19:54:48.913804 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:48.913784 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-b88gw_fe1490d8-1e7e-4156-9765-d2fec8a38446/node-ca/0.log"
Apr 16 19:54:49.590761 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:49.590722 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-ngtdk" event={"ID":"47ca9119-4209-4cc7-9375-294f63eb941d","Type":"ContainerStarted","Data":"96a65e5dd552a71f2a0c8493fdc6ec52c7a151c30d2de43f0b01f3c6cae73164"}
Apr 16 19:54:50.595158 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:50.595128 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-ngtdk" event={"ID":"47ca9119-4209-4cc7-9375-294f63eb941d","Type":"ContainerStarted","Data":"e00c03db802c909b0f7d7d66b1385929b4205fdb5173b73db3141e873f23607f"}
Apr 16 19:54:50.629726 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:50.629678 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-ngtdk" podStartSLOduration=1.023767189 podStartE2EDuration="2.629662689s" podCreationTimestamp="2026-04-16 19:54:48 +0000 UTC" firstStartedPulling="2026-04-16 19:54:48.896333413 +0000 UTC m=+53.097043468" lastFinishedPulling="2026-04-16 19:54:50.502228907 +0000 UTC m=+54.702938968" observedRunningTime="2026-04-16 19:54:50.629081185 +0000 UTC m=+54.829791264" watchObservedRunningTime="2026-04-16 19:54:50.629662689 +0000 UTC m=+54.830372790"
Apr 16 19:54:55.543942 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:54:55.543914 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qt9lg"
Apr 16 19:55:00.522962 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:00.522917 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert\") pod \"ingress-canary-9662b\" (UID: \"e4e2129f-d37a-4277-bfd3-5be4dbf86524\") " pod="openshift-ingress-canary/ingress-canary-9662b"
Apr 16 19:55:00.523446 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:00.522992 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr"
Apr 16 19:55:00.525264 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:00.525234 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/731a4df4-ce0e-4549-8b0a-37f41083e8a3-metrics-tls\") pod \"dns-default-vslwr\" (UID: \"731a4df4-ce0e-4549-8b0a-37f41083e8a3\") " pod="openshift-dns/dns-default-vslwr"
Apr 16 19:55:00.525394 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:00.525371 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4e2129f-d37a-4277-bfd3-5be4dbf86524-cert\") pod \"ingress-canary-9662b\" (UID: \"e4e2129f-d37a-4277-bfd3-5be4dbf86524\") " pod="openshift-ingress-canary/ingress-canary-9662b"
Apr 16 19:55:00.531430 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:00.531413 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6lbj8\""
Apr 16 19:55:00.539300 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:00.539286 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9662b"
Apr 16 19:55:00.650675 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:00.650647 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9662b"]
Apr 16 19:55:00.653878 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:55:00.653850 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4e2129f_d37a_4277_bfd3_5be4dbf86524.slice/crio-9a49682e5cb85f0232fe9487fa3ee4155bda86f0a44d7433b8a2701d134cd064 WatchSource:0}: Error finding container 9a49682e5cb85f0232fe9487fa3ee4155bda86f0a44d7433b8a2701d134cd064: Status 404 returned error can't find the container with id 9a49682e5cb85f0232fe9487fa3ee4155bda86f0a44d7433b8a2701d134cd064
Apr 16 19:55:00.816279 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:00.816187 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lx244\""
Apr 16 19:55:00.824438 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:00.824407 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vslwr"
Apr 16 19:55:00.954252 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:00.954218 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vslwr"]
Apr 16 19:55:00.957038 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:55:00.956998 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod731a4df4_ce0e_4549_8b0a_37f41083e8a3.slice/crio-f2945257177e8c4dab3f778c3587f687c5348af8a714ad91b8a22858b382d410 WatchSource:0}: Error finding container f2945257177e8c4dab3f778c3587f687c5348af8a714ad91b8a22858b382d410: Status 404 returned error can't find the container with id f2945257177e8c4dab3f778c3587f687c5348af8a714ad91b8a22858b382d410
Apr 16 19:55:01.619116 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:01.619052 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9662b" event={"ID":"e4e2129f-d37a-4277-bfd3-5be4dbf86524","Type":"ContainerStarted","Data":"9a49682e5cb85f0232fe9487fa3ee4155bda86f0a44d7433b8a2701d134cd064"}
Apr 16 19:55:01.620468 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:01.620405 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vslwr" event={"ID":"731a4df4-ce0e-4549-8b0a-37f41083e8a3","Type":"ContainerStarted","Data":"f2945257177e8c4dab3f778c3587f687c5348af8a714ad91b8a22858b382d410"}
Apr 16 19:55:02.035633 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:02.035596 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs\") pod \"network-metrics-daemon-bkkfh\" (UID: \"ae8d7aa1-7c00-44df-9570-4435defaddc2\") " pod="openshift-multus/network-metrics-daemon-bkkfh"
Apr 16 19:55:02.038457 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:02.038410 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 19:55:02.048662 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:02.048603 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8d7aa1-7c00-44df-9570-4435defaddc2-metrics-certs\") pod \"network-metrics-daemon-bkkfh\" (UID: \"ae8d7aa1-7c00-44df-9570-4435defaddc2\") " pod="openshift-multus/network-metrics-daemon-bkkfh"
Apr 16 19:55:02.185861 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:02.185826 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-l2m6x\""
Apr 16 19:55:02.194081 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:02.194054 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkkfh"
Apr 16 19:55:02.237996 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:02.237953 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7czr\" (UniqueName: \"kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr\") pod \"network-check-target-dk2hk\" (UID: \"33559bf7-25ac-4de7-a712-253f87279cbf\") " pod="openshift-network-diagnostics/network-check-target-dk2hk"
Apr 16 19:55:02.248500 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:02.248465 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 19:55:02.262256 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:02.262225 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 19:55:02.272292 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:02.272264 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7czr\" (UniqueName: \"kubernetes.io/projected/33559bf7-25ac-4de7-a712-253f87279cbf-kube-api-access-h7czr\") pod \"network-check-target-dk2hk\" (UID: \"33559bf7-25ac-4de7-a712-253f87279cbf\") " pod="openshift-network-diagnostics/network-check-target-dk2hk"
Apr 16 19:55:02.479025 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:02.478979 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-25nwv\""
Apr 16 19:55:02.487351 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:02.487325 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dk2hk"
Apr 16 19:55:03.055632 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:03.055380 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bkkfh"]
Apr 16 19:55:03.059937 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:55:03.059913 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae8d7aa1_7c00_44df_9570_4435defaddc2.slice/crio-ca755c834179b71a02a30ced270fccb895d3bd96fa739a37ae7db6080d72cdfd WatchSource:0}: Error finding container ca755c834179b71a02a30ced270fccb895d3bd96fa739a37ae7db6080d72cdfd: Status 404 returned error can't find the container with id ca755c834179b71a02a30ced270fccb895d3bd96fa739a37ae7db6080d72cdfd
Apr 16 19:55:03.076853 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:03.076830 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dk2hk"]
Apr 16 19:55:03.632118 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:03.632075 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vslwr" event={"ID":"731a4df4-ce0e-4549-8b0a-37f41083e8a3","Type":"ContainerStarted","Data":"cfa82a51fbf79e6d74e79a2458ffc6f82774a7c2967c9814aca96b7d75ed1a66"}
Apr 16 19:55:03.632285 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:03.632125 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vslwr" event={"ID":"731a4df4-ce0e-4549-8b0a-37f41083e8a3","Type":"ContainerStarted","Data":"065fe0c7ece368d5f9cae19293610b17901bd3b15364c1ec779d43b7e1808fce"}
Apr 16 19:55:03.632361 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:03.632307 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vslwr"
Apr 16 19:55:03.633953 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:03.633921 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dk2hk" event={"ID":"33559bf7-25ac-4de7-a712-253f87279cbf","Type":"ContainerStarted","Data":"b740176dfe9fc874b4a465b8cf761124bec444da8a0bf11db747d5905714d4de"}
Apr 16 19:55:03.635872 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:03.635841 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9662b" event={"ID":"e4e2129f-d37a-4277-bfd3-5be4dbf86524","Type":"ContainerStarted","Data":"64a65375a65074758e71d4e0e0399b97165f5eacd914c3ede72801fcfa71beba"}
Apr 16 19:55:03.637112 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:03.637077 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bkkfh" event={"ID":"ae8d7aa1-7c00-44df-9570-4435defaddc2","Type":"ContainerStarted","Data":"ca755c834179b71a02a30ced270fccb895d3bd96fa739a37ae7db6080d72cdfd"}
Apr 16 19:55:03.654032 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:03.653963 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vslwr" podStartSLOduration=33.703858555 podStartE2EDuration="35.653945995s" podCreationTimestamp="2026-04-16 19:54:28 +0000 UTC" firstStartedPulling="2026-04-16 19:55:00.958889288 +0000 UTC m=+65.159599343" lastFinishedPulling="2026-04-16 19:55:02.908976712 +0000 UTC m=+67.109686783" observedRunningTime="2026-04-16 19:55:03.651545273 +0000 UTC m=+67.852255351" watchObservedRunningTime="2026-04-16 19:55:03.653945995 +0000 UTC m=+67.854656074"
Apr 16 19:55:03.672665 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:03.672605 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9662b" podStartSLOduration=33.41507026 podStartE2EDuration="35.672586404s" podCreationTimestamp="2026-04-16 19:54:28 +0000 UTC" firstStartedPulling="2026-04-16 19:55:00.65587342 +0000 UTC m=+64.856583479" lastFinishedPulling="2026-04-16 19:55:02.913389558 +0000 UTC m=+67.114099623" observedRunningTime="2026-04-16 19:55:03.672162978 +0000 UTC m=+67.872873057" watchObservedRunningTime="2026-04-16 19:55:03.672586404 +0000 UTC m=+67.873296479"
Apr 16 19:55:04.643933 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:04.643834 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bkkfh" event={"ID":"ae8d7aa1-7c00-44df-9570-4435defaddc2","Type":"ContainerStarted","Data":"7e5906cb03894888e43647346729120ac3bc89f861f8bcbb917037e72247ad42"}
Apr 16 19:55:04.643933 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:04.643879 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bkkfh" event={"ID":"ae8d7aa1-7c00-44df-9570-4435defaddc2","Type":"ContainerStarted","Data":"ebe1981091b0a9ca18ee068d9dc32dfbef279a09cc50603eb267d5573ea250e4"}
Apr 16 19:55:04.662847 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:04.662795 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bkkfh" podStartSLOduration=67.559260904 podStartE2EDuration="1m8.662775732s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:55:03.062394228 +0000 UTC m=+67.263104285" lastFinishedPulling="2026-04-16 19:55:04.16590905 +0000 UTC m=+68.366619113" observedRunningTime="2026-04-16 19:55:04.661703475 +0000 UTC m=+68.862413553" watchObservedRunningTime="2026-04-16 19:55:04.662775732 +0000 UTC m=+68.863485825"
Apr 16 19:55:07.654205 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:07.654165 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dk2hk" event={"ID":"33559bf7-25ac-4de7-a712-253f87279cbf","Type":"ContainerStarted","Data":"819b0a742c8cca227a6c1011834ac165e57565c4f0a78f41826de05850884e19"}
Apr 16 19:55:07.654696 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:07.654317 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dk2hk"
Apr 16 19:55:07.696215 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:07.696157 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-dk2hk" podStartSLOduration=68.171959809 podStartE2EDuration="1m11.696141799s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:55:03.083728049 +0000 UTC m=+67.284438106" lastFinishedPulling="2026-04-16 19:55:06.607910027 +0000 UTC m=+70.808620096" observedRunningTime="2026-04-16 19:55:07.695606527 +0000 UTC m=+71.896316605" watchObservedRunningTime="2026-04-16 19:55:07.696141799 +0000 UTC m=+71.896851900"
Apr 16 19:55:08.481830 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.481799 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-ck74z"]
Apr 16 19:55:08.483737 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.483715 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.487965 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.487944 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 19:55:08.488093 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.487972 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 19:55:08.488898 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.488878 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 19:55:08.489004 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.488908 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vh9v5\""
Apr 16 19:55:08.489081 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.489006 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 19:55:08.503665 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.503642 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ck74z"]
Apr 16 19:55:08.565221 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.565187 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-59d85686d4-p9r27"]
Apr 16 19:55:08.567136 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.567119 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.570062 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.570041 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 19:55:08.570360 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.570345 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 19:55:08.570440 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.570425 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nfjh6\""
Apr 16 19:55:08.570714 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.570701 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 19:55:08.575860 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.575838 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe910aa8-0571-400f-abf4-cac8e9f4db52-registry-certificates\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.575951 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.575872 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fe910aa8-0571-400f-abf4-cac8e9f4db52-image-registry-private-configuration\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.575951 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.575900 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wntm5\" (UniqueName: \"kubernetes.io/projected/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-kube-api-access-wntm5\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.575951 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.575918 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe910aa8-0571-400f-abf4-cac8e9f4db52-installation-pull-secrets\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.576135 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.576036 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-crio-socket\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.576135 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.576077 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.576135 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.576110 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.576287 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.576137 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe910aa8-0571-400f-abf4-cac8e9f4db52-registry-tls\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.576287 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.576167 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe910aa8-0571-400f-abf4-cac8e9f4db52-ca-trust-extracted\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.576287 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.576201 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe910aa8-0571-400f-abf4-cac8e9f4db52-trusted-ca\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.576287 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.576225 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe910aa8-0571-400f-abf4-cac8e9f4db52-bound-sa-token\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.576287 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.576251 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-data-volume\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.576287 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.576278 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtrs8\" (UniqueName: \"kubernetes.io/projected/fe910aa8-0571-400f-abf4-cac8e9f4db52-kube-api-access-gtrs8\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.579358 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.579341 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 19:55:08.599493 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.599472 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-59d85686d4-p9r27"]
Apr 16 19:55:08.677326 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677290 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe910aa8-0571-400f-abf4-cac8e9f4db52-trusted-ca\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.677326 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677327 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe910aa8-0571-400f-abf4-cac8e9f4db52-bound-sa-token\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.677831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677355 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-data-volume\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.677831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677380 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtrs8\" (UniqueName: \"kubernetes.io/projected/fe910aa8-0571-400f-abf4-cac8e9f4db52-kube-api-access-gtrs8\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.677831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677407 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe910aa8-0571-400f-abf4-cac8e9f4db52-registry-certificates\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.677831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677455 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fe910aa8-0571-400f-abf4-cac8e9f4db52-image-registry-private-configuration\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.677831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677646 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wntm5\" (UniqueName: \"kubernetes.io/projected/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-kube-api-access-wntm5\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.677831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677691 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe910aa8-0571-400f-abf4-cac8e9f4db52-installation-pull-secrets\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.677831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677765 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-crio-socket\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.677831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677796 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.677831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677807 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-data-volume\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.677831 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677829 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.678488 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677854 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe910aa8-0571-400f-abf4-cac8e9f4db52-registry-tls\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.678488 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677879 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe910aa8-0571-400f-abf4-cac8e9f4db52-ca-trust-extracted\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.678488 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.677890 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-crio-socket\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.678488 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.678255 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe910aa8-0571-400f-abf4-cac8e9f4db52-ca-trust-extracted\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.678488 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.678331 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe910aa8-0571-400f-abf4-cac8e9f4db52-registry-certificates\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.678737 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.678580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe910aa8-0571-400f-abf4-cac8e9f4db52-trusted-ca\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.678737 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.678711 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.680127 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.680098 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fe910aa8-0571-400f-abf4-cac8e9f4db52-image-registry-private-configuration\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.680549 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.680524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe910aa8-0571-400f-abf4-cac8e9f4db52-registry-tls\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.680693 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.680605 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.680693 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.680644 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe910aa8-0571-400f-abf4-cac8e9f4db52-installation-pull-secrets\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.687961 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.687941 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wntm5\" (UniqueName: \"kubernetes.io/projected/36e8e17e-1c2e-409c-b778-fb3de7ae8bb2-kube-api-access-wntm5\") pod \"insights-runtime-extractor-ck74z\" (UID: \"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2\") " pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.689308 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.689287 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe910aa8-0571-400f-abf4-cac8e9f4db52-bound-sa-token\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.689411 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.689398 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtrs8\" (UniqueName: \"kubernetes.io/projected/fe910aa8-0571-400f-abf4-cac8e9f4db52-kube-api-access-gtrs8\") pod \"image-registry-59d85686d4-p9r27\" (UID: \"fe910aa8-0571-400f-abf4-cac8e9f4db52\") " pod="openshift-image-registry/image-registry-59d85686d4-p9r27"
Apr 16 19:55:08.793491 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.793389 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ck74z"
Apr 16 19:55:08.876188 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.876157 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-59d85686d4-p9r27" Apr 16 19:55:08.918225 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:08.918193 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ck74z"] Apr 16 19:55:08.920549 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:55:08.920521 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36e8e17e_1c2e_409c_b778_fb3de7ae8bb2.slice/crio-291da9e1d7f3eaed108443af2c19050db157bf8308aa25f3a6c8a047dc903900 WatchSource:0}: Error finding container 291da9e1d7f3eaed108443af2c19050db157bf8308aa25f3a6c8a047dc903900: Status 404 returned error can't find the container with id 291da9e1d7f3eaed108443af2c19050db157bf8308aa25f3a6c8a047dc903900 Apr 16 19:55:09.005094 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:09.005068 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-59d85686d4-p9r27"] Apr 16 19:55:09.007635 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:55:09.007609 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe910aa8_0571_400f_abf4_cac8e9f4db52.slice/crio-edbbcc94b6a957078a2548f33b5fe79cd447af9492884bb9551cbb23f188e78a WatchSource:0}: Error finding container edbbcc94b6a957078a2548f33b5fe79cd447af9492884bb9551cbb23f188e78a: Status 404 returned error can't find the container with id edbbcc94b6a957078a2548f33b5fe79cd447af9492884bb9551cbb23f188e78a Apr 16 19:55:09.661787 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:09.661749 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ck74z" event={"ID":"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2","Type":"ContainerStarted","Data":"d28c8f850ef12f1b93d4c1948f5022ef3d3c17630259a1bff61ecb31a6d3add9"} Apr 16 19:55:09.661787 ip-10-0-138-142 
kubenswrapper[2569]: I0416 19:55:09.661792 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ck74z" event={"ID":"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2","Type":"ContainerStarted","Data":"476111644b4ed6d44de542623f60563d560e517f2490ce838a555429e4ce2d21"} Apr 16 19:55:09.662031 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:09.661804 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ck74z" event={"ID":"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2","Type":"ContainerStarted","Data":"291da9e1d7f3eaed108443af2c19050db157bf8308aa25f3a6c8a047dc903900"} Apr 16 19:55:09.662986 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:09.662954 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-59d85686d4-p9r27" event={"ID":"fe910aa8-0571-400f-abf4-cac8e9f4db52","Type":"ContainerStarted","Data":"ff9642cb68c3d202880d7e272b3336163a8be8b725563887bd6bd4b431b61a87"} Apr 16 19:55:09.662986 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:09.662988 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-59d85686d4-p9r27" event={"ID":"fe910aa8-0571-400f-abf4-cac8e9f4db52","Type":"ContainerStarted","Data":"edbbcc94b6a957078a2548f33b5fe79cd447af9492884bb9551cbb23f188e78a"} Apr 16 19:55:09.663194 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:09.663060 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-59d85686d4-p9r27" Apr 16 19:55:09.685171 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:09.685116 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-59d85686d4-p9r27" podStartSLOduration=1.685098323 podStartE2EDuration="1.685098323s" podCreationTimestamp="2026-04-16 19:55:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:55:09.683547058 +0000 UTC m=+73.884257137" watchObservedRunningTime="2026-04-16 19:55:09.685098323 +0000 UTC m=+73.885808403" Apr 16 19:55:11.670224 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:11.670183 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ck74z" event={"ID":"36e8e17e-1c2e-409c-b778-fb3de7ae8bb2","Type":"ContainerStarted","Data":"34cc9440812e8fcd544062ce724c77516bf7e078718e59459e91d6789f335c92"} Apr 16 19:55:11.693458 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:11.693400 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-ck74z" podStartSLOduration=1.9791339190000001 podStartE2EDuration="3.69338528s" podCreationTimestamp="2026-04-16 19:55:08 +0000 UTC" firstStartedPulling="2026-04-16 19:55:08.9841746 +0000 UTC m=+73.184884656" lastFinishedPulling="2026-04-16 19:55:10.698425947 +0000 UTC m=+74.899136017" observedRunningTime="2026-04-16 19:55:11.691714785 +0000 UTC m=+75.892424857" watchObservedRunningTime="2026-04-16 19:55:11.69338528 +0000 UTC m=+75.894095358" Apr 16 19:55:13.646298 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:13.646265 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vslwr" Apr 16 19:55:30.670247 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:30.670216 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-59d85686d4-p9r27" Apr 16 19:55:38.660130 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:38.660099 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dk2hk" Apr 16 19:55:39.068083 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.067984 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-77f49d6849-6xc6m"] Apr 16 19:55:39.071202 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.071185 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.074685 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.074664 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 19:55:39.074797 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.074683 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-nt2zh\"" Apr 16 19:55:39.074797 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.074704 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 19:55:39.074797 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.074717 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 19:55:39.075118 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.075099 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 19:55:39.075198 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.075169 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 19:55:39.075255 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.075239 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 19:55:39.075377 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.075358 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 19:55:39.079064 ip-10-0-138-142 kubenswrapper[2569]: I0416 
19:55:39.079045 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 19:55:39.082305 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.082288 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77f49d6849-6xc6m"] Apr 16 19:55:39.181849 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.181813 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-serving-cert\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.181849 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.181854 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-service-ca\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.182082 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.181873 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szclb\" (UniqueName: \"kubernetes.io/projected/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-kube-api-access-szclb\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.182082 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.181936 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-config\") pod \"console-77f49d6849-6xc6m\" (UID: 
\"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.182082 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.181975 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-oauth-serving-cert\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.182082 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.181992 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-trusted-ca-bundle\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.182082 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.182006 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-oauth-config\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.283345 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.283300 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-oauth-config\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.283513 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.283359 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-serving-cert\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.283513 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.283387 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-service-ca\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.283513 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.283407 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szclb\" (UniqueName: \"kubernetes.io/projected/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-kube-api-access-szclb\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.283513 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.283425 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-config\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.283513 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.283453 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-oauth-serving-cert\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.283513 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.283473 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-trusted-ca-bundle\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.284217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.284192 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-service-ca\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.284338 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.284302 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-config\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.284399 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.284347 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-oauth-serving-cert\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.284399 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.284372 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-trusted-ca-bundle\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.286026 ip-10-0-138-142 kubenswrapper[2569]: I0416 
19:55:39.285991 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-serving-cert\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.286107 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.285991 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-oauth-config\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.292774 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.292756 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szclb\" (UniqueName: \"kubernetes.io/projected/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-kube-api-access-szclb\") pod \"console-77f49d6849-6xc6m\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.380729 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.380635 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:55:39.520741 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.520708 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77f49d6849-6xc6m"] Apr 16 19:55:39.523854 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:55:39.523827 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9c8ce9b_4507_45a0_ad3f_9e19c642153f.slice/crio-b040881b329acf95ae4ba33c43906f9ba12ba5e02ac5e8f40d5cbc75517dd234 WatchSource:0}: Error finding container b040881b329acf95ae4ba33c43906f9ba12ba5e02ac5e8f40d5cbc75517dd234: Status 404 returned error can't find the container with id b040881b329acf95ae4ba33c43906f9ba12ba5e02ac5e8f40d5cbc75517dd234 Apr 16 19:55:39.744107 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:39.744074 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77f49d6849-6xc6m" event={"ID":"f9c8ce9b-4507-45a0-ad3f-9e19c642153f","Type":"ContainerStarted","Data":"b040881b329acf95ae4ba33c43906f9ba12ba5e02ac5e8f40d5cbc75517dd234"} Apr 16 19:55:42.757657 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:42.757621 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77f49d6849-6xc6m" event={"ID":"f9c8ce9b-4507-45a0-ad3f-9e19c642153f","Type":"ContainerStarted","Data":"e2753df817266bc2194091e5645cddc77bb27eea4964de57f914499a154e190b"} Apr 16 19:55:42.777589 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:42.777539 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77f49d6849-6xc6m" podStartSLOduration=1.3369442839999999 podStartE2EDuration="3.777522091s" podCreationTimestamp="2026-04-16 19:55:39 +0000 UTC" firstStartedPulling="2026-04-16 19:55:39.525588716 +0000 UTC m=+103.726298773" lastFinishedPulling="2026-04-16 19:55:41.966166515 +0000 UTC m=+106.166876580" 
observedRunningTime="2026-04-16 19:55:42.776684209 +0000 UTC m=+106.977394290" watchObservedRunningTime="2026-04-16 19:55:42.777522091 +0000 UTC m=+106.978232172" Apr 16 19:55:43.309413 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.309376 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cx9gk"] Apr 16 19:55:43.312733 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.312712 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cx9gk" Apr 16 19:55:43.318229 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.318199 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 19:55:43.318382 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.318359 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 19:55:43.318910 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.318894 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 19:55:43.319134 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.319117 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 19:55:43.319233 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.319147 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wxw92\"" Apr 16 19:55:43.322692 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.322673 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 19:55:43.322785 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.322709 2569 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 19:55:43.415915 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.415883 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b518de3d-357a-4cb8-987f-4bea66391003-root\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk" Apr 16 19:55:43.416106 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.415924 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-accelerators-collector-config\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk" Apr 16 19:55:43.416106 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.415989 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b518de3d-357a-4cb8-987f-4bea66391003-sys\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk" Apr 16 19:55:43.416106 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.416062 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gksl\" (UniqueName: \"kubernetes.io/projected/b518de3d-357a-4cb8-987f-4bea66391003-kube-api-access-6gksl\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk" Apr 16 19:55:43.416241 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.416107 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-textfile\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk" Apr 16 19:55:43.416241 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.416141 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-wtmp\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk" Apr 16 19:55:43.416241 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.416179 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk" Apr 16 19:55:43.416354 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.416255 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b518de3d-357a-4cb8-987f-4bea66391003-metrics-client-ca\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk" Apr 16 19:55:43.416354 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.416328 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-tls\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk" Apr 16 
19:55:43.517295 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.517237 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gksl\" (UniqueName: \"kubernetes.io/projected/b518de3d-357a-4cb8-987f-4bea66391003-kube-api-access-6gksl\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.517503 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.517309 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-textfile\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.517503 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.517345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-wtmp\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.517503 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.517381 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.517503 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.517415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b518de3d-357a-4cb8-987f-4bea66391003-metrics-client-ca\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.517503 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.517460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-tls\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.517503 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.517491 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b518de3d-357a-4cb8-987f-4bea66391003-root\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.517812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.517516 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-accelerators-collector-config\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.517812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.517557 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b518de3d-357a-4cb8-987f-4bea66391003-sys\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.517812 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.517641 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b518de3d-357a-4cb8-987f-4bea66391003-sys\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.518179 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.518152 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b518de3d-357a-4cb8-987f-4bea66391003-root\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.518298 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.518152 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-wtmp\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.518374 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.518328 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-textfile\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.518789 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.518764 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b518de3d-357a-4cb8-987f-4bea66391003-metrics-client-ca\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.518789 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.518776 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-accelerators-collector-config\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.520617 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.520596 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-tls\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.521377 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.521349 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b518de3d-357a-4cb8-987f-4bea66391003-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.524615 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.524598 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gksl\" (UniqueName: \"kubernetes.io/projected/b518de3d-357a-4cb8-987f-4bea66391003-kube-api-access-6gksl\") pod \"node-exporter-cx9gk\" (UID: \"b518de3d-357a-4cb8-987f-4bea66391003\") " pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.622581 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.622479 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-cx9gk"
Apr 16 19:55:43.630909 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:55:43.630885 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb518de3d_357a_4cb8_987f_4bea66391003.slice/crio-79075a52e0f66e35e4dd4627fe2cd6a75e4e3aca5d64cd8970c05c27b1510d4a WatchSource:0}: Error finding container 79075a52e0f66e35e4dd4627fe2cd6a75e4e3aca5d64cd8970c05c27b1510d4a: Status 404 returned error can't find the container with id 79075a52e0f66e35e4dd4627fe2cd6a75e4e3aca5d64cd8970c05c27b1510d4a
Apr 16 19:55:43.761678 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:43.761644 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cx9gk" event={"ID":"b518de3d-357a-4cb8-987f-4bea66391003","Type":"ContainerStarted","Data":"79075a52e0f66e35e4dd4627fe2cd6a75e4e3aca5d64cd8970c05c27b1510d4a"}
Apr 16 19:55:44.765231 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:44.765198 2569 generic.go:358] "Generic (PLEG): container finished" podID="b518de3d-357a-4cb8-987f-4bea66391003" containerID="fc8962166edbaabe3400157d7f32213fa07d0913d1feabe26dfbca4805170d01" exitCode=0
Apr 16 19:55:44.765612 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:44.765238 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cx9gk" event={"ID":"b518de3d-357a-4cb8-987f-4bea66391003","Type":"ContainerDied","Data":"fc8962166edbaabe3400157d7f32213fa07d0913d1feabe26dfbca4805170d01"}
Apr 16 19:55:45.769659 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:45.769621 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cx9gk" event={"ID":"b518de3d-357a-4cb8-987f-4bea66391003","Type":"ContainerStarted","Data":"56517238c6150bd1ae14c3fc07ecf76914a2780ebe4e7b3a0a3ae578f6d1f2d3"}
Apr 16 19:55:45.769659 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:45.769658 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cx9gk" event={"ID":"b518de3d-357a-4cb8-987f-4bea66391003","Type":"ContainerStarted","Data":"3229687f60d485b38ef19032d33b34aba4c91801694b36c6eb282b28722490de"}
Apr 16 19:55:45.793032 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:45.792973 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cx9gk" podStartSLOduration=2.143156895 podStartE2EDuration="2.792958205s" podCreationTimestamp="2026-04-16 19:55:43 +0000 UTC" firstStartedPulling="2026-04-16 19:55:43.632879187 +0000 UTC m=+107.833589250" lastFinishedPulling="2026-04-16 19:55:44.282680487 +0000 UTC m=+108.483390560" observedRunningTime="2026-04-16 19:55:45.791299756 +0000 UTC m=+109.992009844" watchObservedRunningTime="2026-04-16 19:55:45.792958205 +0000 UTC m=+109.993668283"
Apr 16 19:55:47.752787 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.752746 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5648b74476-mzrcm"]
Apr 16 19:55:47.754940 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.754916 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.757304 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.757278 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 19:55:47.757475 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.757437 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 19:55:47.757571 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.757443 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 19:55:47.757571 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.757490 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-bnfv7\""
Apr 16 19:55:47.757571 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.757550 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 19:55:47.757767 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.757752 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-du2mm768797d6\""
Apr 16 19:55:47.763934 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.763916 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5648b74476-mzrcm"]
Apr 16 19:55:47.851071 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.851026 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/663570b5-b814-4a5e-96a2-36a946159c78-client-ca-bundle\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.851071 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.851071 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/663570b5-b814-4a5e-96a2-36a946159c78-secret-metrics-server-tls\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.851307 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.851096 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/663570b5-b814-4a5e-96a2-36a946159c78-audit-log\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.851307 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.851119 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq9qz\" (UniqueName: \"kubernetes.io/projected/663570b5-b814-4a5e-96a2-36a946159c78-kube-api-access-tq9qz\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.851307 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.851164 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/663570b5-b814-4a5e-96a2-36a946159c78-secret-metrics-server-client-certs\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.851307 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.851189 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/663570b5-b814-4a5e-96a2-36a946159c78-metrics-server-audit-profiles\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.851307 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.851256 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/663570b5-b814-4a5e-96a2-36a946159c78-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.951942 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.951901 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/663570b5-b814-4a5e-96a2-36a946159c78-secret-metrics-server-client-certs\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.952146 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.951945 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/663570b5-b814-4a5e-96a2-36a946159c78-metrics-server-audit-profiles\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.952146 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.951986 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/663570b5-b814-4a5e-96a2-36a946159c78-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.952146 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.952051 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/663570b5-b814-4a5e-96a2-36a946159c78-client-ca-bundle\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.952146 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.952076 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/663570b5-b814-4a5e-96a2-36a946159c78-secret-metrics-server-tls\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.952146 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.952098 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/663570b5-b814-4a5e-96a2-36a946159c78-audit-log\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.952146 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.952123 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tq9qz\" (UniqueName: \"kubernetes.io/projected/663570b5-b814-4a5e-96a2-36a946159c78-kube-api-access-tq9qz\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.952776 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.952747 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/663570b5-b814-4a5e-96a2-36a946159c78-audit-log\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.952914 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.952819 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/663570b5-b814-4a5e-96a2-36a946159c78-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.953042 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.953003 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/663570b5-b814-4a5e-96a2-36a946159c78-metrics-server-audit-profiles\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.954518 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.954489 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/663570b5-b814-4a5e-96a2-36a946159c78-secret-metrics-server-client-certs\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.954611 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.954567 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/663570b5-b814-4a5e-96a2-36a946159c78-client-ca-bundle\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.954649 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.954623 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/663570b5-b814-4a5e-96a2-36a946159c78-secret-metrics-server-tls\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:47.960323 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:47.960304 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq9qz\" (UniqueName: \"kubernetes.io/projected/663570b5-b814-4a5e-96a2-36a946159c78-kube-api-access-tq9qz\") pod \"metrics-server-5648b74476-mzrcm\" (UID: \"663570b5-b814-4a5e-96a2-36a946159c78\") " pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:48.058519 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.058420 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77f49d6849-6xc6m"]
Apr 16 19:55:48.064257 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.064030 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/metrics-server-5648b74476-mzrcm"
Apr 16 19:55:48.182727 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.182692 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5648b74476-mzrcm"]
Apr 16 19:55:48.185526 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:55:48.185501 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod663570b5_b814_4a5e_96a2_36a946159c78.slice/crio-4ed38640720c27b617ab65286f3f62188f8bf64b15072a448642dd5de3e9499d WatchSource:0}: Error finding container 4ed38640720c27b617ab65286f3f62188f8bf64b15072a448642dd5de3e9499d: Status 404 returned error can't find the container with id 4ed38640720c27b617ab65286f3f62188f8bf64b15072a448642dd5de3e9499d
Apr 16 19:55:48.523801 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.523765 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"]
Apr 16 19:55:48.527033 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.526990 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.529592 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.529569 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 19:55:48.529725 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.529574 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 19:55:48.529875 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.529858 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 19:55:48.530071 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.530055 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-gtzc8\""
Apr 16 19:55:48.530155 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.530080 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 19:55:48.530155 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.530096 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 19:55:48.534670 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.534647 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 19:55:48.539156 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.539133 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"]
Apr 16 19:55:48.657282 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.657245 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/61d996e3-12d0-435c-8089-44bb6cb2698a-federate-client-tls\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.657460 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.657292 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61d996e3-12d0-435c-8089-44bb6cb2698a-metrics-client-ca\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.657460 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.657361 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61d996e3-12d0-435c-8089-44bb6cb2698a-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.657588 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.657465 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9cnj\" (UniqueName: \"kubernetes.io/projected/61d996e3-12d0-435c-8089-44bb6cb2698a-kube-api-access-z9cnj\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.657588 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.657532 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61d996e3-12d0-435c-8089-44bb6cb2698a-telemeter-trusted-ca-bundle\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.657588 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.657577 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/61d996e3-12d0-435c-8089-44bb6cb2698a-telemeter-client-tls\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.657728 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.657630 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/61d996e3-12d0-435c-8089-44bb6cb2698a-secret-telemeter-client\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.657728 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.657657 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61d996e3-12d0-435c-8089-44bb6cb2698a-serving-certs-ca-bundle\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.758256 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.758205 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/61d996e3-12d0-435c-8089-44bb6cb2698a-secret-telemeter-client\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.758714 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.758276 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61d996e3-12d0-435c-8089-44bb6cb2698a-serving-certs-ca-bundle\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.758714 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.758322 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/61d996e3-12d0-435c-8089-44bb6cb2698a-federate-client-tls\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.758714 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.758348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61d996e3-12d0-435c-8089-44bb6cb2698a-metrics-client-ca\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.758714 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.758378 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61d996e3-12d0-435c-8089-44bb6cb2698a-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.758714 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.758416 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9cnj\" (UniqueName: \"kubernetes.io/projected/61d996e3-12d0-435c-8089-44bb6cb2698a-kube-api-access-z9cnj\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.758714 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.758468 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61d996e3-12d0-435c-8089-44bb6cb2698a-telemeter-trusted-ca-bundle\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.758714 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.758505 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/61d996e3-12d0-435c-8089-44bb6cb2698a-telemeter-client-tls\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.759150 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.759096 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61d996e3-12d0-435c-8089-44bb6cb2698a-serving-certs-ca-bundle\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.759737 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.759695 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61d996e3-12d0-435c-8089-44bb6cb2698a-metrics-client-ca\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.760452 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.760416 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61d996e3-12d0-435c-8089-44bb6cb2698a-telemeter-trusted-ca-bundle\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.761373 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.761351 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61d996e3-12d0-435c-8089-44bb6cb2698a-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.761549 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.761528 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/61d996e3-12d0-435c-8089-44bb6cb2698a-secret-telemeter-client\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.761724 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.761705 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/61d996e3-12d0-435c-8089-44bb6cb2698a-federate-client-tls\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.762057 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.762041 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/61d996e3-12d0-435c-8089-44bb6cb2698a-telemeter-client-tls\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.767576 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.767548 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9cnj\" (UniqueName: \"kubernetes.io/projected/61d996e3-12d0-435c-8089-44bb6cb2698a-kube-api-access-z9cnj\") pod \"telemeter-client-ddd77cfb9-dq2tw\" (UID: \"61d996e3-12d0-435c-8089-44bb6cb2698a\") " pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.783247 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.783165 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5648b74476-mzrcm" event={"ID":"663570b5-b814-4a5e-96a2-36a946159c78","Type":"ContainerStarted","Data":"4ed38640720c27b617ab65286f3f62188f8bf64b15072a448642dd5de3e9499d"}
Apr 16 19:55:48.836661 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.836625 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"
Apr 16 19:55:48.973309 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:48.973283 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw"]
Apr 16 19:55:49.334599 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:55:49.334568 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61d996e3_12d0_435c_8089_44bb6cb2698a.slice/crio-897615828a4c7e979cb6c4f1663d8cfdb03bbe9e1ca007a100e14938e539f51e WatchSource:0}: Error finding container 897615828a4c7e979cb6c4f1663d8cfdb03bbe9e1ca007a100e14938e539f51e: Status 404 returned error can't find the container with id 897615828a4c7e979cb6c4f1663d8cfdb03bbe9e1ca007a100e14938e539f51e
Apr 16 19:55:49.380984 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:49.380961 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-77f49d6849-6xc6m"
Apr 16 19:55:49.786656 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:49.786622 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw" event={"ID":"61d996e3-12d0-435c-8089-44bb6cb2698a","Type":"ContainerStarted","Data":"897615828a4c7e979cb6c4f1663d8cfdb03bbe9e1ca007a100e14938e539f51e"}
Apr 16 19:55:49.787881 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:49.787854 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5648b74476-mzrcm" event={"ID":"663570b5-b814-4a5e-96a2-36a946159c78","Type":"ContainerStarted","Data":"dea3169fe27127b0ce835f5f035af8a82193a43a20c0591137afadf59652e938"}
Apr 16 19:55:49.804921 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:49.804875 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5648b74476-mzrcm" podStartSLOduration=1.632185821 podStartE2EDuration="2.804861998s" podCreationTimestamp="2026-04-16 19:55:47 +0000 UTC" firstStartedPulling="2026-04-16 19:55:48.187474541 +0000 UTC m=+112.388184597" lastFinishedPulling="2026-04-16 19:55:49.360150717 +0000 UTC m=+113.560860774" observedRunningTime="2026-04-16 19:55:49.803885756 +0000 UTC m=+114.004595835" watchObservedRunningTime="2026-04-16 19:55:49.804861998 +0000 UTC m=+114.005572076"
Apr 16 19:55:51.795216 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:51.795131 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw" event={"ID":"61d996e3-12d0-435c-8089-44bb6cb2698a","Type":"ContainerStarted","Data":"9fd25191d8c45e425ecc7fb75d513d3ee86954386c225f0c8d02ab6bbcf3f0b9"}
Apr 16 19:55:52.799340 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:52.799254 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw" event={"ID":"61d996e3-12d0-435c-8089-44bb6cb2698a","Type":"ContainerStarted","Data":"5548a77cd1bba01366120e8c28a0958f2d24ec2739b03c2aad984b739ac3ae35"}
Apr 16 19:55:52.799340 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:52.799292 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw" event={"ID":"61d996e3-12d0-435c-8089-44bb6cb2698a","Type":"ContainerStarted","Data":"fa63885b25d2c010ba59382103b7278d57471696e902cf94cf24d45e4001ba1b"}
Apr 16 19:55:52.824160 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:52.824109 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-ddd77cfb9-dq2tw" podStartSLOduration=1.667866009 podStartE2EDuration="4.824090643s" podCreationTimestamp="2026-04-16 19:55:48 +0000 UTC" firstStartedPulling="2026-04-16 19:55:49.336450153 +0000 UTC m=+113.537160209" lastFinishedPulling="2026-04-16 19:55:52.492674771 +0000 UTC m=+116.693384843" observedRunningTime="2026-04-16 19:55:52.822807863
+0000 UTC m=+117.023517940" watchObservedRunningTime="2026-04-16 19:55:52.824090643 +0000 UTC m=+117.024800724" Apr 16 19:55:54.062727 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.062693 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-86bddf6c4f-kk92w"] Apr 16 19:55:54.064770 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.064755 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.076891 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.076868 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86bddf6c4f-kk92w"] Apr 16 19:55:54.200384 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.200348 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-console-config\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.200384 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.200383 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/059f4b74-f031-4d99-87e3-df20cfa695db-console-oauth-config\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.200618 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.200445 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-trusted-ca-bundle\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 
19:55:54.200618 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.200480 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68qjq\" (UniqueName: \"kubernetes.io/projected/059f4b74-f031-4d99-87e3-df20cfa695db-kube-api-access-68qjq\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.200618 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.200550 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-oauth-serving-cert\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.200618 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.200597 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-service-ca\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.200618 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.200616 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/059f4b74-f031-4d99-87e3-df20cfa695db-console-serving-cert\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.301693 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.301654 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/059f4b74-f031-4d99-87e3-df20cfa695db-console-serving-cert\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.301840 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.301711 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-console-config\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.301840 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.301733 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/059f4b74-f031-4d99-87e3-df20cfa695db-console-oauth-config\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.301840 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.301762 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-trusted-ca-bundle\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.301840 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.301779 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68qjq\" (UniqueName: \"kubernetes.io/projected/059f4b74-f031-4d99-87e3-df20cfa695db-kube-api-access-68qjq\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.301989 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.301840 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-oauth-serving-cert\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.301989 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.301911 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-service-ca\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.302671 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.302646 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-service-ca\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.302671 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.302662 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-oauth-serving-cert\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.302848 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.302825 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-trusted-ca-bundle\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.303443 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.303420 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-console-config\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.304375 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.304340 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/059f4b74-f031-4d99-87e3-df20cfa695db-console-oauth-config\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.304461 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.304447 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/059f4b74-f031-4d99-87e3-df20cfa695db-console-serving-cert\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.309818 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.309795 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68qjq\" (UniqueName: \"kubernetes.io/projected/059f4b74-f031-4d99-87e3-df20cfa695db-kube-api-access-68qjq\") pod \"console-86bddf6c4f-kk92w\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.373387 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.373327 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:55:54.487746 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.487721 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86bddf6c4f-kk92w"] Apr 16 19:55:54.489912 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:55:54.489877 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod059f4b74_f031_4d99_87e3_df20cfa695db.slice/crio-5e53b4bb7dfcd1da1971712ad59642a5ae1026b6118d38523d0f851393fb7f50 WatchSource:0}: Error finding container 5e53b4bb7dfcd1da1971712ad59642a5ae1026b6118d38523d0f851393fb7f50: Status 404 returned error can't find the container with id 5e53b4bb7dfcd1da1971712ad59642a5ae1026b6118d38523d0f851393fb7f50 Apr 16 19:55:54.805955 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.805908 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86bddf6c4f-kk92w" event={"ID":"059f4b74-f031-4d99-87e3-df20cfa695db","Type":"ContainerStarted","Data":"9a31c20e4a3fde16b9d4230b9926d8758fcd926621c0a7b17777e1258eff31fd"} Apr 16 19:55:54.805955 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.805956 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86bddf6c4f-kk92w" event={"ID":"059f4b74-f031-4d99-87e3-df20cfa695db","Type":"ContainerStarted","Data":"5e53b4bb7dfcd1da1971712ad59642a5ae1026b6118d38523d0f851393fb7f50"} Apr 16 19:55:54.825415 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:55:54.825363 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86bddf6c4f-kk92w" podStartSLOduration=0.825346321 podStartE2EDuration="825.346321ms" podCreationTimestamp="2026-04-16 19:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:55:54.823513945 +0000 UTC 
m=+119.024224023" watchObservedRunningTime="2026-04-16 19:55:54.825346321 +0000 UTC m=+119.026056399" Apr 16 19:56:04.373453 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:04.373424 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:56:04.373453 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:04.373458 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:56:04.378123 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:04.378101 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:56:04.838113 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:04.838080 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:56:08.065161 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:08.065119 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5648b74476-mzrcm" Apr 16 19:56:08.065161 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:08.065166 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5648b74476-mzrcm" Apr 16 19:56:13.081825 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.081746 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-77f49d6849-6xc6m" podUID="f9c8ce9b-4507-45a0-ad3f-9e19c642153f" containerName="console" containerID="cri-o://e2753df817266bc2194091e5645cddc77bb27eea4964de57f914499a154e190b" gracePeriod=15 Apr 16 19:56:13.321774 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.321742 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77f49d6849-6xc6m_f9c8ce9b-4507-45a0-ad3f-9e19c642153f/console/0.log" Apr 16 
19:56:13.321895 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.321816 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:56:13.464215 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.464167 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-oauth-config\") pod \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " Apr 16 19:56:13.464215 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.464222 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-trusted-ca-bundle\") pod \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " Apr 16 19:56:13.464434 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.464253 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-service-ca\") pod \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " Apr 16 19:56:13.464434 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.464358 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-config\") pod \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " Apr 16 19:56:13.464434 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.464419 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-serving-cert\") pod 
\"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " Apr 16 19:56:13.464571 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.464471 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szclb\" (UniqueName: \"kubernetes.io/projected/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-kube-api-access-szclb\") pod \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " Apr 16 19:56:13.464571 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.464528 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-oauth-serving-cert\") pod \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\" (UID: \"f9c8ce9b-4507-45a0-ad3f-9e19c642153f\") " Apr 16 19:56:13.464690 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.464652 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-service-ca" (OuterVolumeSpecName: "service-ca") pod "f9c8ce9b-4507-45a0-ad3f-9e19c642153f" (UID: "f9c8ce9b-4507-45a0-ad3f-9e19c642153f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:13.464690 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.464665 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f9c8ce9b-4507-45a0-ad3f-9e19c642153f" (UID: "f9c8ce9b-4507-45a0-ad3f-9e19c642153f"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:13.464790 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.464707 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-config" (OuterVolumeSpecName: "console-config") pod "f9c8ce9b-4507-45a0-ad3f-9e19c642153f" (UID: "f9c8ce9b-4507-45a0-ad3f-9e19c642153f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:13.464840 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.464791 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-trusted-ca-bundle\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:56:13.464951 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.464933 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-service-ca\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:56:13.465027 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.464958 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-config\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:56:13.465081 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.465059 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f9c8ce9b-4507-45a0-ad3f-9e19c642153f" (UID: "f9c8ce9b-4507-45a0-ad3f-9e19c642153f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:13.466565 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.466542 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-kube-api-access-szclb" (OuterVolumeSpecName: "kube-api-access-szclb") pod "f9c8ce9b-4507-45a0-ad3f-9e19c642153f" (UID: "f9c8ce9b-4507-45a0-ad3f-9e19c642153f"). InnerVolumeSpecName "kube-api-access-szclb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:56:13.466648 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.466560 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f9c8ce9b-4507-45a0-ad3f-9e19c642153f" (UID: "f9c8ce9b-4507-45a0-ad3f-9e19c642153f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:13.466648 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.466575 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f9c8ce9b-4507-45a0-ad3f-9e19c642153f" (UID: "f9c8ce9b-4507-45a0-ad3f-9e19c642153f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:13.566171 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.566116 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-oauth-serving-cert\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:56:13.566171 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.566162 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-oauth-config\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:56:13.566171 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.566172 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-console-serving-cert\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:56:13.566171 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.566182 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-szclb\" (UniqueName: \"kubernetes.io/projected/f9c8ce9b-4507-45a0-ad3f-9e19c642153f-kube-api-access-szclb\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:56:13.858768 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.858692 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77f49d6849-6xc6m_f9c8ce9b-4507-45a0-ad3f-9e19c642153f/console/0.log" Apr 16 19:56:13.858768 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.858733 2569 generic.go:358] "Generic (PLEG): container finished" podID="f9c8ce9b-4507-45a0-ad3f-9e19c642153f" containerID="e2753df817266bc2194091e5645cddc77bb27eea4964de57f914499a154e190b" exitCode=2 Apr 16 19:56:13.858970 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.858768 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-77f49d6849-6xc6m" event={"ID":"f9c8ce9b-4507-45a0-ad3f-9e19c642153f","Type":"ContainerDied","Data":"e2753df817266bc2194091e5645cddc77bb27eea4964de57f914499a154e190b"} Apr 16 19:56:13.858970 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.858791 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77f49d6849-6xc6m" event={"ID":"f9c8ce9b-4507-45a0-ad3f-9e19c642153f","Type":"ContainerDied","Data":"b040881b329acf95ae4ba33c43906f9ba12ba5e02ac5e8f40d5cbc75517dd234"} Apr 16 19:56:13.858970 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.858804 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77f49d6849-6xc6m" Apr 16 19:56:13.858970 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.858815 2569 scope.go:117] "RemoveContainer" containerID="e2753df817266bc2194091e5645cddc77bb27eea4964de57f914499a154e190b" Apr 16 19:56:13.866905 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.866863 2569 scope.go:117] "RemoveContainer" containerID="e2753df817266bc2194091e5645cddc77bb27eea4964de57f914499a154e190b" Apr 16 19:56:13.867194 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:56:13.867169 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2753df817266bc2194091e5645cddc77bb27eea4964de57f914499a154e190b\": container with ID starting with e2753df817266bc2194091e5645cddc77bb27eea4964de57f914499a154e190b not found: ID does not exist" containerID="e2753df817266bc2194091e5645cddc77bb27eea4964de57f914499a154e190b" Apr 16 19:56:13.867295 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.867198 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2753df817266bc2194091e5645cddc77bb27eea4964de57f914499a154e190b"} err="failed to get container status \"e2753df817266bc2194091e5645cddc77bb27eea4964de57f914499a154e190b\": rpc error: code = 
NotFound desc = could not find container \"e2753df817266bc2194091e5645cddc77bb27eea4964de57f914499a154e190b\": container with ID starting with e2753df817266bc2194091e5645cddc77bb27eea4964de57f914499a154e190b not found: ID does not exist" Apr 16 19:56:13.880815 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.880787 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77f49d6849-6xc6m"] Apr 16 19:56:13.886146 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:13.886121 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-77f49d6849-6xc6m"] Apr 16 19:56:14.371356 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:14.371322 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c8ce9b-4507-45a0-ad3f-9e19c642153f" path="/var/lib/kubelet/pods/f9c8ce9b-4507-45a0-ad3f-9e19c642153f/volumes" Apr 16 19:56:17.002888 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:17.002862 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9662b_e4e2129f-d37a-4277-bfd3-5be4dbf86524/serve-healthcheck-canary/0.log" Apr 16 19:56:28.070472 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:28.070442 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5648b74476-mzrcm" Apr 16 19:56:28.074906 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:28.074877 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5648b74476-mzrcm" Apr 16 19:56:59.664004 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.663966 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fccf94886-pzr8t"] Apr 16 19:56:59.664516 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.664333 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9c8ce9b-4507-45a0-ad3f-9e19c642153f" containerName="console" Apr 16 
19:56:59.664516 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.664352 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c8ce9b-4507-45a0-ad3f-9e19c642153f" containerName="console" Apr 16 19:56:59.664516 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.664427 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9c8ce9b-4507-45a0-ad3f-9e19c642153f" containerName="console" Apr 16 19:56:59.668826 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.668803 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.677977 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.677955 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvbjr\" (UniqueName: \"kubernetes.io/projected/52c662a5-de48-45c3-8f35-8217c2ccacd3-kube-api-access-gvbjr\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.678108 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.677991 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-serving-cert\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.678108 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.678032 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-trusted-ca-bundle\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.678108 ip-10-0-138-142 kubenswrapper[2569]: I0416 
19:56:59.678054 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-oauth-serving-cert\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.678108 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.678071 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-service-ca\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.678247 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.678128 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-config\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.678247 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.678150 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-oauth-config\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.681485 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.681461 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fccf94886-pzr8t"] Apr 16 19:56:59.778660 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.778625 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-serving-cert\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.778660 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.778676 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-trusted-ca-bundle\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.778921 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.778721 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-oauth-serving-cert\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.778921 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.778747 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-service-ca\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.778921 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.778891 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-config\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.779076 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.778938 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-oauth-config\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.779076 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.778967 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvbjr\" (UniqueName: \"kubernetes.io/projected/52c662a5-de48-45c3-8f35-8217c2ccacd3-kube-api-access-gvbjr\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.779639 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.779612 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-config\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.779769 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.779612 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-oauth-serving-cert\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.779865 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.779839 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-trusted-ca-bundle\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.781370 
ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.781342 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-serving-cert\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.781476 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.781374 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-oauth-config\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.781476 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.781431 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-service-ca\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.786929 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.786911 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvbjr\" (UniqueName: \"kubernetes.io/projected/52c662a5-de48-45c3-8f35-8217c2ccacd3-kube-api-access-gvbjr\") pod \"console-5fccf94886-pzr8t\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:56:59.977397 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:56:59.977300 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:57:00.097455 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:00.097422 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fccf94886-pzr8t"] Apr 16 19:57:00.099691 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:57:00.099663 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52c662a5_de48_45c3_8f35_8217c2ccacd3.slice/crio-6502e8ee76bdeea6908ebeb22313a9a947b7cfb1ff3a0338d356e4103d454643 WatchSource:0}: Error finding container 6502e8ee76bdeea6908ebeb22313a9a947b7cfb1ff3a0338d356e4103d454643: Status 404 returned error can't find the container with id 6502e8ee76bdeea6908ebeb22313a9a947b7cfb1ff3a0338d356e4103d454643 Apr 16 19:57:00.988514 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:00.988473 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fccf94886-pzr8t" event={"ID":"52c662a5-de48-45c3-8f35-8217c2ccacd3","Type":"ContainerStarted","Data":"6942ec1fe86271618280759d5ac60f94b77c3081d62b8f8d415bc62478a07ba1"} Apr 16 19:57:00.988514 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:00.988518 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fccf94886-pzr8t" event={"ID":"52c662a5-de48-45c3-8f35-8217c2ccacd3","Type":"ContainerStarted","Data":"6502e8ee76bdeea6908ebeb22313a9a947b7cfb1ff3a0338d356e4103d454643"} Apr 16 19:57:01.005077 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:01.004986 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fccf94886-pzr8t" podStartSLOduration=2.004968087 podStartE2EDuration="2.004968087s" podCreationTimestamp="2026-04-16 19:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:57:01.004344566 +0000 UTC 
m=+185.205054645" watchObservedRunningTime="2026-04-16 19:57:01.004968087 +0000 UTC m=+185.205678168" Apr 16 19:57:09.978280 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:09.978246 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:57:09.978729 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:09.978338 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:57:09.982839 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:09.982823 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:57:10.020623 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:10.020601 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 19:57:10.068037 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:10.067994 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86bddf6c4f-kk92w"] Apr 16 19:57:35.086292 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.086234 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-86bddf6c4f-kk92w" podUID="059f4b74-f031-4d99-87e3-df20cfa695db" containerName="console" containerID="cri-o://9a31c20e4a3fde16b9d4230b9926d8758fcd926621c0a7b17777e1258eff31fd" gracePeriod=15 Apr 16 19:57:35.309605 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.309583 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86bddf6c4f-kk92w_059f4b74-f031-4d99-87e3-df20cfa695db/console/0.log" Apr 16 19:57:35.309713 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.309641 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:57:35.440540 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.440511 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-service-ca\") pod \"059f4b74-f031-4d99-87e3-df20cfa695db\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " Apr 16 19:57:35.440688 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.440555 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-console-config\") pod \"059f4b74-f031-4d99-87e3-df20cfa695db\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " Apr 16 19:57:35.440688 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.440577 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-trusted-ca-bundle\") pod \"059f4b74-f031-4d99-87e3-df20cfa695db\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " Apr 16 19:57:35.440688 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.440608 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/059f4b74-f031-4d99-87e3-df20cfa695db-console-oauth-config\") pod \"059f4b74-f031-4d99-87e3-df20cfa695db\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " Apr 16 19:57:35.440688 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.440628 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-oauth-serving-cert\") pod \"059f4b74-f031-4d99-87e3-df20cfa695db\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " Apr 16 19:57:35.440688 ip-10-0-138-142 
kubenswrapper[2569]: I0416 19:57:35.440663 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68qjq\" (UniqueName: \"kubernetes.io/projected/059f4b74-f031-4d99-87e3-df20cfa695db-kube-api-access-68qjq\") pod \"059f4b74-f031-4d99-87e3-df20cfa695db\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " Apr 16 19:57:35.440943 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.440691 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/059f4b74-f031-4d99-87e3-df20cfa695db-console-serving-cert\") pod \"059f4b74-f031-4d99-87e3-df20cfa695db\" (UID: \"059f4b74-f031-4d99-87e3-df20cfa695db\") " Apr 16 19:57:35.441120 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.441091 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "059f4b74-f031-4d99-87e3-df20cfa695db" (UID: "059f4b74-f031-4d99-87e3-df20cfa695db"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:57:35.441120 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.441106 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-service-ca" (OuterVolumeSpecName: "service-ca") pod "059f4b74-f031-4d99-87e3-df20cfa695db" (UID: "059f4b74-f031-4d99-87e3-df20cfa695db"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:57:35.441245 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.441228 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-console-config" (OuterVolumeSpecName: "console-config") pod "059f4b74-f031-4d99-87e3-df20cfa695db" (UID: "059f4b74-f031-4d99-87e3-df20cfa695db"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:57:35.441298 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.441235 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "059f4b74-f031-4d99-87e3-df20cfa695db" (UID: "059f4b74-f031-4d99-87e3-df20cfa695db"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:57:35.442906 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.442884 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059f4b74-f031-4d99-87e3-df20cfa695db-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "059f4b74-f031-4d99-87e3-df20cfa695db" (UID: "059f4b74-f031-4d99-87e3-df20cfa695db"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:57:35.442988 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.442936 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/059f4b74-f031-4d99-87e3-df20cfa695db-kube-api-access-68qjq" (OuterVolumeSpecName: "kube-api-access-68qjq") pod "059f4b74-f031-4d99-87e3-df20cfa695db" (UID: "059f4b74-f031-4d99-87e3-df20cfa695db"). InnerVolumeSpecName "kube-api-access-68qjq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:57:35.443086 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.442985 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059f4b74-f031-4d99-87e3-df20cfa695db-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "059f4b74-f031-4d99-87e3-df20cfa695db" (UID: "059f4b74-f031-4d99-87e3-df20cfa695db"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:57:35.541489 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.541461 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-console-config\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:57:35.541489 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.541486 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-trusted-ca-bundle\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:57:35.541617 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.541496 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/059f4b74-f031-4d99-87e3-df20cfa695db-console-oauth-config\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:57:35.541617 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.541505 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-oauth-serving-cert\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:57:35.541617 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.541528 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-68qjq\" (UniqueName: 
\"kubernetes.io/projected/059f4b74-f031-4d99-87e3-df20cfa695db-kube-api-access-68qjq\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:57:35.541617 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.541538 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/059f4b74-f031-4d99-87e3-df20cfa695db-console-serving-cert\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:57:35.541617 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:35.541546 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/059f4b74-f031-4d99-87e3-df20cfa695db-service-ca\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:57:36.083987 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:36.083962 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86bddf6c4f-kk92w_059f4b74-f031-4d99-87e3-df20cfa695db/console/0.log" Apr 16 19:57:36.084139 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:36.083998 2569 generic.go:358] "Generic (PLEG): container finished" podID="059f4b74-f031-4d99-87e3-df20cfa695db" containerID="9a31c20e4a3fde16b9d4230b9926d8758fcd926621c0a7b17777e1258eff31fd" exitCode=2 Apr 16 19:57:36.084139 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:36.084037 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86bddf6c4f-kk92w" event={"ID":"059f4b74-f031-4d99-87e3-df20cfa695db","Type":"ContainerDied","Data":"9a31c20e4a3fde16b9d4230b9926d8758fcd926621c0a7b17777e1258eff31fd"} Apr 16 19:57:36.084139 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:36.084070 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86bddf6c4f-kk92w" Apr 16 19:57:36.084139 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:36.084085 2569 scope.go:117] "RemoveContainer" containerID="9a31c20e4a3fde16b9d4230b9926d8758fcd926621c0a7b17777e1258eff31fd" Apr 16 19:57:36.084311 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:36.084075 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86bddf6c4f-kk92w" event={"ID":"059f4b74-f031-4d99-87e3-df20cfa695db","Type":"ContainerDied","Data":"5e53b4bb7dfcd1da1971712ad59642a5ae1026b6118d38523d0f851393fb7f50"} Apr 16 19:57:36.092189 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:36.092004 2569 scope.go:117] "RemoveContainer" containerID="9a31c20e4a3fde16b9d4230b9926d8758fcd926621c0a7b17777e1258eff31fd" Apr 16 19:57:36.092395 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:57:36.092234 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a31c20e4a3fde16b9d4230b9926d8758fcd926621c0a7b17777e1258eff31fd\": container with ID starting with 9a31c20e4a3fde16b9d4230b9926d8758fcd926621c0a7b17777e1258eff31fd not found: ID does not exist" containerID="9a31c20e4a3fde16b9d4230b9926d8758fcd926621c0a7b17777e1258eff31fd" Apr 16 19:57:36.092395 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:36.092258 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a31c20e4a3fde16b9d4230b9926d8758fcd926621c0a7b17777e1258eff31fd"} err="failed to get container status \"9a31c20e4a3fde16b9d4230b9926d8758fcd926621c0a7b17777e1258eff31fd\": rpc error: code = NotFound desc = could not find container \"9a31c20e4a3fde16b9d4230b9926d8758fcd926621c0a7b17777e1258eff31fd\": container with ID starting with 9a31c20e4a3fde16b9d4230b9926d8758fcd926621c0a7b17777e1258eff31fd not found: ID does not exist" Apr 16 19:57:36.104913 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:36.104890 2569 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86bddf6c4f-kk92w"] Apr 16 19:57:36.107144 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:36.107119 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-86bddf6c4f-kk92w"] Apr 16 19:57:36.370162 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:36.370090 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="059f4b74-f031-4d99-87e3-df20cfa695db" path="/var/lib/kubelet/pods/059f4b74-f031-4d99-87e3-df20cfa695db/volumes" Apr 16 19:57:55.117490 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.117453 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-zfltp"] Apr 16 19:57:55.117920 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.117696 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="059f4b74-f031-4d99-87e3-df20cfa695db" containerName="console" Apr 16 19:57:55.117920 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.117707 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="059f4b74-f031-4d99-87e3-df20cfa695db" containerName="console" Apr 16 19:57:55.117920 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.117758 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="059f4b74-f031-4d99-87e3-df20cfa695db" containerName="console" Apr 16 19:57:55.120401 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.120380 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-zfltp" Apr 16 19:57:55.123134 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.123114 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 19:57:55.127944 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.127896 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zfltp"] Apr 16 19:57:55.187780 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.187743 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8513c710-eca6-4743-9c4a-9f4603f59a26-kubelet-config\") pod \"global-pull-secret-syncer-zfltp\" (UID: \"8513c710-eca6-4743-9c4a-9f4603f59a26\") " pod="kube-system/global-pull-secret-syncer-zfltp" Apr 16 19:57:55.187959 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.187803 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8513c710-eca6-4743-9c4a-9f4603f59a26-original-pull-secret\") pod \"global-pull-secret-syncer-zfltp\" (UID: \"8513c710-eca6-4743-9c4a-9f4603f59a26\") " pod="kube-system/global-pull-secret-syncer-zfltp" Apr 16 19:57:55.187959 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.187869 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8513c710-eca6-4743-9c4a-9f4603f59a26-dbus\") pod \"global-pull-secret-syncer-zfltp\" (UID: \"8513c710-eca6-4743-9c4a-9f4603f59a26\") " pod="kube-system/global-pull-secret-syncer-zfltp" Apr 16 19:57:55.289170 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.289133 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/8513c710-eca6-4743-9c4a-9f4603f59a26-kubelet-config\") pod \"global-pull-secret-syncer-zfltp\" (UID: \"8513c710-eca6-4743-9c4a-9f4603f59a26\") " pod="kube-system/global-pull-secret-syncer-zfltp" Apr 16 19:57:55.289293 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.289224 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8513c710-eca6-4743-9c4a-9f4603f59a26-original-pull-secret\") pod \"global-pull-secret-syncer-zfltp\" (UID: \"8513c710-eca6-4743-9c4a-9f4603f59a26\") " pod="kube-system/global-pull-secret-syncer-zfltp" Apr 16 19:57:55.289293 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.289139 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8513c710-eca6-4743-9c4a-9f4603f59a26-kubelet-config\") pod \"global-pull-secret-syncer-zfltp\" (UID: \"8513c710-eca6-4743-9c4a-9f4603f59a26\") " pod="kube-system/global-pull-secret-syncer-zfltp" Apr 16 19:57:55.289293 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.289264 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8513c710-eca6-4743-9c4a-9f4603f59a26-dbus\") pod \"global-pull-secret-syncer-zfltp\" (UID: \"8513c710-eca6-4743-9c4a-9f4603f59a26\") " pod="kube-system/global-pull-secret-syncer-zfltp" Apr 16 19:57:55.289397 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.289389 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8513c710-eca6-4743-9c4a-9f4603f59a26-dbus\") pod \"global-pull-secret-syncer-zfltp\" (UID: \"8513c710-eca6-4743-9c4a-9f4603f59a26\") " pod="kube-system/global-pull-secret-syncer-zfltp" Apr 16 19:57:55.291424 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.291399 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8513c710-eca6-4743-9c4a-9f4603f59a26-original-pull-secret\") pod \"global-pull-secret-syncer-zfltp\" (UID: \"8513c710-eca6-4743-9c4a-9f4603f59a26\") " pod="kube-system/global-pull-secret-syncer-zfltp" Apr 16 19:57:55.430729 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.430700 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zfltp" Apr 16 19:57:55.544598 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:55.544572 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zfltp"] Apr 16 19:57:55.546200 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:57:55.546174 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8513c710_eca6_4743_9c4a_9f4603f59a26.slice/crio-9504342c4b61aae0fb2de47ea205d16bfb53368617403baa470b7af730b9e381 WatchSource:0}: Error finding container 9504342c4b61aae0fb2de47ea205d16bfb53368617403baa470b7af730b9e381: Status 404 returned error can't find the container with id 9504342c4b61aae0fb2de47ea205d16bfb53368617403baa470b7af730b9e381 Apr 16 19:57:56.136054 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:57:56.136004 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zfltp" event={"ID":"8513c710-eca6-4743-9c4a-9f4603f59a26","Type":"ContainerStarted","Data":"9504342c4b61aae0fb2de47ea205d16bfb53368617403baa470b7af730b9e381"} Apr 16 19:58:00.147660 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:00.147623 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zfltp" event={"ID":"8513c710-eca6-4743-9c4a-9f4603f59a26","Type":"ContainerStarted","Data":"0cea14bb69afdad2f3dcfede37999b33da21a3e5d72495f3845f37ca97d2c4d9"} Apr 16 19:58:00.164230 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:00.164182 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-zfltp" podStartSLOduration=1.639654063 podStartE2EDuration="5.164168128s" podCreationTimestamp="2026-04-16 19:57:55 +0000 UTC" firstStartedPulling="2026-04-16 19:57:55.54766053 +0000 UTC m=+239.748370589" lastFinishedPulling="2026-04-16 19:57:59.072174594 +0000 UTC m=+243.272884654" observedRunningTime="2026-04-16 19:58:00.162955752 +0000 UTC m=+244.363665830" watchObservedRunningTime="2026-04-16 19:58:00.164168128 +0000 UTC m=+244.364878204" Apr 16 19:58:14.100884 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.100851 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m"] Apr 16 19:58:14.104207 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.104190 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" Apr 16 19:58:14.106851 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.106820 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 19:58:14.107716 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.107698 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 19:58:14.107801 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.107726 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-g6nfj\"" Apr 16 19:58:14.113008 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.112984 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m"] Apr 16 19:58:14.134068 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.134043 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m\" (UID: \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" Apr 16 19:58:14.134165 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.134098 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wc7x\" (UniqueName: \"kubernetes.io/projected/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-kube-api-access-7wc7x\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m\" (UID: \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" Apr 16 19:58:14.134165 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.134123 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m\" (UID: \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" Apr 16 19:58:14.234974 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.234948 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m\" (UID: \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" Apr 16 19:58:14.235102 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.234993 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wc7x\" (UniqueName: \"kubernetes.io/projected/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-kube-api-access-7wc7x\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m\" (UID: \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" Apr 16 19:58:14.235102 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.235043 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m\" (UID: \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" Apr 16 19:58:14.235412 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.235391 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m\" (UID: \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" Apr 16 19:58:14.235447 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.235406 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m\" (UID: \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" Apr 16 19:58:14.243063 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.243039 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wc7x\" 
(UniqueName: \"kubernetes.io/projected/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-kube-api-access-7wc7x\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m\" (UID: \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" Apr 16 19:58:14.414085 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.414057 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" Apr 16 19:58:14.529114 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:14.528960 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m"] Apr 16 19:58:14.531674 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:58:14.531647 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d3b908b_1d11_4a7f_b4d1_5cb1da2810b3.slice/crio-6435214ce4252e181b24ee24e5a4b30e0c0e48041ee104d2418022c41ece894f WatchSource:0}: Error finding container 6435214ce4252e181b24ee24e5a4b30e0c0e48041ee104d2418022c41ece894f: Status 404 returned error can't find the container with id 6435214ce4252e181b24ee24e5a4b30e0c0e48041ee104d2418022c41ece894f Apr 16 19:58:15.192481 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:15.192392 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" event={"ID":"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3","Type":"ContainerStarted","Data":"6435214ce4252e181b24ee24e5a4b30e0c0e48041ee104d2418022c41ece894f"} Apr 16 19:58:22.215844 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:22.215809 2569 generic.go:358] "Generic (PLEG): container finished" podID="5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3" containerID="5b9d4af0e71c9a42ce3132fc30c9e37468acffed67ab7b11dfd919a059dadd52" 
exitCode=0 Apr 16 19:58:22.216233 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:22.215864 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" event={"ID":"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3","Type":"ContainerDied","Data":"5b9d4af0e71c9a42ce3132fc30c9e37468acffed67ab7b11dfd919a059dadd52"} Apr 16 19:58:24.226917 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:24.226891 2569 generic.go:358] "Generic (PLEG): container finished" podID="5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3" containerID="28905ce2996b8574027dfce1ca4534859b305740c1c191cb287bc23dd997072c" exitCode=0 Apr 16 19:58:24.227313 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:24.226960 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" event={"ID":"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3","Type":"ContainerDied","Data":"28905ce2996b8574027dfce1ca4534859b305740c1c191cb287bc23dd997072c"} Apr 16 19:58:33.252680 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:33.252642 2569 generic.go:358] "Generic (PLEG): container finished" podID="5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3" containerID="f8e23e20cf1fe27ff70f2642aabe17e054fad9617745ca61861e2594e019cb93" exitCode=0 Apr 16 19:58:33.252680 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:33.252682 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" event={"ID":"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3","Type":"ContainerDied","Data":"f8e23e20cf1fe27ff70f2642aabe17e054fad9617745ca61861e2594e019cb93"} Apr 16 19:58:34.373886 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:34.373864 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" Apr 16 19:58:34.500133 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:34.500099 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-util\") pod \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\" (UID: \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\") " Apr 16 19:58:34.500288 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:34.500189 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wc7x\" (UniqueName: \"kubernetes.io/projected/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-kube-api-access-7wc7x\") pod \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\" (UID: \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\") " Apr 16 19:58:34.500288 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:34.500251 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-bundle\") pod \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\" (UID: \"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3\") " Apr 16 19:58:34.500823 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:34.500800 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-bundle" (OuterVolumeSpecName: "bundle") pod "5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3" (UID: "5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:58:34.502395 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:34.502370 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-kube-api-access-7wc7x" (OuterVolumeSpecName: "kube-api-access-7wc7x") pod "5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3" (UID: "5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3"). InnerVolumeSpecName "kube-api-access-7wc7x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:58:34.504252 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:34.504227 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-util" (OuterVolumeSpecName: "util") pod "5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3" (UID: "5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:58:34.601300 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:34.601246 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-bundle\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:58:34.601300 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:34.601266 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-util\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:58:34.601300 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:34.601276 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7wc7x\" (UniqueName: \"kubernetes.io/projected/5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3-kube-api-access-7wc7x\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:58:35.259724 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:35.259690 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" event={"ID":"5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3","Type":"ContainerDied","Data":"6435214ce4252e181b24ee24e5a4b30e0c0e48041ee104d2418022c41ece894f"} Apr 16 19:58:35.259724 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:35.259723 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6435214ce4252e181b24ee24e5a4b30e0c0e48041ee104d2418022c41ece894f" Apr 16 19:58:35.259915 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:35.259739 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czj64m" Apr 16 19:58:40.882580 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.882545 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc"] Apr 16 19:58:40.883155 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.882937 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3" containerName="util" Apr 16 19:58:40.883155 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.882957 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3" containerName="util" Apr 16 19:58:40.883155 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.882970 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3" containerName="extract" Apr 16 19:58:40.883155 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.882979 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3" containerName="extract" Apr 16 19:58:40.883155 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.882990 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3" containerName="pull" Apr 16 19:58:40.883155 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.882998 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3" containerName="pull" Apr 16 19:58:40.883155 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.883116 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d3b908b-1d11-4a7f-b4d1-5cb1da2810b3" containerName="extract" Apr 16 19:58:40.885677 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.885658 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc" Apr 16 19:58:40.888188 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.888166 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 19:58:40.888418 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.888400 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 19:58:40.888418 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.888410 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 19:58:40.888511 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.888436 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-9qrgh\"" Apr 16 19:58:40.897766 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.897745 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc"] Apr 16 19:58:40.951328 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.951294 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/6ac2436f-03d5-4f91-bb13-f58d151326bf-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc\" (UID: \"6ac2436f-03d5-4f91-bb13-f58d151326bf\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc" Apr 16 19:58:40.951430 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:40.951381 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scc75\" (UniqueName: \"kubernetes.io/projected/6ac2436f-03d5-4f91-bb13-f58d151326bf-kube-api-access-scc75\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc\" (UID: \"6ac2436f-03d5-4f91-bb13-f58d151326bf\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc" Apr 16 19:58:41.052401 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:41.052355 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6ac2436f-03d5-4f91-bb13-f58d151326bf-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc\" (UID: \"6ac2436f-03d5-4f91-bb13-f58d151326bf\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc" Apr 16 19:58:41.052591 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:41.052416 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scc75\" (UniqueName: \"kubernetes.io/projected/6ac2436f-03d5-4f91-bb13-f58d151326bf-kube-api-access-scc75\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc\" (UID: \"6ac2436f-03d5-4f91-bb13-f58d151326bf\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc" Apr 16 19:58:41.054810 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:41.054784 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6ac2436f-03d5-4f91-bb13-f58d151326bf-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc\" (UID: 
\"6ac2436f-03d5-4f91-bb13-f58d151326bf\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc" Apr 16 19:58:41.060774 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:41.060751 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scc75\" (UniqueName: \"kubernetes.io/projected/6ac2436f-03d5-4f91-bb13-f58d151326bf-kube-api-access-scc75\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc\" (UID: \"6ac2436f-03d5-4f91-bb13-f58d151326bf\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc" Apr 16 19:58:41.195645 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:41.195612 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc" Apr 16 19:58:41.311253 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:41.311191 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc"] Apr 16 19:58:41.313890 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:58:41.313859 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ac2436f_03d5_4f91_bb13_f58d151326bf.slice/crio-0a7a6a04bdeb863efbf9ab29a7aa8405dcf8903a664a2750afebc5def878c2cb WatchSource:0}: Error finding container 0a7a6a04bdeb863efbf9ab29a7aa8405dcf8903a664a2750afebc5def878c2cb: Status 404 returned error can't find the container with id 0a7a6a04bdeb863efbf9ab29a7aa8405dcf8903a664a2750afebc5def878c2cb Apr 16 19:58:42.286321 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:42.286280 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc" event={"ID":"6ac2436f-03d5-4f91-bb13-f58d151326bf","Type":"ContainerStarted","Data":"0a7a6a04bdeb863efbf9ab29a7aa8405dcf8903a664a2750afebc5def878c2cb"} Apr 16 19:58:44.294737 ip-10-0-138-142 kubenswrapper[2569]: 
I0416 19:58:44.294701 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc" event={"ID":"6ac2436f-03d5-4f91-bb13-f58d151326bf","Type":"ContainerStarted","Data":"09cf9172aa3b33abe817193c176c6f1a718e341e16a69f3197bbd1a96adbfcf7"} Apr 16 19:58:44.295122 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.294809 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc" Apr 16 19:58:44.317319 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.317272 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc" podStartSLOduration=1.5395266969999999 podStartE2EDuration="4.317257172s" podCreationTimestamp="2026-04-16 19:58:40 +0000 UTC" firstStartedPulling="2026-04-16 19:58:41.315382511 +0000 UTC m=+285.516092566" lastFinishedPulling="2026-04-16 19:58:44.093112981 +0000 UTC m=+288.293823041" observedRunningTime="2026-04-16 19:58:44.315750789 +0000 UTC m=+288.516460867" watchObservedRunningTime="2026-04-16 19:58:44.317257172 +0000 UTC m=+288.517967252" Apr 16 19:58:44.804107 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.804076 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-nlkcs"] Apr 16 19:58:44.806213 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.806195 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:44.809910 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.809890 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 19:58:44.810058 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.809996 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 19:58:44.810272 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.810253 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-mtwz6\"" Apr 16 19:58:44.822983 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.822960 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-nlkcs"] Apr 16 19:58:44.886954 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.886918 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-certificates\") pod \"keda-operator-ffbb595cb-nlkcs\" (UID: \"0ed8b4e3-9ed2-4665-901a-505bc23141b9\") " pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:44.887162 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.886999 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0ed8b4e3-9ed2-4665-901a-505bc23141b9-cabundle0\") pod \"keda-operator-ffbb595cb-nlkcs\" (UID: \"0ed8b4e3-9ed2-4665-901a-505bc23141b9\") " pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:44.887162 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.887060 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8vwg\" (UniqueName: 
\"kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-kube-api-access-v8vwg\") pod \"keda-operator-ffbb595cb-nlkcs\" (UID: \"0ed8b4e3-9ed2-4665-901a-505bc23141b9\") " pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:44.987814 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.987781 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-certificates\") pod \"keda-operator-ffbb595cb-nlkcs\" (UID: \"0ed8b4e3-9ed2-4665-901a-505bc23141b9\") " pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:44.988000 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.987846 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0ed8b4e3-9ed2-4665-901a-505bc23141b9-cabundle0\") pod \"keda-operator-ffbb595cb-nlkcs\" (UID: \"0ed8b4e3-9ed2-4665-901a-505bc23141b9\") " pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:44.988000 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.987879 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8vwg\" (UniqueName: \"kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-kube-api-access-v8vwg\") pod \"keda-operator-ffbb595cb-nlkcs\" (UID: \"0ed8b4e3-9ed2-4665-901a-505bc23141b9\") " pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:44.988000 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:44.987953 2569 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 16 19:58:44.988000 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:44.987981 2569 secret.go:281] references non-existent secret key: ca.crt Apr 16 19:58:44.988000 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:44.987993 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references 
non-existent secret key: ca.crt Apr 16 19:58:44.988284 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:44.988026 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-nlkcs: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 19:58:44.988284 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:44.988090 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-certificates podName:0ed8b4e3-9ed2-4665-901a-505bc23141b9 nodeName:}" failed. No retries permitted until 2026-04-16 19:58:45.48807039 +0000 UTC m=+289.688780452 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-certificates") pod "keda-operator-ffbb595cb-nlkcs" (UID: "0ed8b4e3-9ed2-4665-901a-505bc23141b9") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 19:58:44.988487 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.988468 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0ed8b4e3-9ed2-4665-901a-505bc23141b9-cabundle0\") pod \"keda-operator-ffbb595cb-nlkcs\" (UID: \"0ed8b4e3-9ed2-4665-901a-505bc23141b9\") " pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:44.996853 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:44.996819 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8vwg\" (UniqueName: \"kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-kube-api-access-v8vwg\") pod \"keda-operator-ffbb595cb-nlkcs\" (UID: \"0ed8b4e3-9ed2-4665-901a-505bc23141b9\") " pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:45.114849 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.114768 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd"] Apr 16 19:58:45.116885 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.116864 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:45.119941 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.119917 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 19:58:45.128373 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.128351 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd"] Apr 16 19:58:45.189939 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.189911 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/9ae08c77-4dbf-4ced-829e-3971549988b7-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-mxnwd\" (UID: \"9ae08c77-4dbf-4ced-829e-3971549988b7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:45.190101 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.189954 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mxnwd\" (UID: \"9ae08c77-4dbf-4ced-829e-3971549988b7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:45.190101 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.190006 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mslld\" (UniqueName: \"kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-kube-api-access-mslld\") pod \"keda-metrics-apiserver-7c9f485588-mxnwd\" (UID: \"9ae08c77-4dbf-4ced-829e-3971549988b7\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:45.290483 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.290450 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/9ae08c77-4dbf-4ced-829e-3971549988b7-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-mxnwd\" (UID: \"9ae08c77-4dbf-4ced-829e-3971549988b7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:45.290683 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.290502 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mxnwd\" (UID: \"9ae08c77-4dbf-4ced-829e-3971549988b7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:45.290683 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.290545 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mslld\" (UniqueName: \"kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-kube-api-access-mslld\") pod \"keda-metrics-apiserver-7c9f485588-mxnwd\" (UID: \"9ae08c77-4dbf-4ced-829e-3971549988b7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:45.290683 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:45.290662 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 19:58:45.290683 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:45.290682 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 19:58:45.290885 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:45.290703 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd: references non-existent secret key: tls.crt Apr 16 
19:58:45.290885 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:45.290755 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-certificates podName:9ae08c77-4dbf-4ced-829e-3971549988b7 nodeName:}" failed. No retries permitted until 2026-04-16 19:58:45.790739449 +0000 UTC m=+289.991449505 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-certificates") pod "keda-metrics-apiserver-7c9f485588-mxnwd" (UID: "9ae08c77-4dbf-4ced-829e-3971549988b7") : references non-existent secret key: tls.crt Apr 16 19:58:45.290885 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.290828 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/9ae08c77-4dbf-4ced-829e-3971549988b7-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-mxnwd\" (UID: \"9ae08c77-4dbf-4ced-829e-3971549988b7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:45.303369 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.303340 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mslld\" (UniqueName: \"kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-kube-api-access-mslld\") pod \"keda-metrics-apiserver-7c9f485588-mxnwd\" (UID: \"9ae08c77-4dbf-4ced-829e-3971549988b7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:45.426065 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.426023 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-bnm2t"] Apr 16 19:58:45.428477 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.428459 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-bnm2t" Apr 16 19:58:45.431733 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.431707 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 19:58:45.437466 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.437435 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-bnm2t"] Apr 16 19:58:45.492061 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.491930 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-certificates\") pod \"keda-operator-ffbb595cb-nlkcs\" (UID: \"0ed8b4e3-9ed2-4665-901a-505bc23141b9\") " pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:45.492061 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.492047 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/57d359c9-5cec-42cf-8056-5141594658eb-certificates\") pod \"keda-admission-cf49989db-bnm2t\" (UID: \"57d359c9-5cec-42cf-8056-5141594658eb\") " pod="openshift-keda/keda-admission-cf49989db-bnm2t" Apr 16 19:58:45.492276 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.492076 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt8ln\" (UniqueName: \"kubernetes.io/projected/57d359c9-5cec-42cf-8056-5141594658eb-kube-api-access-qt8ln\") pod \"keda-admission-cf49989db-bnm2t\" (UID: \"57d359c9-5cec-42cf-8056-5141594658eb\") " pod="openshift-keda/keda-admission-cf49989db-bnm2t" Apr 16 19:58:45.492276 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:45.492132 2569 secret.go:281] references non-existent secret key: ca.crt Apr 16 19:58:45.492276 ip-10-0-138-142 kubenswrapper[2569]: E0416 
19:58:45.492158 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 19:58:45.492276 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:45.492168 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-nlkcs: references non-existent secret key: ca.crt Apr 16 19:58:45.492276 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:45.492219 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-certificates podName:0ed8b4e3-9ed2-4665-901a-505bc23141b9 nodeName:}" failed. No retries permitted until 2026-04-16 19:58:46.492201257 +0000 UTC m=+290.692911314 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-certificates") pod "keda-operator-ffbb595cb-nlkcs" (UID: "0ed8b4e3-9ed2-4665-901a-505bc23141b9") : references non-existent secret key: ca.crt Apr 16 19:58:45.592541 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.592512 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/57d359c9-5cec-42cf-8056-5141594658eb-certificates\") pod \"keda-admission-cf49989db-bnm2t\" (UID: \"57d359c9-5cec-42cf-8056-5141594658eb\") " pod="openshift-keda/keda-admission-cf49989db-bnm2t" Apr 16 19:58:45.592691 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.592549 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qt8ln\" (UniqueName: \"kubernetes.io/projected/57d359c9-5cec-42cf-8056-5141594658eb-kube-api-access-qt8ln\") pod \"keda-admission-cf49989db-bnm2t\" (UID: \"57d359c9-5cec-42cf-8056-5141594658eb\") " pod="openshift-keda/keda-admission-cf49989db-bnm2t" Apr 16 19:58:45.592691 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:45.592665 2569 
projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 16 19:58:45.592691 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:45.592688 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-bnm2t: secret "keda-admission-webhooks-certs" not found Apr 16 19:58:45.592837 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:45.592737 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57d359c9-5cec-42cf-8056-5141594658eb-certificates podName:57d359c9-5cec-42cf-8056-5141594658eb nodeName:}" failed. No retries permitted until 2026-04-16 19:58:46.092720707 +0000 UTC m=+290.293430766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/57d359c9-5cec-42cf-8056-5141594658eb-certificates") pod "keda-admission-cf49989db-bnm2t" (UID: "57d359c9-5cec-42cf-8056-5141594658eb") : secret "keda-admission-webhooks-certs" not found Apr 16 19:58:45.603093 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.603067 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt8ln\" (UniqueName: \"kubernetes.io/projected/57d359c9-5cec-42cf-8056-5141594658eb-kube-api-access-qt8ln\") pod \"keda-admission-cf49989db-bnm2t\" (UID: \"57d359c9-5cec-42cf-8056-5141594658eb\") " pod="openshift-keda/keda-admission-cf49989db-bnm2t" Apr 16 19:58:45.794573 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:45.794491 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mxnwd\" (UID: \"9ae08c77-4dbf-4ced-829e-3971549988b7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:45.794725 ip-10-0-138-142 kubenswrapper[2569]: E0416 
19:58:45.794650 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 19:58:45.794725 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:45.794672 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 19:58:45.794725 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:45.794698 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd: references non-existent secret key: tls.crt Apr 16 19:58:45.794884 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:45.794762 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-certificates podName:9ae08c77-4dbf-4ced-829e-3971549988b7 nodeName:}" failed. No retries permitted until 2026-04-16 19:58:46.79474204 +0000 UTC m=+290.995452098 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-certificates") pod "keda-metrics-apiserver-7c9f485588-mxnwd" (UID: "9ae08c77-4dbf-4ced-829e-3971549988b7") : references non-existent secret key: tls.crt Apr 16 19:58:46.098633 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:46.098552 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/57d359c9-5cec-42cf-8056-5141594658eb-certificates\") pod \"keda-admission-cf49989db-bnm2t\" (UID: \"57d359c9-5cec-42cf-8056-5141594658eb\") " pod="openshift-keda/keda-admission-cf49989db-bnm2t" Apr 16 19:58:46.101004 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:46.100979 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/57d359c9-5cec-42cf-8056-5141594658eb-certificates\") pod \"keda-admission-cf49989db-bnm2t\" (UID: \"57d359c9-5cec-42cf-8056-5141594658eb\") 
" pod="openshift-keda/keda-admission-cf49989db-bnm2t" Apr 16 19:58:46.340063 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:46.340030 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-bnm2t" Apr 16 19:58:46.468517 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:46.468489 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-bnm2t"] Apr 16 19:58:46.470398 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:58:46.470369 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57d359c9_5cec_42cf_8056_5141594658eb.slice/crio-16a0aad919f5b23548d7612e122b14767f62193e234c7abe04cf781ebed54066 WatchSource:0}: Error finding container 16a0aad919f5b23548d7612e122b14767f62193e234c7abe04cf781ebed54066: Status 404 returned error can't find the container with id 16a0aad919f5b23548d7612e122b14767f62193e234c7abe04cf781ebed54066 Apr 16 19:58:46.500776 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:46.500749 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-certificates\") pod \"keda-operator-ffbb595cb-nlkcs\" (UID: \"0ed8b4e3-9ed2-4665-901a-505bc23141b9\") " pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:46.500910 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:46.500891 2569 secret.go:281] references non-existent secret key: ca.crt Apr 16 19:58:46.500948 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:46.500913 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 19:58:46.500948 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:46.500923 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-nlkcs: references 
non-existent secret key: ca.crt Apr 16 19:58:46.501028 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:46.500975 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-certificates podName:0ed8b4e3-9ed2-4665-901a-505bc23141b9 nodeName:}" failed. No retries permitted until 2026-04-16 19:58:48.500955032 +0000 UTC m=+292.701665088 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-certificates") pod "keda-operator-ffbb595cb-nlkcs" (UID: "0ed8b4e3-9ed2-4665-901a-505bc23141b9") : references non-existent secret key: ca.crt Apr 16 19:58:46.803121 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:46.803082 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mxnwd\" (UID: \"9ae08c77-4dbf-4ced-829e-3971549988b7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:46.803299 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:46.803225 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 19:58:46.803299 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:46.803246 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 19:58:46.803299 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:46.803265 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd: references non-existent secret key: tls.crt Apr 16 19:58:46.803396 ip-10-0-138-142 kubenswrapper[2569]: E0416 19:58:46.803319 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-certificates 
podName:9ae08c77-4dbf-4ced-829e-3971549988b7 nodeName:}" failed. No retries permitted until 2026-04-16 19:58:48.803305388 +0000 UTC m=+293.004015444 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-certificates") pod "keda-metrics-apiserver-7c9f485588-mxnwd" (UID: "9ae08c77-4dbf-4ced-829e-3971549988b7") : references non-existent secret key: tls.crt Apr 16 19:58:47.304482 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:47.304439 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-bnm2t" event={"ID":"57d359c9-5cec-42cf-8056-5141594658eb","Type":"ContainerStarted","Data":"16a0aad919f5b23548d7612e122b14767f62193e234c7abe04cf781ebed54066"} Apr 16 19:58:48.308094 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:48.308053 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-bnm2t" event={"ID":"57d359c9-5cec-42cf-8056-5141594658eb","Type":"ContainerStarted","Data":"356eedf11734387d039589f11cd27a956c34ef965076f5b4fbd94031480228ee"} Apr 16 19:58:48.308476 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:48.308117 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-bnm2t" Apr 16 19:58:48.336978 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:48.336932 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-bnm2t" podStartSLOduration=1.986005412 podStartE2EDuration="3.336918454s" podCreationTimestamp="2026-04-16 19:58:45 +0000 UTC" firstStartedPulling="2026-04-16 19:58:46.471642707 +0000 UTC m=+290.672352763" lastFinishedPulling="2026-04-16 19:58:47.82255575 +0000 UTC m=+292.023265805" observedRunningTime="2026-04-16 19:58:48.335186336 +0000 UTC m=+292.535896568" watchObservedRunningTime="2026-04-16 19:58:48.336918454 +0000 UTC 
m=+292.537628535" Apr 16 19:58:48.518863 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:48.518814 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-certificates\") pod \"keda-operator-ffbb595cb-nlkcs\" (UID: \"0ed8b4e3-9ed2-4665-901a-505bc23141b9\") " pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:48.521304 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:48.521279 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0ed8b4e3-9ed2-4665-901a-505bc23141b9-certificates\") pod \"keda-operator-ffbb595cb-nlkcs\" (UID: \"0ed8b4e3-9ed2-4665-901a-505bc23141b9\") " pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:48.716318 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:48.716263 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:48.821620 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:48.821580 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mxnwd\" (UID: \"9ae08c77-4dbf-4ced-829e-3971549988b7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:48.824442 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:48.824410 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9ae08c77-4dbf-4ced-829e-3971549988b7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mxnwd\" (UID: \"9ae08c77-4dbf-4ced-829e-3971549988b7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:48.860297 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:48.860267 2569 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-nlkcs"] Apr 16 19:58:48.869506 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:58:48.869477 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed8b4e3_9ed2_4665_901a_505bc23141b9.slice/crio-d5c9a58064d1cd4056a6efdc78dbd6e8dda3a1ca7566b37aa8337025943e340c WatchSource:0}: Error finding container d5c9a58064d1cd4056a6efdc78dbd6e8dda3a1ca7566b37aa8337025943e340c: Status 404 returned error can't find the container with id d5c9a58064d1cd4056a6efdc78dbd6e8dda3a1ca7566b37aa8337025943e340c Apr 16 19:58:49.028234 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:49.028153 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:49.151209 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:49.151142 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd"] Apr 16 19:58:49.153148 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:58:49.153117 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ae08c77_4dbf_4ced_829e_3971549988b7.slice/crio-628b6e3cf9d13cb86a6341144eabaf26cbe906182e890eac6fbd64e23c4a4602 WatchSource:0}: Error finding container 628b6e3cf9d13cb86a6341144eabaf26cbe906182e890eac6fbd64e23c4a4602: Status 404 returned error can't find the container with id 628b6e3cf9d13cb86a6341144eabaf26cbe906182e890eac6fbd64e23c4a4602 Apr 16 19:58:49.311699 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:49.311625 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" event={"ID":"9ae08c77-4dbf-4ced-829e-3971549988b7","Type":"ContainerStarted","Data":"628b6e3cf9d13cb86a6341144eabaf26cbe906182e890eac6fbd64e23c4a4602"} Apr 16 19:58:49.312587 
ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:49.312563 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" event={"ID":"0ed8b4e3-9ed2-4665-901a-505bc23141b9","Type":"ContainerStarted","Data":"d5c9a58064d1cd4056a6efdc78dbd6e8dda3a1ca7566b37aa8337025943e340c"} Apr 16 19:58:53.325751 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:53.325706 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" event={"ID":"9ae08c77-4dbf-4ced-829e-3971549988b7","Type":"ContainerStarted","Data":"5e2779b88c09f74773e96899695d71b7f7a1a7aa0b91818ff789ac19a05a8ed3"} Apr 16 19:58:53.326229 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:53.325775 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:58:53.326994 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:53.326974 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" event={"ID":"0ed8b4e3-9ed2-4665-901a-505bc23141b9","Type":"ContainerStarted","Data":"d9caf4543534bfa0aef392126dea333d0b6cb270f695b0465c1823c1f41f4ace"} Apr 16 19:58:53.327091 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:53.327080 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:58:53.346499 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:53.346456 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" podStartSLOduration=4.864102505 podStartE2EDuration="8.346440762s" podCreationTimestamp="2026-04-16 19:58:45 +0000 UTC" firstStartedPulling="2026-04-16 19:58:49.154450424 +0000 UTC m=+293.355160481" lastFinishedPulling="2026-04-16 19:58:52.636788666 +0000 UTC m=+296.837498738" observedRunningTime="2026-04-16 19:58:53.345726434 +0000 UTC 
m=+297.546436525" watchObservedRunningTime="2026-04-16 19:58:53.346440762 +0000 UTC m=+297.547150844" Apr 16 19:58:53.364522 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:53.364486 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" podStartSLOduration=5.598370224 podStartE2EDuration="9.364471303s" podCreationTimestamp="2026-04-16 19:58:44 +0000 UTC" firstStartedPulling="2026-04-16 19:58:48.870963352 +0000 UTC m=+293.071673421" lastFinishedPulling="2026-04-16 19:58:52.63706443 +0000 UTC m=+296.837774500" observedRunningTime="2026-04-16 19:58:53.363384037 +0000 UTC m=+297.564094113" watchObservedRunningTime="2026-04-16 19:58:53.364471303 +0000 UTC m=+297.565181380" Apr 16 19:58:56.262453 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:56.262422 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 19:58:56.263432 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:56.263396 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 19:58:56.269138 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:58:56.269117 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 19:59:04.336032 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:04.335987 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mxnwd" Apr 16 19:59:05.299625 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:05.299592 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9rxjc" Apr 16 19:59:09.315062 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:09.315006 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-keda/keda-admission-cf49989db-bnm2t" Apr 16 19:59:14.334653 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:14.334620 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-nlkcs" Apr 16 19:59:39.577028 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.576984 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5"] Apr 16 19:59:39.587816 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.587782 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" Apr 16 19:59:39.590843 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.590819 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 19:59:39.590999 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.590821 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 19:59:39.591788 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.591770 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-g6nfj\"" Apr 16 19:59:39.597579 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.597558 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5"] Apr 16 19:59:39.717292 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.717253 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wnpr\" (UniqueName: \"kubernetes.io/projected/5d08cc2e-908a-40a1-b45f-0c6c59775a77-kube-api-access-8wnpr\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5\" (UID: 
\"5d08cc2e-908a-40a1-b45f-0c6c59775a77\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" Apr 16 19:59:39.717480 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.717316 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d08cc2e-908a-40a1-b45f-0c6c59775a77-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5\" (UID: \"5d08cc2e-908a-40a1-b45f-0c6c59775a77\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" Apr 16 19:59:39.717480 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.717346 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d08cc2e-908a-40a1-b45f-0c6c59775a77-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5\" (UID: \"5d08cc2e-908a-40a1-b45f-0c6c59775a77\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" Apr 16 19:59:39.818661 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.818620 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wnpr\" (UniqueName: \"kubernetes.io/projected/5d08cc2e-908a-40a1-b45f-0c6c59775a77-kube-api-access-8wnpr\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5\" (UID: \"5d08cc2e-908a-40a1-b45f-0c6c59775a77\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" Apr 16 19:59:39.818852 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.818683 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d08cc2e-908a-40a1-b45f-0c6c59775a77-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5\" (UID: 
\"5d08cc2e-908a-40a1-b45f-0c6c59775a77\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" Apr 16 19:59:39.818852 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.818713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d08cc2e-908a-40a1-b45f-0c6c59775a77-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5\" (UID: \"5d08cc2e-908a-40a1-b45f-0c6c59775a77\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" Apr 16 19:59:39.819143 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.819123 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d08cc2e-908a-40a1-b45f-0c6c59775a77-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5\" (UID: \"5d08cc2e-908a-40a1-b45f-0c6c59775a77\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" Apr 16 19:59:39.819143 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.819136 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d08cc2e-908a-40a1-b45f-0c6c59775a77-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5\" (UID: \"5d08cc2e-908a-40a1-b45f-0c6c59775a77\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" Apr 16 19:59:39.827989 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.827944 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wnpr\" (UniqueName: \"kubernetes.io/projected/5d08cc2e-908a-40a1-b45f-0c6c59775a77-kube-api-access-8wnpr\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5\" (UID: \"5d08cc2e-908a-40a1-b45f-0c6c59775a77\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" Apr 16 19:59:39.898001 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:39.897965 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" Apr 16 19:59:40.020874 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:40.020811 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5"] Apr 16 19:59:40.023114 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:59:40.023082 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d08cc2e_908a_40a1_b45f_0c6c59775a77.slice/crio-1cfcaa0541078d1c42dd8a20db787abf836c557b2ab1ef5316254d24b7eafed9 WatchSource:0}: Error finding container 1cfcaa0541078d1c42dd8a20db787abf836c557b2ab1ef5316254d24b7eafed9: Status 404 returned error can't find the container with id 1cfcaa0541078d1c42dd8a20db787abf836c557b2ab1ef5316254d24b7eafed9 Apr 16 19:59:40.024924 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:40.024906 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:59:40.466507 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:40.466474 2569 generic.go:358] "Generic (PLEG): container finished" podID="5d08cc2e-908a-40a1-b45f-0c6c59775a77" containerID="ba0243acd0be6f50676502e994811e59601e9a81bc879c7d4fc82eca0a7eb37a" exitCode=0 Apr 16 19:59:40.466685 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:40.466571 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" event={"ID":"5d08cc2e-908a-40a1-b45f-0c6c59775a77","Type":"ContainerDied","Data":"ba0243acd0be6f50676502e994811e59601e9a81bc879c7d4fc82eca0a7eb37a"} Apr 16 19:59:40.466685 
ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:40.466613 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" event={"ID":"5d08cc2e-908a-40a1-b45f-0c6c59775a77","Type":"ContainerStarted","Data":"1cfcaa0541078d1c42dd8a20db787abf836c557b2ab1ef5316254d24b7eafed9"} Apr 16 19:59:41.471634 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:41.471602 2569 generic.go:358] "Generic (PLEG): container finished" podID="5d08cc2e-908a-40a1-b45f-0c6c59775a77" containerID="1a23434fbdc05c928465a1efba7e0f43f00bb252a65ea386127f617a9d98b226" exitCode=0 Apr 16 19:59:41.472091 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:41.471654 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" event={"ID":"5d08cc2e-908a-40a1-b45f-0c6c59775a77","Type":"ContainerDied","Data":"1a23434fbdc05c928465a1efba7e0f43f00bb252a65ea386127f617a9d98b226"} Apr 16 19:59:42.475951 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:42.475911 2569 generic.go:358] "Generic (PLEG): container finished" podID="5d08cc2e-908a-40a1-b45f-0c6c59775a77" containerID="fb481acb587bdb16a46c5530b191f90096704e314d77b125dd018e4ccf3891bc" exitCode=0 Apr 16 19:59:42.476345 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:42.475961 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" event={"ID":"5d08cc2e-908a-40a1-b45f-0c6c59775a77","Type":"ContainerDied","Data":"fb481acb587bdb16a46c5530b191f90096704e314d77b125dd018e4ccf3891bc"} Apr 16 19:59:43.593765 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:43.593743 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" Apr 16 19:59:43.649888 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:43.649858 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d08cc2e-908a-40a1-b45f-0c6c59775a77-bundle\") pod \"5d08cc2e-908a-40a1-b45f-0c6c59775a77\" (UID: \"5d08cc2e-908a-40a1-b45f-0c6c59775a77\") " Apr 16 19:59:43.650078 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:43.649912 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d08cc2e-908a-40a1-b45f-0c6c59775a77-util\") pod \"5d08cc2e-908a-40a1-b45f-0c6c59775a77\" (UID: \"5d08cc2e-908a-40a1-b45f-0c6c59775a77\") " Apr 16 19:59:43.650078 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:43.649988 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wnpr\" (UniqueName: \"kubernetes.io/projected/5d08cc2e-908a-40a1-b45f-0c6c59775a77-kube-api-access-8wnpr\") pod \"5d08cc2e-908a-40a1-b45f-0c6c59775a77\" (UID: \"5d08cc2e-908a-40a1-b45f-0c6c59775a77\") " Apr 16 19:59:43.655788 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:43.650967 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d08cc2e-908a-40a1-b45f-0c6c59775a77-bundle" (OuterVolumeSpecName: "bundle") pod "5d08cc2e-908a-40a1-b45f-0c6c59775a77" (UID: "5d08cc2e-908a-40a1-b45f-0c6c59775a77"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:59:43.655788 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:43.652832 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d08cc2e-908a-40a1-b45f-0c6c59775a77-kube-api-access-8wnpr" (OuterVolumeSpecName: "kube-api-access-8wnpr") pod "5d08cc2e-908a-40a1-b45f-0c6c59775a77" (UID: "5d08cc2e-908a-40a1-b45f-0c6c59775a77"). InnerVolumeSpecName "kube-api-access-8wnpr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:59:43.660844 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:43.660821 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d08cc2e-908a-40a1-b45f-0c6c59775a77-util" (OuterVolumeSpecName: "util") pod "5d08cc2e-908a-40a1-b45f-0c6c59775a77" (UID: "5d08cc2e-908a-40a1-b45f-0c6c59775a77"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:59:43.751220 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:43.751139 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8wnpr\" (UniqueName: \"kubernetes.io/projected/5d08cc2e-908a-40a1-b45f-0c6c59775a77-kube-api-access-8wnpr\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:59:43.751220 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:43.751171 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d08cc2e-908a-40a1-b45f-0c6c59775a77-bundle\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:59:43.751220 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:43.751185 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d08cc2e-908a-40a1-b45f-0c6c59775a77-util\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 19:59:44.483096 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:44.483056 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" event={"ID":"5d08cc2e-908a-40a1-b45f-0c6c59775a77","Type":"ContainerDied","Data":"1cfcaa0541078d1c42dd8a20db787abf836c557b2ab1ef5316254d24b7eafed9"} Apr 16 19:59:44.483096 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:44.483091 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s6sq5" Apr 16 19:59:44.483096 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:44.483100 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cfcaa0541078d1c42dd8a20db787abf836c557b2ab1ef5316254d24b7eafed9" Apr 16 19:59:52.044522 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.044445 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s27g8"] Apr 16 19:59:52.044960 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.044718 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d08cc2e-908a-40a1-b45f-0c6c59775a77" containerName="util" Apr 16 19:59:52.044960 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.044728 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d08cc2e-908a-40a1-b45f-0c6c59775a77" containerName="util" Apr 16 19:59:52.044960 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.044737 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d08cc2e-908a-40a1-b45f-0c6c59775a77" containerName="pull" Apr 16 19:59:52.044960 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.044744 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d08cc2e-908a-40a1-b45f-0c6c59775a77" containerName="pull" Apr 16 19:59:52.044960 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.044758 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5d08cc2e-908a-40a1-b45f-0c6c59775a77" containerName="extract" Apr 16 19:59:52.044960 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.044763 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d08cc2e-908a-40a1-b45f-0c6c59775a77" containerName="extract" Apr 16 19:59:52.044960 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.044808 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d08cc2e-908a-40a1-b45f-0c6c59775a77" containerName="extract" Apr 16 19:59:52.047778 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.047761 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s27g8" Apr 16 19:59:52.050466 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.050447 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 19:59:52.051605 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.051589 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-mdmtz\"" Apr 16 19:59:52.054130 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.054114 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:59:52.079345 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.079320 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s27g8"] Apr 16 19:59:52.223756 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.223709 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b7e966b9-45d2-4020-a475-6b0c60e4ee38-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-s27g8\" (UID: 
\"b7e966b9-45d2-4020-a475-6b0c60e4ee38\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s27g8" Apr 16 19:59:52.223756 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.223767 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dwqz\" (UniqueName: \"kubernetes.io/projected/b7e966b9-45d2-4020-a475-6b0c60e4ee38-kube-api-access-8dwqz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-s27g8\" (UID: \"b7e966b9-45d2-4020-a475-6b0c60e4ee38\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s27g8" Apr 16 19:59:52.325217 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.325125 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dwqz\" (UniqueName: \"kubernetes.io/projected/b7e966b9-45d2-4020-a475-6b0c60e4ee38-kube-api-access-8dwqz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-s27g8\" (UID: \"b7e966b9-45d2-4020-a475-6b0c60e4ee38\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s27g8" Apr 16 19:59:52.325354 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.325239 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b7e966b9-45d2-4020-a475-6b0c60e4ee38-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-s27g8\" (UID: \"b7e966b9-45d2-4020-a475-6b0c60e4ee38\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s27g8" Apr 16 19:59:52.325604 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.325584 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b7e966b9-45d2-4020-a475-6b0c60e4ee38-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-s27g8\" (UID: \"b7e966b9-45d2-4020-a475-6b0c60e4ee38\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s27g8" Apr 16 19:59:52.337375 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.337353 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dwqz\" (UniqueName: \"kubernetes.io/projected/b7e966b9-45d2-4020-a475-6b0c60e4ee38-kube-api-access-8dwqz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-s27g8\" (UID: \"b7e966b9-45d2-4020-a475-6b0c60e4ee38\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s27g8" Apr 16 19:59:52.356348 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.356319 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s27g8" Apr 16 19:59:52.484006 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.483965 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s27g8"] Apr 16 19:59:52.486340 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:59:52.486306 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e966b9_45d2_4020_a475_6b0c60e4ee38.slice/crio-667e52ba4d3bcc5f1049bec9d300333845b6bc565a2e2853d49437bfe5a53c76 WatchSource:0}: Error finding container 667e52ba4d3bcc5f1049bec9d300333845b6bc565a2e2853d49437bfe5a53c76: Status 404 returned error can't find the container with id 667e52ba4d3bcc5f1049bec9d300333845b6bc565a2e2853d49437bfe5a53c76 Apr 16 19:59:52.507183 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:52.507150 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s27g8" event={"ID":"b7e966b9-45d2-4020-a475-6b0c60e4ee38","Type":"ContainerStarted","Data":"667e52ba4d3bcc5f1049bec9d300333845b6bc565a2e2853d49437bfe5a53c76"} Apr 16 19:59:54.516160 
ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:54.516122 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s27g8" event={"ID":"b7e966b9-45d2-4020-a475-6b0c60e4ee38","Type":"ContainerStarted","Data":"ee0c9186a7bade79388d5e258941d39783ab9df13bc8e94346c8d812049933fa"} Apr 16 19:59:54.538874 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:54.538821 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s27g8" podStartSLOduration=0.649331589 podStartE2EDuration="2.53880616s" podCreationTimestamp="2026-04-16 19:59:52 +0000 UTC" firstStartedPulling="2026-04-16 19:59:52.488811431 +0000 UTC m=+356.689521487" lastFinishedPulling="2026-04-16 19:59:54.378286 +0000 UTC m=+358.578996058" observedRunningTime="2026-04-16 19:59:54.535918039 +0000 UTC m=+358.736628117" watchObservedRunningTime="2026-04-16 19:59:54.53880616 +0000 UTC m=+358.739516240" Apr 16 19:59:55.969100 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:55.969066 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx"] Apr 16 19:59:55.972537 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:55.972519 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" Apr 16 19:59:55.976325 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:55.975755 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-g6nfj\"" Apr 16 19:59:55.976325 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:55.975755 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 19:59:55.976325 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:55.975762 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 19:59:55.984031 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:55.983989 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx"] Apr 16 19:59:56.054910 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:56.054870 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlq52\" (UniqueName: \"kubernetes.io/projected/94d4c6ee-a040-4445-a3cb-3deb9f917640-kube-api-access-nlq52\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx\" (UID: \"94d4c6ee-a040-4445-a3cb-3deb9f917640\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" Apr 16 19:59:56.055117 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:56.054930 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94d4c6ee-a040-4445-a3cb-3deb9f917640-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx\" (UID: \"94d4c6ee-a040-4445-a3cb-3deb9f917640\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" Apr 16 19:59:56.055117 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:56.054964 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94d4c6ee-a040-4445-a3cb-3deb9f917640-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx\" (UID: \"94d4c6ee-a040-4445-a3cb-3deb9f917640\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" Apr 16 19:59:56.156407 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:56.156366 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94d4c6ee-a040-4445-a3cb-3deb9f917640-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx\" (UID: \"94d4c6ee-a040-4445-a3cb-3deb9f917640\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" Apr 16 19:59:56.156577 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:56.156464 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlq52\" (UniqueName: \"kubernetes.io/projected/94d4c6ee-a040-4445-a3cb-3deb9f917640-kube-api-access-nlq52\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx\" (UID: \"94d4c6ee-a040-4445-a3cb-3deb9f917640\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" Apr 16 19:59:56.156620 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:56.156596 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94d4c6ee-a040-4445-a3cb-3deb9f917640-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx\" (UID: \"94d4c6ee-a040-4445-a3cb-3deb9f917640\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" Apr 16 19:59:56.156825 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:56.156805 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94d4c6ee-a040-4445-a3cb-3deb9f917640-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx\" (UID: \"94d4c6ee-a040-4445-a3cb-3deb9f917640\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" Apr 16 19:59:56.156912 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:56.156895 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94d4c6ee-a040-4445-a3cb-3deb9f917640-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx\" (UID: \"94d4c6ee-a040-4445-a3cb-3deb9f917640\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" Apr 16 19:59:56.165547 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:56.165517 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlq52\" (UniqueName: \"kubernetes.io/projected/94d4c6ee-a040-4445-a3cb-3deb9f917640-kube-api-access-nlq52\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx\" (UID: \"94d4c6ee-a040-4445-a3cb-3deb9f917640\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" Apr 16 19:59:56.288707 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:56.288628 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-g6nfj\"" Apr 16 19:59:56.297000 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:56.296972 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" Apr 16 19:59:56.438997 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:56.438967 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx"] Apr 16 19:59:56.440580 ip-10-0-138-142 kubenswrapper[2569]: W0416 19:59:56.440551 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94d4c6ee_a040_4445_a3cb_3deb9f917640.slice/crio-822e6a8c559e4b34e674ca552485188167fc2066924ed78385df033eb4d648f3 WatchSource:0}: Error finding container 822e6a8c559e4b34e674ca552485188167fc2066924ed78385df033eb4d648f3: Status 404 returned error can't find the container with id 822e6a8c559e4b34e674ca552485188167fc2066924ed78385df033eb4d648f3 Apr 16 19:59:56.524411 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:56.524374 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" event={"ID":"94d4c6ee-a040-4445-a3cb-3deb9f917640","Type":"ContainerStarted","Data":"86f86f92f481fdbbcdea8eb39bf053e03c59b291348bf650cc58ff6e8494c67e"} Apr 16 19:59:56.524411 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:56.524418 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" event={"ID":"94d4c6ee-a040-4445-a3cb-3deb9f917640","Type":"ContainerStarted","Data":"822e6a8c559e4b34e674ca552485188167fc2066924ed78385df033eb4d648f3"} Apr 16 19:59:57.528608 ip-10-0-138-142 kubenswrapper[2569]: I0416 19:59:57.528530 2569 generic.go:358] "Generic (PLEG): container finished" podID="94d4c6ee-a040-4445-a3cb-3deb9f917640" containerID="86f86f92f481fdbbcdea8eb39bf053e03c59b291348bf650cc58ff6e8494c67e" exitCode=0 Apr 16 19:59:57.528952 ip-10-0-138-142 
kubenswrapper[2569]: I0416 19:59:57.528613 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" event={"ID":"94d4c6ee-a040-4445-a3cb-3deb9f917640","Type":"ContainerDied","Data":"86f86f92f481fdbbcdea8eb39bf053e03c59b291348bf650cc58ff6e8494c67e"} Apr 16 20:00:00.539338 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:00.539299 2569 generic.go:358] "Generic (PLEG): container finished" podID="94d4c6ee-a040-4445-a3cb-3deb9f917640" containerID="d89e29d0def44540429ead35168c7f403168d9d9fc78777bb85f35262a0b01ee" exitCode=0 Apr 16 20:00:00.539720 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:00.539384 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" event={"ID":"94d4c6ee-a040-4445-a3cb-3deb9f917640","Type":"ContainerDied","Data":"d89e29d0def44540429ead35168c7f403168d9d9fc78777bb85f35262a0b01ee"} Apr 16 20:00:01.546755 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:01.546722 2569 generic.go:358] "Generic (PLEG): container finished" podID="94d4c6ee-a040-4445-a3cb-3deb9f917640" containerID="5f317691222b45ebefc1f1f9d20d910c03ff39218c4408de66ad0586d7c041cc" exitCode=0 Apr 16 20:00:01.547138 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:01.546824 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" event={"ID":"94d4c6ee-a040-4445-a3cb-3deb9f917640","Type":"ContainerDied","Data":"5f317691222b45ebefc1f1f9d20d910c03ff39218c4408de66ad0586d7c041cc"} Apr 16 20:00:02.667775 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:02.667754 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" Apr 16 20:00:02.819216 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:02.819121 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94d4c6ee-a040-4445-a3cb-3deb9f917640-bundle\") pod \"94d4c6ee-a040-4445-a3cb-3deb9f917640\" (UID: \"94d4c6ee-a040-4445-a3cb-3deb9f917640\") " Apr 16 20:00:02.819216 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:02.819211 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlq52\" (UniqueName: \"kubernetes.io/projected/94d4c6ee-a040-4445-a3cb-3deb9f917640-kube-api-access-nlq52\") pod \"94d4c6ee-a040-4445-a3cb-3deb9f917640\" (UID: \"94d4c6ee-a040-4445-a3cb-3deb9f917640\") " Apr 16 20:00:02.819462 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:02.819233 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94d4c6ee-a040-4445-a3cb-3deb9f917640-util\") pod \"94d4c6ee-a040-4445-a3cb-3deb9f917640\" (UID: \"94d4c6ee-a040-4445-a3cb-3deb9f917640\") " Apr 16 20:00:02.819600 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:02.819574 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d4c6ee-a040-4445-a3cb-3deb9f917640-bundle" (OuterVolumeSpecName: "bundle") pod "94d4c6ee-a040-4445-a3cb-3deb9f917640" (UID: "94d4c6ee-a040-4445-a3cb-3deb9f917640"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:00:02.821324 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:02.821301 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94d4c6ee-a040-4445-a3cb-3deb9f917640-kube-api-access-nlq52" (OuterVolumeSpecName: "kube-api-access-nlq52") pod "94d4c6ee-a040-4445-a3cb-3deb9f917640" (UID: "94d4c6ee-a040-4445-a3cb-3deb9f917640"). InnerVolumeSpecName "kube-api-access-nlq52". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:00:02.823466 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:02.823439 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d4c6ee-a040-4445-a3cb-3deb9f917640-util" (OuterVolumeSpecName: "util") pod "94d4c6ee-a040-4445-a3cb-3deb9f917640" (UID: "94d4c6ee-a040-4445-a3cb-3deb9f917640"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:00:02.919920 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:02.919879 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94d4c6ee-a040-4445-a3cb-3deb9f917640-bundle\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:00:02.919920 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:02.919916 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nlq52\" (UniqueName: \"kubernetes.io/projected/94d4c6ee-a040-4445-a3cb-3deb9f917640-kube-api-access-nlq52\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:00:02.920142 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:02.919931 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94d4c6ee-a040-4445-a3cb-3deb9f917640-util\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:00:03.555247 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:03.555215 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" event={"ID":"94d4c6ee-a040-4445-a3cb-3deb9f917640","Type":"ContainerDied","Data":"822e6a8c559e4b34e674ca552485188167fc2066924ed78385df033eb4d648f3"} Apr 16 20:00:03.555247 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:03.555250 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="822e6a8c559e4b34e674ca552485188167fc2066924ed78385df033eb4d648f3" Apr 16 20:00:03.555452 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:03.555268 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f62hfx" Apr 16 20:00:07.794068 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.794031 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-zxhxq"] Apr 16 20:00:07.794445 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.794313 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94d4c6ee-a040-4445-a3cb-3deb9f917640" containerName="extract" Apr 16 20:00:07.794445 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.794323 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d4c6ee-a040-4445-a3cb-3deb9f917640" containerName="extract" Apr 16 20:00:07.794445 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.794335 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94d4c6ee-a040-4445-a3cb-3deb9f917640" containerName="pull" Apr 16 20:00:07.794445 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.794340 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d4c6ee-a040-4445-a3cb-3deb9f917640" containerName="pull" Apr 16 20:00:07.794445 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.794357 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="94d4c6ee-a040-4445-a3cb-3deb9f917640" containerName="util" Apr 16 20:00:07.794445 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.794364 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d4c6ee-a040-4445-a3cb-3deb9f917640" containerName="util" Apr 16 20:00:07.794445 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.794411 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="94d4c6ee-a040-4445-a3cb-3deb9f917640" containerName="extract" Apr 16 20:00:07.796919 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.796901 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-zxhxq" Apr 16 20:00:07.799408 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.799383 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 20:00:07.799408 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.799401 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 20:00:07.800235 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.800218 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-v62bc\"" Apr 16 20:00:07.805220 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.805192 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-zxhxq"] Apr 16 20:00:07.858170 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.858143 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8xp5\" (UniqueName: \"kubernetes.io/projected/87c46364-344c-44d0-a2b0-93d630b49986-kube-api-access-r8xp5\") pod \"cert-manager-759f64656b-zxhxq\" (UID: \"87c46364-344c-44d0-a2b0-93d630b49986\") " pod="cert-manager/cert-manager-759f64656b-zxhxq" Apr 16 20:00:07.858282 ip-10-0-138-142 
kubenswrapper[2569]: I0416 20:00:07.858203 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/87c46364-344c-44d0-a2b0-93d630b49986-bound-sa-token\") pod \"cert-manager-759f64656b-zxhxq\" (UID: \"87c46364-344c-44d0-a2b0-93d630b49986\") " pod="cert-manager/cert-manager-759f64656b-zxhxq" Apr 16 20:00:07.959494 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.959457 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xp5\" (UniqueName: \"kubernetes.io/projected/87c46364-344c-44d0-a2b0-93d630b49986-kube-api-access-r8xp5\") pod \"cert-manager-759f64656b-zxhxq\" (UID: \"87c46364-344c-44d0-a2b0-93d630b49986\") " pod="cert-manager/cert-manager-759f64656b-zxhxq" Apr 16 20:00:07.959650 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.959546 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/87c46364-344c-44d0-a2b0-93d630b49986-bound-sa-token\") pod \"cert-manager-759f64656b-zxhxq\" (UID: \"87c46364-344c-44d0-a2b0-93d630b49986\") " pod="cert-manager/cert-manager-759f64656b-zxhxq" Apr 16 20:00:07.969301 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.969257 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/87c46364-344c-44d0-a2b0-93d630b49986-bound-sa-token\") pod \"cert-manager-759f64656b-zxhxq\" (UID: \"87c46364-344c-44d0-a2b0-93d630b49986\") " pod="cert-manager/cert-manager-759f64656b-zxhxq" Apr 16 20:00:07.969423 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:07.969316 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8xp5\" (UniqueName: \"kubernetes.io/projected/87c46364-344c-44d0-a2b0-93d630b49986-kube-api-access-r8xp5\") pod \"cert-manager-759f64656b-zxhxq\" (UID: 
\"87c46364-344c-44d0-a2b0-93d630b49986\") " pod="cert-manager/cert-manager-759f64656b-zxhxq" Apr 16 20:00:08.123079 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:08.122967 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-zxhxq" Apr 16 20:00:08.239741 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:08.239716 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-zxhxq"] Apr 16 20:00:08.242072 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:00:08.241998 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87c46364_344c_44d0_a2b0_93d630b49986.slice/crio-55bfc755be5fa73816839fbe1800d4af03c9f4f9870ee8a830aa5f19dc701235 WatchSource:0}: Error finding container 55bfc755be5fa73816839fbe1800d4af03c9f4f9870ee8a830aa5f19dc701235: Status 404 returned error can't find the container with id 55bfc755be5fa73816839fbe1800d4af03c9f4f9870ee8a830aa5f19dc701235 Apr 16 20:00:08.570689 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:08.570655 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-zxhxq" event={"ID":"87c46364-344c-44d0-a2b0-93d630b49986","Type":"ContainerStarted","Data":"55bfc755be5fa73816839fbe1800d4af03c9f4f9870ee8a830aa5f19dc701235"} Apr 16 20:00:14.592619 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:14.592531 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-zxhxq" event={"ID":"87c46364-344c-44d0-a2b0-93d630b49986","Type":"ContainerStarted","Data":"2d896f699d27c8cea05881c6fda6ae0f9df7086f3c568d9d0370eb3a6cee09bc"} Apr 16 20:00:26.471988 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.471928 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-zxhxq" podStartSLOduration=13.46780365 podStartE2EDuration="19.471911013s" 
podCreationTimestamp="2026-04-16 20:00:07 +0000 UTC" firstStartedPulling="2026-04-16 20:00:08.243815304 +0000 UTC m=+372.444525360" lastFinishedPulling="2026-04-16 20:00:14.247922667 +0000 UTC m=+378.448632723" observedRunningTime="2026-04-16 20:00:14.608658169 +0000 UTC m=+378.809368246" watchObservedRunningTime="2026-04-16 20:00:26.471911013 +0000 UTC m=+390.672621090"
Apr 16 20:00:26.472401 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.472361 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"]
Apr 16 20:00:26.475679 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.475662 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"
Apr 16 20:00:26.478185 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.478158 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 20:00:26.478294 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.478167 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 20:00:26.479077 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.479055 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-g6nfj\""
Apr 16 20:00:26.483136 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.482806 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"]
Apr 16 20:00:26.614714 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.614660 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6wm8\" (UniqueName: 
\"kubernetes.io/projected/6f308108-d341-495c-984a-385dded898eb-kube-api-access-s6wm8\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh\" (UID: \"6f308108-d341-495c-984a-385dded898eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"
Apr 16 20:00:26.614922 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.614750 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f308108-d341-495c-984a-385dded898eb-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh\" (UID: \"6f308108-d341-495c-984a-385dded898eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"
Apr 16 20:00:26.614922 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.614802 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f308108-d341-495c-984a-385dded898eb-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh\" (UID: \"6f308108-d341-495c-984a-385dded898eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"
Apr 16 20:00:26.715514 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.715474 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f308108-d341-495c-984a-385dded898eb-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh\" (UID: \"6f308108-d341-495c-984a-385dded898eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"
Apr 16 20:00:26.715703 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.715541 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6wm8\" (UniqueName: 
\"kubernetes.io/projected/6f308108-d341-495c-984a-385dded898eb-kube-api-access-s6wm8\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh\" (UID: \"6f308108-d341-495c-984a-385dded898eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"
Apr 16 20:00:26.715703 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.715580 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f308108-d341-495c-984a-385dded898eb-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh\" (UID: \"6f308108-d341-495c-984a-385dded898eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"
Apr 16 20:00:26.715900 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.715877 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f308108-d341-495c-984a-385dded898eb-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh\" (UID: \"6f308108-d341-495c-984a-385dded898eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"
Apr 16 20:00:26.715963 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.715900 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f308108-d341-495c-984a-385dded898eb-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh\" (UID: \"6f308108-d341-495c-984a-385dded898eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"
Apr 16 20:00:26.731379 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.731312 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6wm8\" (UniqueName: \"kubernetes.io/projected/6f308108-d341-495c-984a-385dded898eb-kube-api-access-s6wm8\") pod 
\"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh\" (UID: \"6f308108-d341-495c-984a-385dded898eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"
Apr 16 20:00:26.786004 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.785954 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"
Apr 16 20:00:26.908080 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:26.908051 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"]
Apr 16 20:00:26.909992 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:00:26.909965 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f308108_d341_495c_984a_385dded898eb.slice/crio-0b37c5f5f37b4a9867d4b3269c1c74f2b1fd242d525e5c91596f2de2f3e42dcf WatchSource:0}: Error finding container 0b37c5f5f37b4a9867d4b3269c1c74f2b1fd242d525e5c91596f2de2f3e42dcf: Status 404 returned error can't find the container with id 0b37c5f5f37b4a9867d4b3269c1c74f2b1fd242d525e5c91596f2de2f3e42dcf
Apr 16 20:00:27.636879 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:27.636844 2569 generic.go:358] "Generic (PLEG): container finished" podID="6f308108-d341-495c-984a-385dded898eb" containerID="6f8cf0c2e54d7cf9a43ed41b57de0e4500c5ef92cc439b07c3ba3d53e840b2ec" exitCode=0
Apr 16 20:00:27.637284 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:27.636937 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh" event={"ID":"6f308108-d341-495c-984a-385dded898eb","Type":"ContainerDied","Data":"6f8cf0c2e54d7cf9a43ed41b57de0e4500c5ef92cc439b07c3ba3d53e840b2ec"}
Apr 16 20:00:27.637284 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:27.636977 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh" event={"ID":"6f308108-d341-495c-984a-385dded898eb","Type":"ContainerStarted","Data":"0b37c5f5f37b4a9867d4b3269c1c74f2b1fd242d525e5c91596f2de2f3e42dcf"}
Apr 16 20:00:29.645773 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:29.645743 2569 generic.go:358] "Generic (PLEG): container finished" podID="6f308108-d341-495c-984a-385dded898eb" containerID="bec50e41615603f18b7105f130cb5c304fa7920d9b2c52e2aba1969ea85e80bb" exitCode=0
Apr 16 20:00:29.646225 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:29.645782 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh" event={"ID":"6f308108-d341-495c-984a-385dded898eb","Type":"ContainerDied","Data":"bec50e41615603f18b7105f130cb5c304fa7920d9b2c52e2aba1969ea85e80bb"}
Apr 16 20:00:30.655780 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:30.655744 2569 generic.go:358] "Generic (PLEG): container finished" podID="6f308108-d341-495c-984a-385dded898eb" containerID="9f79ec34df99aec1b5f6a1d380866785bd2cca2b8ff223f0bc11b29127d55ba3" exitCode=0
Apr 16 20:00:30.656173 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:30.655803 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh" event={"ID":"6f308108-d341-495c-984a-385dded898eb","Type":"ContainerDied","Data":"9f79ec34df99aec1b5f6a1d380866785bd2cca2b8ff223f0bc11b29127d55ba3"}
Apr 16 20:00:31.778852 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:31.778827 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"
Apr 16 20:00:31.863625 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:31.863590 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f308108-d341-495c-984a-385dded898eb-util\") pod \"6f308108-d341-495c-984a-385dded898eb\" (UID: \"6f308108-d341-495c-984a-385dded898eb\") "
Apr 16 20:00:31.863865 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:31.863653 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f308108-d341-495c-984a-385dded898eb-bundle\") pod \"6f308108-d341-495c-984a-385dded898eb\" (UID: \"6f308108-d341-495c-984a-385dded898eb\") "
Apr 16 20:00:31.863865 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:31.863686 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6wm8\" (UniqueName: \"kubernetes.io/projected/6f308108-d341-495c-984a-385dded898eb-kube-api-access-s6wm8\") pod \"6f308108-d341-495c-984a-385dded898eb\" (UID: \"6f308108-d341-495c-984a-385dded898eb\") "
Apr 16 20:00:31.864417 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:31.864387 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f308108-d341-495c-984a-385dded898eb-bundle" (OuterVolumeSpecName: "bundle") pod "6f308108-d341-495c-984a-385dded898eb" (UID: "6f308108-d341-495c-984a-385dded898eb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:00:31.865722 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:31.865696 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f308108-d341-495c-984a-385dded898eb-kube-api-access-s6wm8" (OuterVolumeSpecName: "kube-api-access-s6wm8") pod "6f308108-d341-495c-984a-385dded898eb" (UID: "6f308108-d341-495c-984a-385dded898eb"). InnerVolumeSpecName "kube-api-access-s6wm8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:00:31.869227 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:31.869190 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f308108-d341-495c-984a-385dded898eb-util" (OuterVolumeSpecName: "util") pod "6f308108-d341-495c-984a-385dded898eb" (UID: "6f308108-d341-495c-984a-385dded898eb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:00:31.964782 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:31.964693 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f308108-d341-495c-984a-385dded898eb-util\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:00:31.964782 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:31.964727 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f308108-d341-495c-984a-385dded898eb-bundle\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:00:31.964782 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:31.964738 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s6wm8\" (UniqueName: \"kubernetes.io/projected/6f308108-d341-495c-984a-385dded898eb-kube-api-access-s6wm8\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:00:32.663918 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:32.663886 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh" event={"ID":"6f308108-d341-495c-984a-385dded898eb","Type":"ContainerDied","Data":"0b37c5f5f37b4a9867d4b3269c1c74f2b1fd242d525e5c91596f2de2f3e42dcf"}
Apr 16 20:00:32.663918 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:32.663922 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b37c5f5f37b4a9867d4b3269c1c74f2b1fd242d525e5c91596f2de2f3e42dcf"
Apr 16 20:00:32.664128 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:32.663935 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pqmrh"
Apr 16 20:00:41.583475 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.583442 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"]
Apr 16 20:00:41.583841 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.583727 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f308108-d341-495c-984a-385dded898eb" containerName="extract"
Apr 16 20:00:41.583841 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.583737 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f308108-d341-495c-984a-385dded898eb" containerName="extract"
Apr 16 20:00:41.583841 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.583750 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f308108-d341-495c-984a-385dded898eb" containerName="pull"
Apr 16 20:00:41.583841 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.583756 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f308108-d341-495c-984a-385dded898eb" containerName="pull"
Apr 16 20:00:41.583841 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.583766 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="6f308108-d341-495c-984a-385dded898eb" containerName="util"
Apr 16 20:00:41.583841 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.583771 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f308108-d341-495c-984a-385dded898eb" containerName="util"
Apr 16 20:00:41.583841 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.583816 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f308108-d341-495c-984a-385dded898eb" containerName="extract"
Apr 16 20:00:41.587947 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.587928 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"
Apr 16 20:00:41.592718 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.592695 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-g6nfj\""
Apr 16 20:00:41.592837 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.592758 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 20:00:41.592837 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.592799 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 20:00:41.606271 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.606250 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"]
Apr 16 20:00:41.747258 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.747225 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6acd1451-8d5e-424c-a81c-aaa49320f02b-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr\" (UID: 
\"6acd1451-8d5e-424c-a81c-aaa49320f02b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"
Apr 16 20:00:41.747456 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.747288 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6acd1451-8d5e-424c-a81c-aaa49320f02b-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr\" (UID: \"6acd1451-8d5e-424c-a81c-aaa49320f02b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"
Apr 16 20:00:41.747456 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.747344 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rglq9\" (UniqueName: \"kubernetes.io/projected/6acd1451-8d5e-424c-a81c-aaa49320f02b-kube-api-access-rglq9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr\" (UID: \"6acd1451-8d5e-424c-a81c-aaa49320f02b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"
Apr 16 20:00:41.848547 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.848460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6acd1451-8d5e-424c-a81c-aaa49320f02b-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr\" (UID: \"6acd1451-8d5e-424c-a81c-aaa49320f02b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"
Apr 16 20:00:41.848547 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.848509 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6acd1451-8d5e-424c-a81c-aaa49320f02b-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr\" (UID: 
\"6acd1451-8d5e-424c-a81c-aaa49320f02b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"
Apr 16 20:00:41.848547 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.848543 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rglq9\" (UniqueName: \"kubernetes.io/projected/6acd1451-8d5e-424c-a81c-aaa49320f02b-kube-api-access-rglq9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr\" (UID: \"6acd1451-8d5e-424c-a81c-aaa49320f02b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"
Apr 16 20:00:41.848819 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.848799 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6acd1451-8d5e-424c-a81c-aaa49320f02b-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr\" (UID: \"6acd1451-8d5e-424c-a81c-aaa49320f02b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"
Apr 16 20:00:41.848853 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.848839 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6acd1451-8d5e-424c-a81c-aaa49320f02b-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr\" (UID: \"6acd1451-8d5e-424c-a81c-aaa49320f02b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"
Apr 16 20:00:41.861502 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.861478 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rglq9\" (UniqueName: \"kubernetes.io/projected/6acd1451-8d5e-424c-a81c-aaa49320f02b-kube-api-access-rglq9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr\" (UID: \"6acd1451-8d5e-424c-a81c-aaa49320f02b\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"
Apr 16 20:00:41.896448 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:41.896426 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"
Apr 16 20:00:42.039559 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:42.039522 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"]
Apr 16 20:00:42.045905 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:00:42.045855 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6acd1451_8d5e_424c_a81c_aaa49320f02b.slice/crio-8e62635d0688eb857665068b78cd570ff8e8a55031168382bba84cb17baa1583 WatchSource:0}: Error finding container 8e62635d0688eb857665068b78cd570ff8e8a55031168382bba84cb17baa1583: Status 404 returned error can't find the container with id 8e62635d0688eb857665068b78cd570ff8e8a55031168382bba84cb17baa1583
Apr 16 20:00:42.698646 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:42.698611 2569 generic.go:358] "Generic (PLEG): container finished" podID="6acd1451-8d5e-424c-a81c-aaa49320f02b" containerID="f45a49cd77a23d0db44d50a0b68c553628b1df1d73bffb6600e9c453f3245020" exitCode=0
Apr 16 20:00:42.699043 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:42.698668 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr" event={"ID":"6acd1451-8d5e-424c-a81c-aaa49320f02b","Type":"ContainerDied","Data":"f45a49cd77a23d0db44d50a0b68c553628b1df1d73bffb6600e9c453f3245020"}
Apr 16 20:00:42.699043 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:42.698711 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr" event={"ID":"6acd1451-8d5e-424c-a81c-aaa49320f02b","Type":"ContainerStarted","Data":"8e62635d0688eb857665068b78cd570ff8e8a55031168382bba84cb17baa1583"}
Apr 16 20:00:42.792144 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:42.792110 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-jm679"]
Apr 16 20:00:42.795265 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:42.795247 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jm679"
Apr 16 20:00:42.797713 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:42.797692 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 16 20:00:42.797897 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:42.797878 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-x5s4w\""
Apr 16 20:00:42.797968 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:42.797902 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 16 20:00:42.808087 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:42.808055 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-jm679"]
Apr 16 20:00:42.957047 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:42.956953 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6d52f4e3-bb83-41df-8b11-e82731bb2bd2-operator-config\") pod \"servicemesh-operator3-55f49c5f94-jm679\" (UID: \"6d52f4e3-bb83-41df-8b11-e82731bb2bd2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jm679"
Apr 16 
20:00:42.957047 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:42.957029 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwqlx\" (UniqueName: \"kubernetes.io/projected/6d52f4e3-bb83-41df-8b11-e82731bb2bd2-kube-api-access-kwqlx\") pod \"servicemesh-operator3-55f49c5f94-jm679\" (UID: \"6d52f4e3-bb83-41df-8b11-e82731bb2bd2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jm679"
Apr 16 20:00:43.058419 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:43.058385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6d52f4e3-bb83-41df-8b11-e82731bb2bd2-operator-config\") pod \"servicemesh-operator3-55f49c5f94-jm679\" (UID: \"6d52f4e3-bb83-41df-8b11-e82731bb2bd2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jm679"
Apr 16 20:00:43.058567 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:43.058443 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwqlx\" (UniqueName: \"kubernetes.io/projected/6d52f4e3-bb83-41df-8b11-e82731bb2bd2-kube-api-access-kwqlx\") pod \"servicemesh-operator3-55f49c5f94-jm679\" (UID: \"6d52f4e3-bb83-41df-8b11-e82731bb2bd2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jm679"
Apr 16 20:00:43.060840 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:43.060809 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6d52f4e3-bb83-41df-8b11-e82731bb2bd2-operator-config\") pod \"servicemesh-operator3-55f49c5f94-jm679\" (UID: \"6d52f4e3-bb83-41df-8b11-e82731bb2bd2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jm679"
Apr 16 20:00:43.069551 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:43.069524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwqlx\" (UniqueName: 
\"kubernetes.io/projected/6d52f4e3-bb83-41df-8b11-e82731bb2bd2-kube-api-access-kwqlx\") pod \"servicemesh-operator3-55f49c5f94-jm679\" (UID: \"6d52f4e3-bb83-41df-8b11-e82731bb2bd2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jm679"
Apr 16 20:00:43.105138 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:43.105116 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jm679"
Apr 16 20:00:43.238380 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:43.238355 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-jm679"]
Apr 16 20:00:43.240610 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:00:43.240586 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d52f4e3_bb83_41df_8b11_e82731bb2bd2.slice/crio-d177a4d1e0d1ed8faf46d93401c42bff5167613ba7d63683a6fe726bcd28cc4d WatchSource:0}: Error finding container d177a4d1e0d1ed8faf46d93401c42bff5167613ba7d63683a6fe726bcd28cc4d: Status 404 returned error can't find the container with id d177a4d1e0d1ed8faf46d93401c42bff5167613ba7d63683a6fe726bcd28cc4d
Apr 16 20:00:43.703314 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:43.703276 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jm679" event={"ID":"6d52f4e3-bb83-41df-8b11-e82731bb2bd2","Type":"ContainerStarted","Data":"d177a4d1e0d1ed8faf46d93401c42bff5167613ba7d63683a6fe726bcd28cc4d"}
Apr 16 20:00:44.709994 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:44.709917 2569 generic.go:358] "Generic (PLEG): container finished" podID="6acd1451-8d5e-424c-a81c-aaa49320f02b" containerID="d2ce10ccf2fdbad9587f5f0f707ca198759088467d6063d56084be3cee81d2fb" exitCode=0
Apr 16 20:00:44.710485 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:44.710046 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr" event={"ID":"6acd1451-8d5e-424c-a81c-aaa49320f02b","Type":"ContainerDied","Data":"d2ce10ccf2fdbad9587f5f0f707ca198759088467d6063d56084be3cee81d2fb"}
Apr 16 20:00:45.715713 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:45.715678 2569 generic.go:358] "Generic (PLEG): container finished" podID="6acd1451-8d5e-424c-a81c-aaa49320f02b" containerID="b054640b553db5d47ed5fba418afb28bcd1b57c29698e4d2769dd067599e1573" exitCode=0
Apr 16 20:00:45.716234 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:45.715746 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr" event={"ID":"6acd1451-8d5e-424c-a81c-aaa49320f02b","Type":"ContainerDied","Data":"b054640b553db5d47ed5fba418afb28bcd1b57c29698e4d2769dd067599e1573"}
Apr 16 20:00:46.722416 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:46.722139 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jm679" event={"ID":"6d52f4e3-bb83-41df-8b11-e82731bb2bd2","Type":"ContainerStarted","Data":"4d193c3e92b4d9e53f475933294c76f3612832e2d581d1613ae765c10a167123"}
Apr 16 20:00:46.743416 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:46.743322 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jm679" podStartSLOduration=1.518882907 podStartE2EDuration="4.743301295s" podCreationTimestamp="2026-04-16 20:00:42 +0000 UTC" firstStartedPulling="2026-04-16 20:00:43.243142182 +0000 UTC m=+407.443852242" lastFinishedPulling="2026-04-16 20:00:46.467560574 +0000 UTC m=+410.668270630" observedRunningTime="2026-04-16 20:00:46.742850533 +0000 UTC m=+410.943560629" watchObservedRunningTime="2026-04-16 20:00:46.743301295 +0000 UTC m=+410.944011374"
Apr 16 20:00:46.859597 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:46.859574 2569 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr"
Apr 16 20:00:46.994373 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:46.994283 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6acd1451-8d5e-424c-a81c-aaa49320f02b-util\") pod \"6acd1451-8d5e-424c-a81c-aaa49320f02b\" (UID: \"6acd1451-8d5e-424c-a81c-aaa49320f02b\") "
Apr 16 20:00:46.994373 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:46.994336 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rglq9\" (UniqueName: \"kubernetes.io/projected/6acd1451-8d5e-424c-a81c-aaa49320f02b-kube-api-access-rglq9\") pod \"6acd1451-8d5e-424c-a81c-aaa49320f02b\" (UID: \"6acd1451-8d5e-424c-a81c-aaa49320f02b\") "
Apr 16 20:00:46.994373 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:46.994363 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6acd1451-8d5e-424c-a81c-aaa49320f02b-bundle\") pod \"6acd1451-8d5e-424c-a81c-aaa49320f02b\" (UID: \"6acd1451-8d5e-424c-a81c-aaa49320f02b\") "
Apr 16 20:00:46.995627 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:46.995597 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6acd1451-8d5e-424c-a81c-aaa49320f02b-bundle" (OuterVolumeSpecName: "bundle") pod "6acd1451-8d5e-424c-a81c-aaa49320f02b" (UID: "6acd1451-8d5e-424c-a81c-aaa49320f02b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:00:46.996610 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:46.996584 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6acd1451-8d5e-424c-a81c-aaa49320f02b-kube-api-access-rglq9" (OuterVolumeSpecName: "kube-api-access-rglq9") pod "6acd1451-8d5e-424c-a81c-aaa49320f02b" (UID: "6acd1451-8d5e-424c-a81c-aaa49320f02b"). InnerVolumeSpecName "kube-api-access-rglq9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:00:46.999391 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:46.999355 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6acd1451-8d5e-424c-a81c-aaa49320f02b-util" (OuterVolumeSpecName: "util") pod "6acd1451-8d5e-424c-a81c-aaa49320f02b" (UID: "6acd1451-8d5e-424c-a81c-aaa49320f02b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:00:47.095213 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:47.095177 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6acd1451-8d5e-424c-a81c-aaa49320f02b-util\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:00:47.095213 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:47.095209 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rglq9\" (UniqueName: \"kubernetes.io/projected/6acd1451-8d5e-424c-a81c-aaa49320f02b-kube-api-access-rglq9\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:00:47.095213 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:47.095219 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6acd1451-8d5e-424c-a81c-aaa49320f02b-bundle\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:00:47.726442 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:47.726410 2569 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr" Apr 16 20:00:47.726811 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:47.726438 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25rbmr" event={"ID":"6acd1451-8d5e-424c-a81c-aaa49320f02b","Type":"ContainerDied","Data":"8e62635d0688eb857665068b78cd570ff8e8a55031168382bba84cb17baa1583"} Apr 16 20:00:47.726811 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:47.726476 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e62635d0688eb857665068b78cd570ff8e8a55031168382bba84cb17baa1583" Apr 16 20:00:47.726811 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:47.726516 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jm679" Apr 16 20:00:58.732621 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:00:58.732588 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jm679" Apr 16 20:01:05.967478 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:05.967438 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b67ccf97f-82jcv"] Apr 16 20:01:05.967848 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:05.967735 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6acd1451-8d5e-424c-a81c-aaa49320f02b" containerName="extract" Apr 16 20:01:05.967848 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:05.967746 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6acd1451-8d5e-424c-a81c-aaa49320f02b" containerName="extract" Apr 16 20:01:05.967848 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:05.967760 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6acd1451-8d5e-424c-a81c-aaa49320f02b" containerName="pull" Apr 16 20:01:05.967848 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:05.967766 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6acd1451-8d5e-424c-a81c-aaa49320f02b" containerName="pull" Apr 16 20:01:05.967848 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:05.967773 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6acd1451-8d5e-424c-a81c-aaa49320f02b" containerName="util" Apr 16 20:01:05.967848 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:05.967779 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6acd1451-8d5e-424c-a81c-aaa49320f02b" containerName="util" Apr 16 20:01:05.967848 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:05.967837 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6acd1451-8d5e-424c-a81c-aaa49320f02b" containerName="extract" Apr 16 20:01:05.970436 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:05.970419 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:05.981412 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:05.981387 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b67ccf97f-82jcv"] Apr 16 20:01:06.041778 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.041741 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90b10d0d-c518-439e-8261-0358666add06-console-config\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.041778 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.041782 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4j2v\" (UniqueName: \"kubernetes.io/projected/90b10d0d-c518-439e-8261-0358666add06-kube-api-access-q4j2v\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.042000 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.041821 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b10d0d-c518-439e-8261-0358666add06-trusted-ca-bundle\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.042000 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.041837 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90b10d0d-c518-439e-8261-0358666add06-oauth-serving-cert\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 
20:01:06.042000 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.041857 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90b10d0d-c518-439e-8261-0358666add06-console-oauth-config\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.042000 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.041880 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90b10d0d-c518-439e-8261-0358666add06-service-ca\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.042000 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.041901 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90b10d0d-c518-439e-8261-0358666add06-console-serving-cert\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.143258 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.143217 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b10d0d-c518-439e-8261-0358666add06-trusted-ca-bundle\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.143258 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.143251 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90b10d0d-c518-439e-8261-0358666add06-oauth-serving-cert\") pod 
\"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.143489 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.143274 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90b10d0d-c518-439e-8261-0358666add06-console-oauth-config\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.143489 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.143302 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90b10d0d-c518-439e-8261-0358666add06-service-ca\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.143489 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.143338 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90b10d0d-c518-439e-8261-0358666add06-console-serving-cert\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.143489 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.143365 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90b10d0d-c518-439e-8261-0358666add06-console-config\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.143489 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.143393 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4j2v\" (UniqueName: 
\"kubernetes.io/projected/90b10d0d-c518-439e-8261-0358666add06-kube-api-access-q4j2v\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.144101 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.144070 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90b10d0d-c518-439e-8261-0358666add06-oauth-serving-cert\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.144247 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.144151 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b10d0d-c518-439e-8261-0358666add06-trusted-ca-bundle\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.144247 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.144208 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90b10d0d-c518-439e-8261-0358666add06-console-config\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.144369 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.144288 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90b10d0d-c518-439e-8261-0358666add06-service-ca\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.145838 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.145814 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/90b10d0d-c518-439e-8261-0358666add06-console-serving-cert\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.145951 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.145859 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90b10d0d-c518-439e-8261-0358666add06-console-oauth-config\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.151796 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.151775 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4j2v\" (UniqueName: \"kubernetes.io/projected/90b10d0d-c518-439e-8261-0358666add06-kube-api-access-q4j2v\") pod \"console-b67ccf97f-82jcv\" (UID: \"90b10d0d-c518-439e-8261-0358666add06\") " pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.279427 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.279341 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:06.402813 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.402786 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b67ccf97f-82jcv"] Apr 16 20:01:06.404470 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:01:06.404441 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90b10d0d_c518_439e_8261_0358666add06.slice/crio-09bc551f6a4cc0df89ddd827eaca2e55ceea15430fe320d51cc827b5d68e3f0a WatchSource:0}: Error finding container 09bc551f6a4cc0df89ddd827eaca2e55ceea15430fe320d51cc827b5d68e3f0a: Status 404 returned error can't find the container with id 09bc551f6a4cc0df89ddd827eaca2e55ceea15430fe320d51cc827b5d68e3f0a Apr 16 20:01:06.793860 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.793824 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b67ccf97f-82jcv" event={"ID":"90b10d0d-c518-439e-8261-0358666add06","Type":"ContainerStarted","Data":"9dab40f7f564b957d1564e7cf3d0e8bba3416d832a4ad972d337b8a457e80eda"} Apr 16 20:01:06.793860 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.793862 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b67ccf97f-82jcv" event={"ID":"90b10d0d-c518-439e-8261-0358666add06","Type":"ContainerStarted","Data":"09bc551f6a4cc0df89ddd827eaca2e55ceea15430fe320d51cc827b5d68e3f0a"} Apr 16 20:01:06.814691 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:06.814639 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b67ccf97f-82jcv" podStartSLOduration=1.814624126 podStartE2EDuration="1.814624126s" podCreationTimestamp="2026-04-16 20:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:01:06.812195274 +0000 UTC m=+431.012905353" 
watchObservedRunningTime="2026-04-16 20:01:06.814624126 +0000 UTC m=+431.015334204" Apr 16 20:01:13.547829 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.547787 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp"] Apr 16 20:01:13.550171 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.550154 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.553027 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.552970 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 20:01:13.553286 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.553266 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 20:01:13.553383 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.553368 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 20:01:13.553450 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.553389 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-sdrlk\"" Apr 16 20:01:13.553508 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.553496 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 20:01:13.553995 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.553977 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 20:01:13.554141 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.553996 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 20:01:13.563405 
ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.563384 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp"] Apr 16 20:01:13.600353 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.600327 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.600497 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.600368 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/8a97f61f-0e17-4d8f-a8df-d01b9668db82-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.600497 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.600397 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.600497 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.600428 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " 
pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.600608 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.600518 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.600608 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.600547 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp96l\" (UniqueName: \"kubernetes.io/projected/8a97f61f-0e17-4d8f-a8df-d01b9668db82-kube-api-access-lp96l\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.600608 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.600576 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.701322 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.701297 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.701465 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.701328 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lp96l\" (UniqueName: \"kubernetes.io/projected/8a97f61f-0e17-4d8f-a8df-d01b9668db82-kube-api-access-lp96l\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.701465 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.701353 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.701465 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.701401 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.701465 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.701457 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/8a97f61f-0e17-4d8f-a8df-d01b9668db82-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.701677 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.701493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.701677 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.701515 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.702220 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.702197 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.704002 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.703974 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.704100 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.703978 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.704138 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.704126 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/8a97f61f-0e17-4d8f-a8df-d01b9668db82-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.704216 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.704196 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.709224 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.709200 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.709615 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.709595 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp96l\" (UniqueName: \"kubernetes.io/projected/8a97f61f-0e17-4d8f-a8df-d01b9668db82-kube-api-access-lp96l\") pod \"istiod-openshift-gateway-7cd77c7ffd-6jpsp\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.860051 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.859927 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:13.987326 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:13.987296 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp"] Apr 16 20:01:13.989247 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:01:13.989216 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a97f61f_0e17_4d8f_a8df_d01b9668db82.slice/crio-0ed549b24953202411674796674e044cd06241a180b4a586a8df22de4e04cdd1 WatchSource:0}: Error finding container 0ed549b24953202411674796674e044cd06241a180b4a586a8df22de4e04cdd1: Status 404 returned error can't find the container with id 0ed549b24953202411674796674e044cd06241a180b4a586a8df22de4e04cdd1 Apr 16 20:01:14.820983 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:14.820936 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" event={"ID":"8a97f61f-0e17-4d8f-a8df-d01b9668db82","Type":"ContainerStarted","Data":"0ed549b24953202411674796674e044cd06241a180b4a586a8df22de4e04cdd1"} Apr 16 20:01:16.280062 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:16.280008 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:16.280062 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:16.280076 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:16.286752 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:16.286706 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:16.678450 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:16.678407 2569 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 20:01:16.678555 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:16.678485 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 20:01:16.831298 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:16.831261 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" event={"ID":"8a97f61f-0e17-4d8f-a8df-d01b9668db82","Type":"ContainerStarted","Data":"a9f414aa0b9afc02855c640204d432a92b7130df82ebc447975f9c6bd5f084ce"} Apr 16 20:01:16.831617 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:16.831596 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:16.835340 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:16.835320 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b67ccf97f-82jcv" Apr 16 20:01:16.851387 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:16.851341 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" podStartSLOduration=1.164240222 podStartE2EDuration="3.851328901s" podCreationTimestamp="2026-04-16 20:01:13 +0000 UTC" firstStartedPulling="2026-04-16 20:01:13.991079523 +0000 UTC m=+438.191789582" lastFinishedPulling="2026-04-16 20:01:16.678168196 +0000 UTC m=+440.878878261" observedRunningTime="2026-04-16 20:01:16.84887711 +0000 UTC m=+441.049587188" watchObservedRunningTime="2026-04-16 20:01:16.851328901 +0000 UTC m=+441.052038978" Apr 16 20:01:16.892207 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:16.892178 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-5fccf94886-pzr8t"] Apr 16 20:01:17.836509 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:17.836474 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:01:27.125657 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.125619 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf"] Apr 16 20:01:27.132510 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.132491 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" Apr 16 20:01:27.135271 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.135248 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 20:01:27.135387 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.135248 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-g6nfj\"" Apr 16 20:01:27.135387 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.135252 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 20:01:27.139002 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.138974 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf"] Apr 16 20:01:27.229307 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.229282 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58"] Apr 16 20:01:27.230298 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.230269 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24034cf6-536f-4e06-9580-132b555b9d34-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf\" (UID: \"24034cf6-536f-4e06-9580-132b555b9d34\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" Apr 16 20:01:27.230440 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.230382 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlj6n\" (UniqueName: \"kubernetes.io/projected/24034cf6-536f-4e06-9580-132b555b9d34-kube-api-access-wlj6n\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf\" (UID: \"24034cf6-536f-4e06-9580-132b555b9d34\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" Apr 16 20:01:27.230440 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.230426 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24034cf6-536f-4e06-9580-132b555b9d34-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf\" (UID: \"24034cf6-536f-4e06-9580-132b555b9d34\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" Apr 16 20:01:27.231901 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.231886 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" Apr 16 20:01:27.242152 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.242128 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58"] Apr 16 20:01:27.324910 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.324874 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv"] Apr 16 20:01:27.327433 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.327417 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" Apr 16 20:01:27.330985 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.330967 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3866eb54-328d-49e8-9e3c-467344ebd800-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58\" (UID: \"3866eb54-328d-49e8-9e3c-467344ebd800\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" Apr 16 20:01:27.331093 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.331030 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24034cf6-536f-4e06-9580-132b555b9d34-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf\" (UID: \"24034cf6-536f-4e06-9580-132b555b9d34\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" Apr 16 20:01:27.331093 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.331074 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnsvt\" 
(UniqueName: \"kubernetes.io/projected/3866eb54-328d-49e8-9e3c-467344ebd800-kube-api-access-tnsvt\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58\" (UID: \"3866eb54-328d-49e8-9e3c-467344ebd800\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" Apr 16 20:01:27.331210 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.331191 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3866eb54-328d-49e8-9e3c-467344ebd800-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58\" (UID: \"3866eb54-328d-49e8-9e3c-467344ebd800\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" Apr 16 20:01:27.331246 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.331228 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlj6n\" (UniqueName: \"kubernetes.io/projected/24034cf6-536f-4e06-9580-132b555b9d34-kube-api-access-wlj6n\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf\" (UID: \"24034cf6-536f-4e06-9580-132b555b9d34\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" Apr 16 20:01:27.331307 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.331290 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24034cf6-536f-4e06-9580-132b555b9d34-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf\" (UID: \"24034cf6-536f-4e06-9580-132b555b9d34\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" Apr 16 20:01:27.331409 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.331393 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/24034cf6-536f-4e06-9580-132b555b9d34-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf\" (UID: \"24034cf6-536f-4e06-9580-132b555b9d34\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" Apr 16 20:01:27.331596 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.331577 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24034cf6-536f-4e06-9580-132b555b9d34-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf\" (UID: \"24034cf6-536f-4e06-9580-132b555b9d34\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" Apr 16 20:01:27.337982 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.337957 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv"] Apr 16 20:01:27.340244 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.340217 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlj6n\" (UniqueName: \"kubernetes.io/projected/24034cf6-536f-4e06-9580-132b555b9d34-kube-api-access-wlj6n\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf\" (UID: \"24034cf6-536f-4e06-9580-132b555b9d34\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" Apr 16 20:01:27.424133 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.424105 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z"] Apr 16 20:01:27.426369 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.426355 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" Apr 16 20:01:27.431801 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.431769 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnsvt\" (UniqueName: \"kubernetes.io/projected/3866eb54-328d-49e8-9e3c-467344ebd800-kube-api-access-tnsvt\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58\" (UID: \"3866eb54-328d-49e8-9e3c-467344ebd800\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" Apr 16 20:01:27.431934 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.431814 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3866eb54-328d-49e8-9e3c-467344ebd800-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58\" (UID: \"3866eb54-328d-49e8-9e3c-467344ebd800\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" Apr 16 20:01:27.431934 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.431844 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0aa6378-6c26-4555-a028-9bdcec9c994d-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv\" (UID: \"b0aa6378-6c26-4555-a028-9bdcec9c994d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" Apr 16 20:01:27.431934 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.431870 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbqk5\" (UniqueName: \"kubernetes.io/projected/b0aa6378-6c26-4555-a028-9bdcec9c994d-kube-api-access-hbqk5\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv\" (UID: 
\"b0aa6378-6c26-4555-a028-9bdcec9c994d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" Apr 16 20:01:27.432125 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.431987 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0aa6378-6c26-4555-a028-9bdcec9c994d-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv\" (UID: \"b0aa6378-6c26-4555-a028-9bdcec9c994d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" Apr 16 20:01:27.432125 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.432066 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3866eb54-328d-49e8-9e3c-467344ebd800-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58\" (UID: \"3866eb54-328d-49e8-9e3c-467344ebd800\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" Apr 16 20:01:27.432206 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.432181 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3866eb54-328d-49e8-9e3c-467344ebd800-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58\" (UID: \"3866eb54-328d-49e8-9e3c-467344ebd800\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" Apr 16 20:01:27.432387 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.432366 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3866eb54-328d-49e8-9e3c-467344ebd800-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58\" (UID: \"3866eb54-328d-49e8-9e3c-467344ebd800\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" Apr 16 20:01:27.436427 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.436407 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z"] Apr 16 20:01:27.440072 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.440051 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnsvt\" (UniqueName: \"kubernetes.io/projected/3866eb54-328d-49e8-9e3c-467344ebd800-kube-api-access-tnsvt\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58\" (UID: \"3866eb54-328d-49e8-9e3c-467344ebd800\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" Apr 16 20:01:27.442831 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.442811 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" Apr 16 20:01:27.532518 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.532487 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0aa6378-6c26-4555-a028-9bdcec9c994d-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv\" (UID: \"b0aa6378-6c26-4555-a028-9bdcec9c994d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" Apr 16 20:01:27.532668 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.532527 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbqk5\" (UniqueName: \"kubernetes.io/projected/b0aa6378-6c26-4555-a028-9bdcec9c994d-kube-api-access-hbqk5\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv\" (UID: \"b0aa6378-6c26-4555-a028-9bdcec9c994d\") " 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" Apr 16 20:01:27.532668 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.532558 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f49d530-7e36-4c37-927b-c791d385919d-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z\" (UID: \"5f49d530-7e36-4c37-927b-c791d385919d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" Apr 16 20:01:27.532668 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.532606 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0aa6378-6c26-4555-a028-9bdcec9c994d-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv\" (UID: \"b0aa6378-6c26-4555-a028-9bdcec9c994d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" Apr 16 20:01:27.532668 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.532649 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss4tc\" (UniqueName: \"kubernetes.io/projected/5f49d530-7e36-4c37-927b-c791d385919d-kube-api-access-ss4tc\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z\" (UID: \"5f49d530-7e36-4c37-927b-c791d385919d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" Apr 16 20:01:27.532879 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.532755 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f49d530-7e36-4c37-927b-c791d385919d-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z\" (UID: \"5f49d530-7e36-4c37-927b-c791d385919d\") " 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" Apr 16 20:01:27.532983 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.532962 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0aa6378-6c26-4555-a028-9bdcec9c994d-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv\" (UID: \"b0aa6378-6c26-4555-a028-9bdcec9c994d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" Apr 16 20:01:27.533062 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.532966 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0aa6378-6c26-4555-a028-9bdcec9c994d-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv\" (UID: \"b0aa6378-6c26-4555-a028-9bdcec9c994d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" Apr 16 20:01:27.541084 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.541057 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbqk5\" (UniqueName: \"kubernetes.io/projected/b0aa6378-6c26-4555-a028-9bdcec9c994d-kube-api-access-hbqk5\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv\" (UID: \"b0aa6378-6c26-4555-a028-9bdcec9c994d\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" Apr 16 20:01:27.541756 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.541739 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" Apr 16 20:01:27.568091 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.568061 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf"] Apr 16 20:01:27.569471 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:01:27.569442 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24034cf6_536f_4e06_9580_132b555b9d34.slice/crio-388eaa002e885e6b36d1f09547a29f026d83fe678a69b4c6a8ce19489899384b WatchSource:0}: Error finding container 388eaa002e885e6b36d1f09547a29f026d83fe678a69b4c6a8ce19489899384b: Status 404 returned error can't find the container with id 388eaa002e885e6b36d1f09547a29f026d83fe678a69b4c6a8ce19489899384b Apr 16 20:01:27.634916 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.634004 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f49d530-7e36-4c37-927b-c791d385919d-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z\" (UID: \"5f49d530-7e36-4c37-927b-c791d385919d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" Apr 16 20:01:27.634916 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.634139 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ss4tc\" (UniqueName: \"kubernetes.io/projected/5f49d530-7e36-4c37-927b-c791d385919d-kube-api-access-ss4tc\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z\" (UID: \"5f49d530-7e36-4c37-927b-c791d385919d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" Apr 16 20:01:27.634916 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.634183 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f49d530-7e36-4c37-927b-c791d385919d-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z\" (UID: \"5f49d530-7e36-4c37-927b-c791d385919d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" Apr 16 20:01:27.634916 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.634605 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f49d530-7e36-4c37-927b-c791d385919d-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z\" (UID: \"5f49d530-7e36-4c37-927b-c791d385919d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" Apr 16 20:01:27.634916 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.634854 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f49d530-7e36-4c37-927b-c791d385919d-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z\" (UID: \"5f49d530-7e36-4c37-927b-c791d385919d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" Apr 16 20:01:27.636070 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.636048 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" Apr 16 20:01:27.642920 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.642880 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss4tc\" (UniqueName: \"kubernetes.io/projected/5f49d530-7e36-4c37-927b-c791d385919d-kube-api-access-ss4tc\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z\" (UID: \"5f49d530-7e36-4c37-927b-c791d385919d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" Apr 16 20:01:27.677681 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.677645 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58"] Apr 16 20:01:27.680087 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:01:27.680055 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3866eb54_328d_49e8_9e3c_467344ebd800.slice/crio-7a6314b4599e902e88263781fdab8c0f12841f0c4147a8008bfcd8805d3a3457 WatchSource:0}: Error finding container 7a6314b4599e902e88263781fdab8c0f12841f0c4147a8008bfcd8805d3a3457: Status 404 returned error can't find the container with id 7a6314b4599e902e88263781fdab8c0f12841f0c4147a8008bfcd8805d3a3457 Apr 16 20:01:27.735953 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.735925 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" Apr 16 20:01:27.770746 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.770718 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv"] Apr 16 20:01:27.772474 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:01:27.772445 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0aa6378_6c26_4555_a028_9bdcec9c994d.slice/crio-dc7dc8788cf89fd416fcd49782f4a6a5ad507a6e929bcb01ef60ebe2c04102f7 WatchSource:0}: Error finding container dc7dc8788cf89fd416fcd49782f4a6a5ad507a6e929bcb01ef60ebe2c04102f7: Status 404 returned error can't find the container with id dc7dc8788cf89fd416fcd49782f4a6a5ad507a6e929bcb01ef60ebe2c04102f7 Apr 16 20:01:27.873899 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.873855 2569 generic.go:358] "Generic (PLEG): container finished" podID="3866eb54-328d-49e8-9e3c-467344ebd800" containerID="0fdae9d037068b56391f46861e6b4df240cbb4932ca28e3c527f16d0501b5669" exitCode=0 Apr 16 20:01:27.874144 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.874059 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" event={"ID":"3866eb54-328d-49e8-9e3c-467344ebd800","Type":"ContainerDied","Data":"0fdae9d037068b56391f46861e6b4df240cbb4932ca28e3c527f16d0501b5669"} Apr 16 20:01:27.874144 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.874109 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" event={"ID":"3866eb54-328d-49e8-9e3c-467344ebd800","Type":"ContainerStarted","Data":"7a6314b4599e902e88263781fdab8c0f12841f0c4147a8008bfcd8805d3a3457"} Apr 16 20:01:27.875566 ip-10-0-138-142 kubenswrapper[2569]: 
I0416 20:01:27.875531 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z"] Apr 16 20:01:27.876763 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.876707 2569 generic.go:358] "Generic (PLEG): container finished" podID="24034cf6-536f-4e06-9580-132b555b9d34" containerID="d46d10ce29310c64b731775d1c7d0e7c38360f93181cfcc679d78714d1fabd44" exitCode=0 Apr 16 20:01:27.876865 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.876789 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" event={"ID":"24034cf6-536f-4e06-9580-132b555b9d34","Type":"ContainerDied","Data":"d46d10ce29310c64b731775d1c7d0e7c38360f93181cfcc679d78714d1fabd44"} Apr 16 20:01:27.876865 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.876814 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" event={"ID":"24034cf6-536f-4e06-9580-132b555b9d34","Type":"ContainerStarted","Data":"388eaa002e885e6b36d1f09547a29f026d83fe678a69b4c6a8ce19489899384b"} Apr 16 20:01:27.880507 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.880473 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" event={"ID":"b0aa6378-6c26-4555-a028-9bdcec9c994d","Type":"ContainerStarted","Data":"8d37f45cf88e5e9f82712866beb1a66808e012d48d7fe1ea34609ee0bc9cf09f"} Apr 16 20:01:27.880507 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:27.880502 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" event={"ID":"b0aa6378-6c26-4555-a028-9bdcec9c994d","Type":"ContainerStarted","Data":"dc7dc8788cf89fd416fcd49782f4a6a5ad507a6e929bcb01ef60ebe2c04102f7"} Apr 16 20:01:27.930059 
ip-10-0-138-142 kubenswrapper[2569]: W0416 20:01:27.929953 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f49d530_7e36_4c37_927b_c791d385919d.slice/crio-723dd9f929b071cfdd80555770a53ae8a721b46525f772b11fcffb91b8fe8f7a WatchSource:0}: Error finding container 723dd9f929b071cfdd80555770a53ae8a721b46525f772b11fcffb91b8fe8f7a: Status 404 returned error can't find the container with id 723dd9f929b071cfdd80555770a53ae8a721b46525f772b11fcffb91b8fe8f7a Apr 16 20:01:28.886206 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:28.886126 2569 generic.go:358] "Generic (PLEG): container finished" podID="3866eb54-328d-49e8-9e3c-467344ebd800" containerID="1fa2cb0903e4d2446054dfb04f2e5327fb9b75a40b5b2f6e6fcd0b12bb11f6ef" exitCode=0 Apr 16 20:01:28.886590 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:28.886213 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" event={"ID":"3866eb54-328d-49e8-9e3c-467344ebd800","Type":"ContainerDied","Data":"1fa2cb0903e4d2446054dfb04f2e5327fb9b75a40b5b2f6e6fcd0b12bb11f6ef"} Apr 16 20:01:28.887618 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:28.887590 2569 generic.go:358] "Generic (PLEG): container finished" podID="5f49d530-7e36-4c37-927b-c791d385919d" containerID="30b14e604b217ef8bb588ca850d6ed92afcc3f5369644e13122f839ad648223e" exitCode=0 Apr 16 20:01:28.887760 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:28.887671 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" event={"ID":"5f49d530-7e36-4c37-927b-c791d385919d","Type":"ContainerDied","Data":"30b14e604b217ef8bb588ca850d6ed92afcc3f5369644e13122f839ad648223e"} Apr 16 20:01:28.887760 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:28.887709 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" event={"ID":"5f49d530-7e36-4c37-927b-c791d385919d","Type":"ContainerStarted","Data":"723dd9f929b071cfdd80555770a53ae8a721b46525f772b11fcffb91b8fe8f7a"} Apr 16 20:01:28.889508 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:28.889487 2569 generic.go:358] "Generic (PLEG): container finished" podID="24034cf6-536f-4e06-9580-132b555b9d34" containerID="bb18f0f4225a860b8989cbd09b45a53226eec93c11136c3ddd5cb1c71f2ed2ee" exitCode=0 Apr 16 20:01:28.889620 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:28.889548 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" event={"ID":"24034cf6-536f-4e06-9580-132b555b9d34","Type":"ContainerDied","Data":"bb18f0f4225a860b8989cbd09b45a53226eec93c11136c3ddd5cb1c71f2ed2ee"} Apr 16 20:01:28.891173 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:28.891143 2569 generic.go:358] "Generic (PLEG): container finished" podID="b0aa6378-6c26-4555-a028-9bdcec9c994d" containerID="8d37f45cf88e5e9f82712866beb1a66808e012d48d7fe1ea34609ee0bc9cf09f" exitCode=0 Apr 16 20:01:28.891245 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:28.891172 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" event={"ID":"b0aa6378-6c26-4555-a028-9bdcec9c994d","Type":"ContainerDied","Data":"8d37f45cf88e5e9f82712866beb1a66808e012d48d7fe1ea34609ee0bc9cf09f"} Apr 16 20:01:29.896433 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:29.896347 2569 generic.go:358] "Generic (PLEG): container finished" podID="5f49d530-7e36-4c37-927b-c791d385919d" containerID="428818294b3ece23019f2a1826d8ab3002f4a907ffa3d5e3244d195f24ac9070" exitCode=0 Apr 16 20:01:29.896840 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:29.896435 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" event={"ID":"5f49d530-7e36-4c37-927b-c791d385919d","Type":"ContainerDied","Data":"428818294b3ece23019f2a1826d8ab3002f4a907ffa3d5e3244d195f24ac9070"} Apr 16 20:01:29.898452 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:29.898424 2569 generic.go:358] "Generic (PLEG): container finished" podID="24034cf6-536f-4e06-9580-132b555b9d34" containerID="8d50779431eaf2cd76e99a2f3a959005bbf4045512a6c6a35de831f62e50a723" exitCode=0 Apr 16 20:01:29.898546 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:29.898503 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" event={"ID":"24034cf6-536f-4e06-9580-132b555b9d34","Type":"ContainerDied","Data":"8d50779431eaf2cd76e99a2f3a959005bbf4045512a6c6a35de831f62e50a723"} Apr 16 20:01:29.900027 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:29.899994 2569 generic.go:358] "Generic (PLEG): container finished" podID="b0aa6378-6c26-4555-a028-9bdcec9c994d" containerID="8bd687e27d5086310bc610aa702d1728d665d941921206435745108bc50d5f0c" exitCode=0 Apr 16 20:01:29.900104 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:29.900037 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" event={"ID":"b0aa6378-6c26-4555-a028-9bdcec9c994d","Type":"ContainerDied","Data":"8bd687e27d5086310bc610aa702d1728d665d941921206435745108bc50d5f0c"} Apr 16 20:01:29.902000 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:29.901980 2569 generic.go:358] "Generic (PLEG): container finished" podID="3866eb54-328d-49e8-9e3c-467344ebd800" containerID="8eb32eeb00a981f374dbfe9dcbf2830ee4e68e6d0bca05b2e8b31d2e90a20e32" exitCode=0 Apr 16 20:01:29.902128 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:29.902044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" event={"ID":"3866eb54-328d-49e8-9e3c-467344ebd800","Type":"ContainerDied","Data":"8eb32eeb00a981f374dbfe9dcbf2830ee4e68e6d0bca05b2e8b31d2e90a20e32"} Apr 16 20:01:30.909425 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:30.909390 2569 generic.go:358] "Generic (PLEG): container finished" podID="5f49d530-7e36-4c37-927b-c791d385919d" containerID="f7de02edb5dfe36d92d166120549c71c7ea626bdde5e37c69c9a20df7e1d47ff" exitCode=0 Apr 16 20:01:30.909836 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:30.909470 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" event={"ID":"5f49d530-7e36-4c37-927b-c791d385919d","Type":"ContainerDied","Data":"f7de02edb5dfe36d92d166120549c71c7ea626bdde5e37c69c9a20df7e1d47ff"} Apr 16 20:01:30.911408 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:30.911384 2569 generic.go:358] "Generic (PLEG): container finished" podID="b0aa6378-6c26-4555-a028-9bdcec9c994d" containerID="ed071615505dd57ead97a07a56748a9297c58f63926d9de8890754ea2e6f802f" exitCode=0 Apr 16 20:01:30.911536 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:30.911491 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" event={"ID":"b0aa6378-6c26-4555-a028-9bdcec9c994d","Type":"ContainerDied","Data":"ed071615505dd57ead97a07a56748a9297c58f63926d9de8890754ea2e6f802f"} Apr 16 20:01:31.038322 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.038299 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" Apr 16 20:01:31.068693 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.068669 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" Apr 16 20:01:31.164389 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.164299 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3866eb54-328d-49e8-9e3c-467344ebd800-bundle\") pod \"3866eb54-328d-49e8-9e3c-467344ebd800\" (UID: \"3866eb54-328d-49e8-9e3c-467344ebd800\") " Apr 16 20:01:31.164389 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.164366 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnsvt\" (UniqueName: \"kubernetes.io/projected/3866eb54-328d-49e8-9e3c-467344ebd800-kube-api-access-tnsvt\") pod \"3866eb54-328d-49e8-9e3c-467344ebd800\" (UID: \"3866eb54-328d-49e8-9e3c-467344ebd800\") " Apr 16 20:01:31.164678 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.164401 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24034cf6-536f-4e06-9580-132b555b9d34-util\") pod \"24034cf6-536f-4e06-9580-132b555b9d34\" (UID: \"24034cf6-536f-4e06-9580-132b555b9d34\") " Apr 16 20:01:31.164678 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.164433 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlj6n\" (UniqueName: \"kubernetes.io/projected/24034cf6-536f-4e06-9580-132b555b9d34-kube-api-access-wlj6n\") pod \"24034cf6-536f-4e06-9580-132b555b9d34\" (UID: \"24034cf6-536f-4e06-9580-132b555b9d34\") " Apr 16 20:01:31.164678 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.164498 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3866eb54-328d-49e8-9e3c-467344ebd800-util\") pod \"3866eb54-328d-49e8-9e3c-467344ebd800\" (UID: \"3866eb54-328d-49e8-9e3c-467344ebd800\") " Apr 16 20:01:31.164678 ip-10-0-138-142 
kubenswrapper[2569]: I0416 20:01:31.164542 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24034cf6-536f-4e06-9580-132b555b9d34-bundle\") pod \"24034cf6-536f-4e06-9580-132b555b9d34\" (UID: \"24034cf6-536f-4e06-9580-132b555b9d34\") " Apr 16 20:01:31.165151 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.165115 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24034cf6-536f-4e06-9580-132b555b9d34-bundle" (OuterVolumeSpecName: "bundle") pod "24034cf6-536f-4e06-9580-132b555b9d34" (UID: "24034cf6-536f-4e06-9580-132b555b9d34"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:01:31.165222 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.165131 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3866eb54-328d-49e8-9e3c-467344ebd800-bundle" (OuterVolumeSpecName: "bundle") pod "3866eb54-328d-49e8-9e3c-467344ebd800" (UID: "3866eb54-328d-49e8-9e3c-467344ebd800"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:01:31.166652 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.166628 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24034cf6-536f-4e06-9580-132b555b9d34-kube-api-access-wlj6n" (OuterVolumeSpecName: "kube-api-access-wlj6n") pod "24034cf6-536f-4e06-9580-132b555b9d34" (UID: "24034cf6-536f-4e06-9580-132b555b9d34"). InnerVolumeSpecName "kube-api-access-wlj6n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:01:31.166921 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.166902 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3866eb54-328d-49e8-9e3c-467344ebd800-kube-api-access-tnsvt" (OuterVolumeSpecName: "kube-api-access-tnsvt") pod "3866eb54-328d-49e8-9e3c-467344ebd800" (UID: "3866eb54-328d-49e8-9e3c-467344ebd800"). InnerVolumeSpecName "kube-api-access-tnsvt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:01:31.170252 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.170211 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3866eb54-328d-49e8-9e3c-467344ebd800-util" (OuterVolumeSpecName: "util") pod "3866eb54-328d-49e8-9e3c-467344ebd800" (UID: "3866eb54-328d-49e8-9e3c-467344ebd800"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:01:31.170875 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.170854 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24034cf6-536f-4e06-9580-132b555b9d34-util" (OuterVolumeSpecName: "util") pod "24034cf6-536f-4e06-9580-132b555b9d34" (UID: "24034cf6-536f-4e06-9580-132b555b9d34"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:01:31.265951 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.265907 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24034cf6-536f-4e06-9580-132b555b9d34-bundle\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:31.265951 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.265945 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3866eb54-328d-49e8-9e3c-467344ebd800-bundle\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:31.266196 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.265960 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tnsvt\" (UniqueName: \"kubernetes.io/projected/3866eb54-328d-49e8-9e3c-467344ebd800-kube-api-access-tnsvt\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:31.266196 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.265973 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24034cf6-536f-4e06-9580-132b555b9d34-util\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:31.266196 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.265986 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wlj6n\" (UniqueName: \"kubernetes.io/projected/24034cf6-536f-4e06-9580-132b555b9d34-kube-api-access-wlj6n\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:31.266196 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.265997 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3866eb54-328d-49e8-9e3c-467344ebd800-util\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:31.917501 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.917471 2569 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" Apr 16 20:01:31.917501 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.917479 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503l4x58" event={"ID":"3866eb54-328d-49e8-9e3c-467344ebd800","Type":"ContainerDied","Data":"7a6314b4599e902e88263781fdab8c0f12841f0c4147a8008bfcd8805d3a3457"} Apr 16 20:01:31.917501 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.917511 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a6314b4599e902e88263781fdab8c0f12841f0c4147a8008bfcd8805d3a3457" Apr 16 20:01:31.919310 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.919282 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" Apr 16 20:01:31.919310 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.919279 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bc8dmf" event={"ID":"24034cf6-536f-4e06-9580-132b555b9d34","Type":"ContainerDied","Data":"388eaa002e885e6b36d1f09547a29f026d83fe678a69b4c6a8ce19489899384b"} Apr 16 20:01:31.919527 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:31.919325 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="388eaa002e885e6b36d1f09547a29f026d83fe678a69b4c6a8ce19489899384b" Apr 16 20:01:32.056679 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.056657 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" Apr 16 20:01:32.101051 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.101005 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" Apr 16 20:01:32.184649 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.184560 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f49d530-7e36-4c37-927b-c791d385919d-util\") pod \"5f49d530-7e36-4c37-927b-c791d385919d\" (UID: \"5f49d530-7e36-4c37-927b-c791d385919d\") " Apr 16 20:01:32.184649 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.184623 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss4tc\" (UniqueName: \"kubernetes.io/projected/5f49d530-7e36-4c37-927b-c791d385919d-kube-api-access-ss4tc\") pod \"5f49d530-7e36-4c37-927b-c791d385919d\" (UID: \"5f49d530-7e36-4c37-927b-c791d385919d\") " Apr 16 20:01:32.184869 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.184657 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0aa6378-6c26-4555-a028-9bdcec9c994d-bundle\") pod \"b0aa6378-6c26-4555-a028-9bdcec9c994d\" (UID: \"b0aa6378-6c26-4555-a028-9bdcec9c994d\") " Apr 16 20:01:32.184869 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.184692 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f49d530-7e36-4c37-927b-c791d385919d-bundle\") pod \"5f49d530-7e36-4c37-927b-c791d385919d\" (UID: \"5f49d530-7e36-4c37-927b-c791d385919d\") " Apr 16 20:01:32.184869 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.184725 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"util\" (UniqueName: \"kubernetes.io/empty-dir/b0aa6378-6c26-4555-a028-9bdcec9c994d-util\") pod \"b0aa6378-6c26-4555-a028-9bdcec9c994d\" (UID: \"b0aa6378-6c26-4555-a028-9bdcec9c994d\") " Apr 16 20:01:32.184869 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.184765 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbqk5\" (UniqueName: \"kubernetes.io/projected/b0aa6378-6c26-4555-a028-9bdcec9c994d-kube-api-access-hbqk5\") pod \"b0aa6378-6c26-4555-a028-9bdcec9c994d\" (UID: \"b0aa6378-6c26-4555-a028-9bdcec9c994d\") " Apr 16 20:01:32.185252 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.185217 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0aa6378-6c26-4555-a028-9bdcec9c994d-bundle" (OuterVolumeSpecName: "bundle") pod "b0aa6378-6c26-4555-a028-9bdcec9c994d" (UID: "b0aa6378-6c26-4555-a028-9bdcec9c994d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:01:32.185332 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.185256 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f49d530-7e36-4c37-927b-c791d385919d-bundle" (OuterVolumeSpecName: "bundle") pod "5f49d530-7e36-4c37-927b-c791d385919d" (UID: "5f49d530-7e36-4c37-927b-c791d385919d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:01:32.186868 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.186842 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0aa6378-6c26-4555-a028-9bdcec9c994d-kube-api-access-hbqk5" (OuterVolumeSpecName: "kube-api-access-hbqk5") pod "b0aa6378-6c26-4555-a028-9bdcec9c994d" (UID: "b0aa6378-6c26-4555-a028-9bdcec9c994d"). InnerVolumeSpecName "kube-api-access-hbqk5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:01:32.186868 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.186865 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f49d530-7e36-4c37-927b-c791d385919d-kube-api-access-ss4tc" (OuterVolumeSpecName: "kube-api-access-ss4tc") pod "5f49d530-7e36-4c37-927b-c791d385919d" (UID: "5f49d530-7e36-4c37-927b-c791d385919d"). InnerVolumeSpecName "kube-api-access-ss4tc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:01:32.190557 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.190534 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f49d530-7e36-4c37-927b-c791d385919d-util" (OuterVolumeSpecName: "util") pod "5f49d530-7e36-4c37-927b-c791d385919d" (UID: "5f49d530-7e36-4c37-927b-c791d385919d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:01:32.190653 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.190551 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0aa6378-6c26-4555-a028-9bdcec9c994d-util" (OuterVolumeSpecName: "util") pod "b0aa6378-6c26-4555-a028-9bdcec9c994d" (UID: "b0aa6378-6c26-4555-a028-9bdcec9c994d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:01:32.286032 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.285976 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f49d530-7e36-4c37-927b-c791d385919d-util\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:32.286032 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.286005 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ss4tc\" (UniqueName: \"kubernetes.io/projected/5f49d530-7e36-4c37-927b-c791d385919d-kube-api-access-ss4tc\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:32.286032 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.286035 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0aa6378-6c26-4555-a028-9bdcec9c994d-bundle\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:32.286238 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.286044 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f49d530-7e36-4c37-927b-c791d385919d-bundle\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:32.286238 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.286053 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0aa6378-6c26-4555-a028-9bdcec9c994d-util\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:32.286238 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.286062 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hbqk5\" (UniqueName: \"kubernetes.io/projected/b0aa6378-6c26-4555-a028-9bdcec9c994d-kube-api-access-hbqk5\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:32.924616 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.924585 2569 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" Apr 16 20:01:32.925062 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.924599 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306r66z" event={"ID":"5f49d530-7e36-4c37-927b-c791d385919d","Type":"ContainerDied","Data":"723dd9f929b071cfdd80555770a53ae8a721b46525f772b11fcffb91b8fe8f7a"} Apr 16 20:01:32.925062 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.924647 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="723dd9f929b071cfdd80555770a53ae8a721b46525f772b11fcffb91b8fe8f7a" Apr 16 20:01:32.926415 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.926393 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" Apr 16 20:01:32.926415 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.926403 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88zkptv" event={"ID":"b0aa6378-6c26-4555-a028-9bdcec9c994d","Type":"ContainerDied","Data":"dc7dc8788cf89fd416fcd49782f4a6a5ad507a6e929bcb01ef60ebe2c04102f7"} Apr 16 20:01:32.926577 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:32.926436 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc7dc8788cf89fd416fcd49782f4a6a5ad507a6e929bcb01ef60ebe2c04102f7" Apr 16 20:01:41.319535 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319498 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l27sq"] Apr 16 20:01:41.319874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319790 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="5f49d530-7e36-4c37-927b-c791d385919d" containerName="extract" Apr 16 20:01:41.319874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319800 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f49d530-7e36-4c37-927b-c791d385919d" containerName="extract" Apr 16 20:01:41.319874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319812 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f49d530-7e36-4c37-927b-c791d385919d" containerName="util" Apr 16 20:01:41.319874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319818 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f49d530-7e36-4c37-927b-c791d385919d" containerName="util" Apr 16 20:01:41.319874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319826 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24034cf6-536f-4e06-9580-132b555b9d34" containerName="pull" Apr 16 20:01:41.319874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319831 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="24034cf6-536f-4e06-9580-132b555b9d34" containerName="pull" Apr 16 20:01:41.319874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319839 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3866eb54-328d-49e8-9e3c-467344ebd800" containerName="extract" Apr 16 20:01:41.319874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319844 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3866eb54-328d-49e8-9e3c-467344ebd800" containerName="extract" Apr 16 20:01:41.319874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319852 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0aa6378-6c26-4555-a028-9bdcec9c994d" containerName="extract" Apr 16 20:01:41.319874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319857 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0aa6378-6c26-4555-a028-9bdcec9c994d" 
containerName="extract" Apr 16 20:01:41.319874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319866 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3866eb54-328d-49e8-9e3c-467344ebd800" containerName="util" Apr 16 20:01:41.319874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319871 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3866eb54-328d-49e8-9e3c-467344ebd800" containerName="util" Apr 16 20:01:41.319874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319876 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3866eb54-328d-49e8-9e3c-467344ebd800" containerName="pull" Apr 16 20:01:41.319874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319882 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3866eb54-328d-49e8-9e3c-467344ebd800" containerName="pull" Apr 16 20:01:41.320303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319887 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24034cf6-536f-4e06-9580-132b555b9d34" containerName="extract" Apr 16 20:01:41.320303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319892 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="24034cf6-536f-4e06-9580-132b555b9d34" containerName="extract" Apr 16 20:01:41.320303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319899 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24034cf6-536f-4e06-9580-132b555b9d34" containerName="util" Apr 16 20:01:41.320303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319904 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="24034cf6-536f-4e06-9580-132b555b9d34" containerName="util" Apr 16 20:01:41.320303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319909 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f49d530-7e36-4c37-927b-c791d385919d" containerName="pull" Apr 16 20:01:41.320303 
ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319913 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f49d530-7e36-4c37-927b-c791d385919d" containerName="pull" Apr 16 20:01:41.320303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319920 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0aa6378-6c26-4555-a028-9bdcec9c994d" containerName="pull" Apr 16 20:01:41.320303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319925 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0aa6378-6c26-4555-a028-9bdcec9c994d" containerName="pull" Apr 16 20:01:41.320303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319934 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0aa6378-6c26-4555-a028-9bdcec9c994d" containerName="util" Apr 16 20:01:41.320303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319939 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0aa6378-6c26-4555-a028-9bdcec9c994d" containerName="util" Apr 16 20:01:41.320303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319980 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3866eb54-328d-49e8-9e3c-467344ebd800" containerName="extract" Apr 16 20:01:41.320303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319989 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0aa6378-6c26-4555-a028-9bdcec9c994d" containerName="extract" Apr 16 20:01:41.320303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.319996 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="24034cf6-536f-4e06-9580-132b555b9d34" containerName="extract" Apr 16 20:01:41.320303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.320004 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f49d530-7e36-4c37-927b-c791d385919d" containerName="extract" Apr 16 20:01:41.324187 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.324168 2569 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l27sq" Apr 16 20:01:41.326947 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.326921 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-zltx5\"" Apr 16 20:01:41.327065 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.326959 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 20:01:41.327724 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.327708 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 20:01:41.337147 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.337123 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l27sq"] Apr 16 20:01:41.361603 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.361574 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xdz8\" (UniqueName: \"kubernetes.io/projected/91b14a1d-981f-4f41-a141-c3f4f8da74e8-kube-api-access-2xdz8\") pod \"limitador-operator-controller-manager-c7fb4c8d5-l27sq\" (UID: \"91b14a1d-981f-4f41-a141-c3f4f8da74e8\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l27sq" Apr 16 20:01:41.462969 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.462927 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xdz8\" (UniqueName: \"kubernetes.io/projected/91b14a1d-981f-4f41-a141-c3f4f8da74e8-kube-api-access-2xdz8\") pod \"limitador-operator-controller-manager-c7fb4c8d5-l27sq\" (UID: \"91b14a1d-981f-4f41-a141-c3f4f8da74e8\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l27sq" Apr 16 20:01:41.474874 
ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.474839 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xdz8\" (UniqueName: \"kubernetes.io/projected/91b14a1d-981f-4f41-a141-c3f4f8da74e8-kube-api-access-2xdz8\") pod \"limitador-operator-controller-manager-c7fb4c8d5-l27sq\" (UID: \"91b14a1d-981f-4f41-a141-c3f4f8da74e8\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l27sq" Apr 16 20:01:41.634185 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.634109 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l27sq" Apr 16 20:01:41.779329 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.779304 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l27sq"] Apr 16 20:01:41.781252 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:01:41.781222 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91b14a1d_981f_4f41_a141_c3f4f8da74e8.slice/crio-12cc7c2c591cf042691cf695d47f2d180f7db98a955c4bbcdccd3b58c2b6d844 WatchSource:0}: Error finding container 12cc7c2c591cf042691cf695d47f2d180f7db98a955c4bbcdccd3b58c2b6d844: Status 404 returned error can't find the container with id 12cc7c2c591cf042691cf695d47f2d180f7db98a955c4bbcdccd3b58c2b6d844 Apr 16 20:01:41.918439 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.918382 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5fccf94886-pzr8t" podUID="52c662a5-de48-45c3-8f35-8217c2ccacd3" containerName="console" containerID="cri-o://6942ec1fe86271618280759d5ac60f94b77c3081d62b8f8d415bc62478a07ba1" gracePeriod=15 Apr 16 20:01:41.960917 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:41.960883 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l27sq" event={"ID":"91b14a1d-981f-4f41-a141-c3f4f8da74e8","Type":"ContainerStarted","Data":"12cc7c2c591cf042691cf695d47f2d180f7db98a955c4bbcdccd3b58c2b6d844"} Apr 16 20:01:42.143921 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.143899 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fccf94886-pzr8t_52c662a5-de48-45c3-8f35-8217c2ccacd3/console/0.log" Apr 16 20:01:42.144088 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.143960 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 20:01:42.169007 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.168945 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-serving-cert\") pod \"52c662a5-de48-45c3-8f35-8217c2ccacd3\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " Apr 16 20:01:42.169143 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.169045 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-service-ca\") pod \"52c662a5-de48-45c3-8f35-8217c2ccacd3\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " Apr 16 20:01:42.169143 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.169074 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-trusted-ca-bundle\") pod \"52c662a5-de48-45c3-8f35-8217c2ccacd3\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " Apr 16 20:01:42.169143 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.169103 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-oauth-config\") pod \"52c662a5-de48-45c3-8f35-8217c2ccacd3\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " Apr 16 20:01:42.169304 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.169230 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-oauth-serving-cert\") pod \"52c662a5-de48-45c3-8f35-8217c2ccacd3\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " Apr 16 20:01:42.169304 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.169275 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-config\") pod \"52c662a5-de48-45c3-8f35-8217c2ccacd3\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " Apr 16 20:01:42.169399 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.169311 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvbjr\" (UniqueName: \"kubernetes.io/projected/52c662a5-de48-45c3-8f35-8217c2ccacd3-kube-api-access-gvbjr\") pod \"52c662a5-de48-45c3-8f35-8217c2ccacd3\" (UID: \"52c662a5-de48-45c3-8f35-8217c2ccacd3\") " Apr 16 20:01:42.169534 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.169504 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "52c662a5-de48-45c3-8f35-8217c2ccacd3" (UID: "52c662a5-de48-45c3-8f35-8217c2ccacd3"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:01:42.169534 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.169512 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-service-ca" (OuterVolumeSpecName: "service-ca") pod "52c662a5-de48-45c3-8f35-8217c2ccacd3" (UID: "52c662a5-de48-45c3-8f35-8217c2ccacd3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:01:42.169667 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.169561 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "52c662a5-de48-45c3-8f35-8217c2ccacd3" (UID: "52c662a5-de48-45c3-8f35-8217c2ccacd3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:01:42.169860 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.169776 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-config" (OuterVolumeSpecName: "console-config") pod "52c662a5-de48-45c3-8f35-8217c2ccacd3" (UID: "52c662a5-de48-45c3-8f35-8217c2ccacd3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:01:42.171684 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.171658 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "52c662a5-de48-45c3-8f35-8217c2ccacd3" (UID: "52c662a5-de48-45c3-8f35-8217c2ccacd3"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:01:42.172110 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.172041 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "52c662a5-de48-45c3-8f35-8217c2ccacd3" (UID: "52c662a5-de48-45c3-8f35-8217c2ccacd3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:01:42.172207 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.172125 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c662a5-de48-45c3-8f35-8217c2ccacd3-kube-api-access-gvbjr" (OuterVolumeSpecName: "kube-api-access-gvbjr") pod "52c662a5-de48-45c3-8f35-8217c2ccacd3" (UID: "52c662a5-de48-45c3-8f35-8217c2ccacd3"). InnerVolumeSpecName "kube-api-access-gvbjr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:01:42.269898 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.269871 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-service-ca\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:42.269898 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.269895 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-trusted-ca-bundle\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:42.270072 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.269905 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-oauth-config\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:42.270072 ip-10-0-138-142 
kubenswrapper[2569]: I0416 20:01:42.269914 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-oauth-serving-cert\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:42.270072 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.269925 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-config\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:42.270072 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.269934 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gvbjr\" (UniqueName: \"kubernetes.io/projected/52c662a5-de48-45c3-8f35-8217c2ccacd3-kube-api-access-gvbjr\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:42.270072 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.269944 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52c662a5-de48-45c3-8f35-8217c2ccacd3-console-serving-cert\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:01:42.965297 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.965271 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fccf94886-pzr8t_52c662a5-de48-45c3-8f35-8217c2ccacd3/console/0.log" Apr 16 20:01:42.965757 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.965312 2569 generic.go:358] "Generic (PLEG): container finished" podID="52c662a5-de48-45c3-8f35-8217c2ccacd3" containerID="6942ec1fe86271618280759d5ac60f94b77c3081d62b8f8d415bc62478a07ba1" exitCode=2 Apr 16 20:01:42.965757 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.965351 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fccf94886-pzr8t" 
event={"ID":"52c662a5-de48-45c3-8f35-8217c2ccacd3","Type":"ContainerDied","Data":"6942ec1fe86271618280759d5ac60f94b77c3081d62b8f8d415bc62478a07ba1"} Apr 16 20:01:42.965757 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.965384 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fccf94886-pzr8t" Apr 16 20:01:42.965757 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.965403 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fccf94886-pzr8t" event={"ID":"52c662a5-de48-45c3-8f35-8217c2ccacd3","Type":"ContainerDied","Data":"6502e8ee76bdeea6908ebeb22313a9a947b7cfb1ff3a0338d356e4103d454643"} Apr 16 20:01:42.965757 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.965425 2569 scope.go:117] "RemoveContainer" containerID="6942ec1fe86271618280759d5ac60f94b77c3081d62b8f8d415bc62478a07ba1" Apr 16 20:01:42.974595 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.974534 2569 scope.go:117] "RemoveContainer" containerID="6942ec1fe86271618280759d5ac60f94b77c3081d62b8f8d415bc62478a07ba1" Apr 16 20:01:42.974858 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:01:42.974833 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6942ec1fe86271618280759d5ac60f94b77c3081d62b8f8d415bc62478a07ba1\": container with ID starting with 6942ec1fe86271618280759d5ac60f94b77c3081d62b8f8d415bc62478a07ba1 not found: ID does not exist" containerID="6942ec1fe86271618280759d5ac60f94b77c3081d62b8f8d415bc62478a07ba1" Apr 16 20:01:42.974939 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.974891 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6942ec1fe86271618280759d5ac60f94b77c3081d62b8f8d415bc62478a07ba1"} err="failed to get container status \"6942ec1fe86271618280759d5ac60f94b77c3081d62b8f8d415bc62478a07ba1\": rpc error: code = NotFound desc = could not find container 
\"6942ec1fe86271618280759d5ac60f94b77c3081d62b8f8d415bc62478a07ba1\": container with ID starting with 6942ec1fe86271618280759d5ac60f94b77c3081d62b8f8d415bc62478a07ba1 not found: ID does not exist" Apr 16 20:01:42.994592 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:42.994555 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fccf94886-pzr8t"] Apr 16 20:01:43.003633 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:43.003610 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5fccf94886-pzr8t"] Apr 16 20:01:44.303396 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.303359 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-x9tfj"] Apr 16 20:01:44.303809 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.303722 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52c662a5-de48-45c3-8f35-8217c2ccacd3" containerName="console" Apr 16 20:01:44.303809 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.303739 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c662a5-de48-45c3-8f35-8217c2ccacd3" containerName="console" Apr 16 20:01:44.303809 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.303803 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="52c662a5-de48-45c3-8f35-8217c2ccacd3" containerName="console" Apr 16 20:01:44.307150 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.307123 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-x9tfj" Apr 16 20:01:44.309609 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.309588 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-86sb8\"" Apr 16 20:01:44.319282 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.319257 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-x9tfj"] Apr 16 20:01:44.378461 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.378428 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52c662a5-de48-45c3-8f35-8217c2ccacd3" path="/var/lib/kubelet/pods/52c662a5-de48-45c3-8f35-8217c2ccacd3/volumes" Apr 16 20:01:44.390226 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.390195 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdbpq\" (UniqueName: \"kubernetes.io/projected/7367c59c-39ce-4fba-aca1-183d27d4e066-kube-api-access-fdbpq\") pod \"authorino-operator-7587b89b76-x9tfj\" (UID: \"7367c59c-39ce-4fba-aca1-183d27d4e066\") " pod="kuadrant-system/authorino-operator-7587b89b76-x9tfj" Apr 16 20:01:44.491340 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.491295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdbpq\" (UniqueName: \"kubernetes.io/projected/7367c59c-39ce-4fba-aca1-183d27d4e066-kube-api-access-fdbpq\") pod \"authorino-operator-7587b89b76-x9tfj\" (UID: \"7367c59c-39ce-4fba-aca1-183d27d4e066\") " pod="kuadrant-system/authorino-operator-7587b89b76-x9tfj" Apr 16 20:01:44.502695 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.502668 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdbpq\" (UniqueName: \"kubernetes.io/projected/7367c59c-39ce-4fba-aca1-183d27d4e066-kube-api-access-fdbpq\") pod \"authorino-operator-7587b89b76-x9tfj\" 
(UID: \"7367c59c-39ce-4fba-aca1-183d27d4e066\") " pod="kuadrant-system/authorino-operator-7587b89b76-x9tfj" Apr 16 20:01:44.620430 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.620405 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-x9tfj" Apr 16 20:01:44.760876 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.760847 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-x9tfj"] Apr 16 20:01:44.763297 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:01:44.763252 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7367c59c_39ce_4fba_aca1_183d27d4e066.slice/crio-9d8c2f2c34ebb14cb36c15d82488964acc27086a4b6f79b8eebbb5fa67f6f542 WatchSource:0}: Error finding container 9d8c2f2c34ebb14cb36c15d82488964acc27086a4b6f79b8eebbb5fa67f6f542: Status 404 returned error can't find the container with id 9d8c2f2c34ebb14cb36c15d82488964acc27086a4b6f79b8eebbb5fa67f6f542 Apr 16 20:01:44.975033 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.974975 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l27sq" event={"ID":"91b14a1d-981f-4f41-a141-c3f4f8da74e8","Type":"ContainerStarted","Data":"3934e3ca93220bd11f78da74c6b66725ff1ed57d873be69690781364d2ea6480"} Apr 16 20:01:44.975246 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.975065 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l27sq" Apr 16 20:01:44.976192 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:44.976165 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-x9tfj" 
event={"ID":"7367c59c-39ce-4fba-aca1-183d27d4e066","Type":"ContainerStarted","Data":"9d8c2f2c34ebb14cb36c15d82488964acc27086a4b6f79b8eebbb5fa67f6f542"} Apr 16 20:01:45.000200 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:45.000154 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l27sq" podStartSLOduration=1.189524593 podStartE2EDuration="4.000141918s" podCreationTimestamp="2026-04-16 20:01:41 +0000 UTC" firstStartedPulling="2026-04-16 20:01:41.783738801 +0000 UTC m=+465.984448871" lastFinishedPulling="2026-04-16 20:01:44.594356125 +0000 UTC m=+468.795066196" observedRunningTime="2026-04-16 20:01:44.998518833 +0000 UTC m=+469.199228916" watchObservedRunningTime="2026-04-16 20:01:45.000141918 +0000 UTC m=+469.200851997" Apr 16 20:01:46.985244 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:46.985203 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-x9tfj" event={"ID":"7367c59c-39ce-4fba-aca1-183d27d4e066","Type":"ContainerStarted","Data":"687f4471809117522dcc7e7ebd5a6d800b9d7249264e378cde337d0c23a76a9a"} Apr 16 20:01:46.985680 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:46.985298 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-x9tfj" Apr 16 20:01:47.001773 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:47.001727 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-x9tfj" podStartSLOduration=1.430988398 podStartE2EDuration="3.001714056s" podCreationTimestamp="2026-04-16 20:01:44 +0000 UTC" firstStartedPulling="2026-04-16 20:01:44.765653178 +0000 UTC m=+468.966363234" lastFinishedPulling="2026-04-16 20:01:46.336378836 +0000 UTC m=+470.537088892" observedRunningTime="2026-04-16 20:01:47.000897123 +0000 UTC m=+471.201607221" watchObservedRunningTime="2026-04-16 
20:01:47.001714056 +0000 UTC m=+471.202424134" Apr 16 20:01:55.982137 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:55.982103 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l27sq" Apr 16 20:01:57.991788 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:01:57.991754 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-x9tfj" Apr 16 20:02:27.495933 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:27.495900 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-b4bhk"] Apr 16 20:02:27.498209 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:27.498193 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-b4bhk" Apr 16 20:02:27.504588 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:27.504559 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 20:02:27.505130 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:27.504640 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-qrnwl\"" Apr 16 20:02:27.511384 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:27.511361 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-b4bhk"] Apr 16 20:02:27.540446 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:27.540422 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-b4bhk"] Apr 16 20:02:27.650387 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:27.650357 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: 
\"kubernetes.io/configmap/db90142b-95ef-4184-821c-f3378d1c51cb-config-file\") pod \"limitador-limitador-67566c68b4-b4bhk\" (UID: \"db90142b-95ef-4184-821c-f3378d1c51cb\") " pod="kuadrant-system/limitador-limitador-67566c68b4-b4bhk" Apr 16 20:02:27.650387 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:27.650391 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-596j4\" (UniqueName: \"kubernetes.io/projected/db90142b-95ef-4184-821c-f3378d1c51cb-kube-api-access-596j4\") pod \"limitador-limitador-67566c68b4-b4bhk\" (UID: \"db90142b-95ef-4184-821c-f3378d1c51cb\") " pod="kuadrant-system/limitador-limitador-67566c68b4-b4bhk" Apr 16 20:02:27.751652 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:27.751562 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/db90142b-95ef-4184-821c-f3378d1c51cb-config-file\") pod \"limitador-limitador-67566c68b4-b4bhk\" (UID: \"db90142b-95ef-4184-821c-f3378d1c51cb\") " pod="kuadrant-system/limitador-limitador-67566c68b4-b4bhk" Apr 16 20:02:27.751652 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:27.751611 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-596j4\" (UniqueName: \"kubernetes.io/projected/db90142b-95ef-4184-821c-f3378d1c51cb-kube-api-access-596j4\") pod \"limitador-limitador-67566c68b4-b4bhk\" (UID: \"db90142b-95ef-4184-821c-f3378d1c51cb\") " pod="kuadrant-system/limitador-limitador-67566c68b4-b4bhk" Apr 16 20:02:27.752279 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:27.752255 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/db90142b-95ef-4184-821c-f3378d1c51cb-config-file\") pod \"limitador-limitador-67566c68b4-b4bhk\" (UID: \"db90142b-95ef-4184-821c-f3378d1c51cb\") " pod="kuadrant-system/limitador-limitador-67566c68b4-b4bhk" Apr 16 
20:02:27.760459 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:27.760439 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-596j4\" (UniqueName: \"kubernetes.io/projected/db90142b-95ef-4184-821c-f3378d1c51cb-kube-api-access-596j4\") pod \"limitador-limitador-67566c68b4-b4bhk\" (UID: \"db90142b-95ef-4184-821c-f3378d1c51cb\") " pod="kuadrant-system/limitador-limitador-67566c68b4-b4bhk" Apr 16 20:02:27.808910 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:27.808884 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-b4bhk" Apr 16 20:02:27.934212 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:27.934186 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-b4bhk"] Apr 16 20:02:27.935607 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:02:27.935579 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb90142b_95ef_4184_821c_f3378d1c51cb.slice/crio-fe580a853a1d4c9ef69ae63e4f0cb867e251e06d8c8490fcd9d864a6a3f2b85a WatchSource:0}: Error finding container fe580a853a1d4c9ef69ae63e4f0cb867e251e06d8c8490fcd9d864a6a3f2b85a: Status 404 returned error can't find the container with id fe580a853a1d4c9ef69ae63e4f0cb867e251e06d8c8490fcd9d864a6a3f2b85a Apr 16 20:02:28.138692 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:28.138603 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-b4bhk" event={"ID":"db90142b-95ef-4184-821c-f3378d1c51cb","Type":"ContainerStarted","Data":"fe580a853a1d4c9ef69ae63e4f0cb867e251e06d8c8490fcd9d864a6a3f2b85a"} Apr 16 20:02:32.156900 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:32.156858 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-b4bhk" 
event={"ID":"db90142b-95ef-4184-821c-f3378d1c51cb","Type":"ContainerStarted","Data":"077e5c94c2d144df6d1649f93f9d65d19c54b1be6095b85ca230def8a0209571"} Apr 16 20:02:32.157291 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:32.156980 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-b4bhk" Apr 16 20:02:32.173779 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:32.173727 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-b4bhk" podStartSLOduration=1.497061387 podStartE2EDuration="5.17371151s" podCreationTimestamp="2026-04-16 20:02:27 +0000 UTC" firstStartedPulling="2026-04-16 20:02:27.937789691 +0000 UTC m=+512.138499751" lastFinishedPulling="2026-04-16 20:02:31.614439805 +0000 UTC m=+515.815149874" observedRunningTime="2026-04-16 20:02:32.172327251 +0000 UTC m=+516.373037331" watchObservedRunningTime="2026-04-16 20:02:32.17371151 +0000 UTC m=+516.374421594" Apr 16 20:02:43.162235 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:02:43.162200 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-b4bhk" Apr 16 20:03:08.109797 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.109707 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp"] Apr 16 20:03:08.110533 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.110497 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" podUID="8a97f61f-0e17-4d8f-a8df-d01b9668db82" containerName="discovery" containerID="cri-o://a9f414aa0b9afc02855c640204d432a92b7130df82ebc447975f9c6bd5f084ce" gracePeriod=30 Apr 16 20:03:08.293229 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.293200 2569 generic.go:358] "Generic (PLEG): container finished" 
podID="8a97f61f-0e17-4d8f-a8df-d01b9668db82" containerID="a9f414aa0b9afc02855c640204d432a92b7130df82ebc447975f9c6bd5f084ce" exitCode=0 Apr 16 20:03:08.293367 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.293276 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" event={"ID":"8a97f61f-0e17-4d8f-a8df-d01b9668db82","Type":"ContainerDied","Data":"a9f414aa0b9afc02855c640204d432a92b7130df82ebc447975f9c6bd5f084ce"} Apr 16 20:03:08.372765 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.372744 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:03:08.489160 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.489131 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-cacerts\") pod \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " Apr 16 20:03:08.489321 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.489172 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-token\") pod \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " Apr 16 20:03:08.489321 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.489197 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp96l\" (UniqueName: \"kubernetes.io/projected/8a97f61f-0e17-4d8f-a8df-d01b9668db82-kube-api-access-lp96l\") pod \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " Apr 16 20:03:08.489414 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.489389 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-kubeconfig\") pod \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " Apr 16 20:03:08.489469 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.489444 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/8a97f61f-0e17-4d8f-a8df-d01b9668db82-local-certs\") pod \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " Apr 16 20:03:08.489537 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.489520 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-csr-dns-cert\") pod \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " Apr 16 20:03:08.489598 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.489553 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-csr-ca-configmap\") pod \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\" (UID: \"8a97f61f-0e17-4d8f-a8df-d01b9668db82\") " Apr 16 20:03:08.490030 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.489983 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "8a97f61f-0e17-4d8f-a8df-d01b9668db82" (UID: "8a97f61f-0e17-4d8f-a8df-d01b9668db82"). InnerVolumeSpecName "istio-csr-ca-configmap". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:03:08.491930 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.491895 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "8a97f61f-0e17-4d8f-a8df-d01b9668db82" (UID: "8a97f61f-0e17-4d8f-a8df-d01b9668db82"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:03:08.491930 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.491918 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "8a97f61f-0e17-4d8f-a8df-d01b9668db82" (UID: "8a97f61f-0e17-4d8f-a8df-d01b9668db82"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:03:08.492098 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.491947 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-cacerts" (OuterVolumeSpecName: "cacerts") pod "8a97f61f-0e17-4d8f-a8df-d01b9668db82" (UID: "8a97f61f-0e17-4d8f-a8df-d01b9668db82"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:03:08.492098 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.492047 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a97f61f-0e17-4d8f-a8df-d01b9668db82-local-certs" (OuterVolumeSpecName: "local-certs") pod "8a97f61f-0e17-4d8f-a8df-d01b9668db82" (UID: "8a97f61f-0e17-4d8f-a8df-d01b9668db82"). InnerVolumeSpecName "local-certs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:03:08.492098 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.492087 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a97f61f-0e17-4d8f-a8df-d01b9668db82-kube-api-access-lp96l" (OuterVolumeSpecName: "kube-api-access-lp96l") pod "8a97f61f-0e17-4d8f-a8df-d01b9668db82" (UID: "8a97f61f-0e17-4d8f-a8df-d01b9668db82"). InnerVolumeSpecName "kube-api-access-lp96l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:03:08.492198 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.492087 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-token" (OuterVolumeSpecName: "istio-token") pod "8a97f61f-0e17-4d8f-a8df-d01b9668db82" (UID: "8a97f61f-0e17-4d8f-a8df-d01b9668db82"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:03:08.590258 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.590231 2569 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-token\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:03:08.590258 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.590256 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lp96l\" (UniqueName: \"kubernetes.io/projected/8a97f61f-0e17-4d8f-a8df-d01b9668db82-kube-api-access-lp96l\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:03:08.590423 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.590268 2569 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-kubeconfig\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:03:08.590423 ip-10-0-138-142 kubenswrapper[2569]: 
I0416 20:03:08.590277 2569 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/8a97f61f-0e17-4d8f-a8df-d01b9668db82-local-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:03:08.590423 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.590288 2569 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-csr-dns-cert\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:03:08.590423 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.590297 2569 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/8a97f61f-0e17-4d8f-a8df-d01b9668db82-istio-csr-ca-configmap\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:03:08.590423 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:08.590305 2569 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/8a97f61f-0e17-4d8f-a8df-d01b9668db82-cacerts\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:03:09.298944 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:09.298899 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" event={"ID":"8a97f61f-0e17-4d8f-a8df-d01b9668db82","Type":"ContainerDied","Data":"0ed549b24953202411674796674e044cd06241a180b4a586a8df22de4e04cdd1"} Apr 16 20:03:09.298944 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:09.298928 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp" Apr 16 20:03:09.299506 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:09.298958 2569 scope.go:117] "RemoveContainer" containerID="a9f414aa0b9afc02855c640204d432a92b7130df82ebc447975f9c6bd5f084ce" Apr 16 20:03:09.321204 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:09.321177 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp"] Apr 16 20:03:09.324814 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:09.324789 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6jpsp"] Apr 16 20:03:10.371671 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:10.371636 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a97f61f-0e17-4d8f-a8df-d01b9668db82" path="/var/lib/kubelet/pods/8a97f61f-0e17-4d8f-a8df-d01b9668db82/volumes" Apr 16 20:03:11.106200 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.106165 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-pnt5q"] Apr 16 20:03:11.106557 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.106543 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a97f61f-0e17-4d8f-a8df-d01b9668db82" containerName="discovery" Apr 16 20:03:11.106615 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.106560 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a97f61f-0e17-4d8f-a8df-d01b9668db82" containerName="discovery" Apr 16 20:03:11.106656 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.106620 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a97f61f-0e17-4d8f-a8df-d01b9668db82" containerName="discovery" Apr 16 20:03:11.109188 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.109172 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" Apr 16 20:03:11.111619 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.111600 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 20:03:11.112223 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.112204 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 20:03:11.112795 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.112776 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-d758m\"" Apr 16 20:03:11.113041 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.113004 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 20:03:11.118721 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.118696 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-pnt5q"] Apr 16 20:03:11.136530 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.136504 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-64f548bc46-qnnrb"] Apr 16 20:03:11.138738 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.138715 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" Apr 16 20:03:11.140829 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.140805 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 20:03:11.141110 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.141094 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-l62p9\"" Apr 16 20:03:11.151415 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.151384 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-64f548bc46-qnnrb"] Apr 16 20:03:11.167539 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.167506 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-j2rp7"] Apr 16 20:03:11.170238 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.170204 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-j2rp7" Apr 16 20:03:11.173128 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.173041 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 20:03:11.173128 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.173081 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-7tcbm\"" Apr 16 20:03:11.178292 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.178270 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-j2rp7"] Apr 16 20:03:11.213932 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.213898 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtvml\" (UniqueName: \"kubernetes.io/projected/72600703-706e-4085-821e-1f77b50f91be-kube-api-access-vtvml\") pod \"kserve-controller-manager-659c8cbdc-pnt5q\" (UID: \"72600703-706e-4085-821e-1f77b50f91be\") " pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" Apr 16 20:03:11.214076 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.213992 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7h6l\" (UniqueName: \"kubernetes.io/projected/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-kube-api-access-v7h6l\") pod \"llmisvc-controller-manager-64f548bc46-qnnrb\" (UID: \"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4\") " pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" Apr 16 20:03:11.214125 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.214096 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72600703-706e-4085-821e-1f77b50f91be-cert\") pod \"kserve-controller-manager-659c8cbdc-pnt5q\" (UID: \"72600703-706e-4085-821e-1f77b50f91be\") " 
pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" Apr 16 20:03:11.214125 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.214119 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-cert\") pod \"llmisvc-controller-manager-64f548bc46-qnnrb\" (UID: \"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4\") " pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" Apr 16 20:03:11.315132 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.315097 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtvml\" (UniqueName: \"kubernetes.io/projected/72600703-706e-4085-821e-1f77b50f91be-kube-api-access-vtvml\") pod \"kserve-controller-manager-659c8cbdc-pnt5q\" (UID: \"72600703-706e-4085-821e-1f77b50f91be\") " pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" Apr 16 20:03:11.315132 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.315138 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbg2z\" (UniqueName: \"kubernetes.io/projected/071e9ff3-ae9d-4c24-8f40-eb1290ff97ae-kube-api-access-lbg2z\") pod \"seaweedfs-86cc847c5c-j2rp7\" (UID: \"071e9ff3-ae9d-4c24-8f40-eb1290ff97ae\") " pod="kserve/seaweedfs-86cc847c5c-j2rp7" Apr 16 20:03:11.315379 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.315167 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7h6l\" (UniqueName: \"kubernetes.io/projected/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-kube-api-access-v7h6l\") pod \"llmisvc-controller-manager-64f548bc46-qnnrb\" (UID: \"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4\") " pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" Apr 16 20:03:11.315379 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.315230 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"data\" (UniqueName: \"kubernetes.io/empty-dir/071e9ff3-ae9d-4c24-8f40-eb1290ff97ae-data\") pod \"seaweedfs-86cc847c5c-j2rp7\" (UID: \"071e9ff3-ae9d-4c24-8f40-eb1290ff97ae\") " pod="kserve/seaweedfs-86cc847c5c-j2rp7" Apr 16 20:03:11.315379 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.315295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72600703-706e-4085-821e-1f77b50f91be-cert\") pod \"kserve-controller-manager-659c8cbdc-pnt5q\" (UID: \"72600703-706e-4085-821e-1f77b50f91be\") " pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" Apr 16 20:03:11.315379 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.315330 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-cert\") pod \"llmisvc-controller-manager-64f548bc46-qnnrb\" (UID: \"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4\") " pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" Apr 16 20:03:11.315639 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:03:11.315418 2569 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 20:03:11.315639 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:03:11.315446 2569 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 16 20:03:11.315639 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:03:11.315482 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72600703-706e-4085-821e-1f77b50f91be-cert podName:72600703-706e-4085-821e-1f77b50f91be nodeName:}" failed. No retries permitted until 2026-04-16 20:03:11.815464299 +0000 UTC m=+556.016174360 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/72600703-706e-4085-821e-1f77b50f91be-cert") pod "kserve-controller-manager-659c8cbdc-pnt5q" (UID: "72600703-706e-4085-821e-1f77b50f91be") : secret "kserve-webhook-server-cert" not found Apr 16 20:03:11.315639 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:03:11.315500 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-cert podName:2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4 nodeName:}" failed. No retries permitted until 2026-04-16 20:03:11.815493907 +0000 UTC m=+556.016203962 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-cert") pod "llmisvc-controller-manager-64f548bc46-qnnrb" (UID: "2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4") : secret "llmisvc-webhook-server-cert" not found Apr 16 20:03:11.323833 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.323808 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7h6l\" (UniqueName: \"kubernetes.io/projected/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-kube-api-access-v7h6l\") pod \"llmisvc-controller-manager-64f548bc46-qnnrb\" (UID: \"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4\") " pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" Apr 16 20:03:11.323968 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.323858 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtvml\" (UniqueName: \"kubernetes.io/projected/72600703-706e-4085-821e-1f77b50f91be-kube-api-access-vtvml\") pod \"kserve-controller-manager-659c8cbdc-pnt5q\" (UID: \"72600703-706e-4085-821e-1f77b50f91be\") " pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" Apr 16 20:03:11.416150 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.416124 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/071e9ff3-ae9d-4c24-8f40-eb1290ff97ae-data\") pod \"seaweedfs-86cc847c5c-j2rp7\" (UID: \"071e9ff3-ae9d-4c24-8f40-eb1290ff97ae\") " pod="kserve/seaweedfs-86cc847c5c-j2rp7" Apr 16 20:03:11.416559 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.416229 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbg2z\" (UniqueName: \"kubernetes.io/projected/071e9ff3-ae9d-4c24-8f40-eb1290ff97ae-kube-api-access-lbg2z\") pod \"seaweedfs-86cc847c5c-j2rp7\" (UID: \"071e9ff3-ae9d-4c24-8f40-eb1290ff97ae\") " pod="kserve/seaweedfs-86cc847c5c-j2rp7" Apr 16 20:03:11.416559 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.416456 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/071e9ff3-ae9d-4c24-8f40-eb1290ff97ae-data\") pod \"seaweedfs-86cc847c5c-j2rp7\" (UID: \"071e9ff3-ae9d-4c24-8f40-eb1290ff97ae\") " pod="kserve/seaweedfs-86cc847c5c-j2rp7" Apr 16 20:03:11.424380 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.424358 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbg2z\" (UniqueName: \"kubernetes.io/projected/071e9ff3-ae9d-4c24-8f40-eb1290ff97ae-kube-api-access-lbg2z\") pod \"seaweedfs-86cc847c5c-j2rp7\" (UID: \"071e9ff3-ae9d-4c24-8f40-eb1290ff97ae\") " pod="kserve/seaweedfs-86cc847c5c-j2rp7" Apr 16 20:03:11.483203 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.483176 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-j2rp7" Apr 16 20:03:11.616903 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.616867 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-j2rp7"] Apr 16 20:03:11.618479 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:03:11.618454 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod071e9ff3_ae9d_4c24_8f40_eb1290ff97ae.slice/crio-4f71eae92124217251767acb8b5e044c9b4048899ff6e2347ce960ac98a1b2b7 WatchSource:0}: Error finding container 4f71eae92124217251767acb8b5e044c9b4048899ff6e2347ce960ac98a1b2b7: Status 404 returned error can't find the container with id 4f71eae92124217251767acb8b5e044c9b4048899ff6e2347ce960ac98a1b2b7 Apr 16 20:03:11.819434 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.819335 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72600703-706e-4085-821e-1f77b50f91be-cert\") pod \"kserve-controller-manager-659c8cbdc-pnt5q\" (UID: \"72600703-706e-4085-821e-1f77b50f91be\") " pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" Apr 16 20:03:11.819434 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.819378 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-cert\") pod \"llmisvc-controller-manager-64f548bc46-qnnrb\" (UID: \"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4\") " pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" Apr 16 20:03:11.821845 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.821811 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-cert\") pod \"llmisvc-controller-manager-64f548bc46-qnnrb\" (UID: \"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4\") " 
pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" Apr 16 20:03:11.821951 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:11.821935 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72600703-706e-4085-821e-1f77b50f91be-cert\") pod \"kserve-controller-manager-659c8cbdc-pnt5q\" (UID: \"72600703-706e-4085-821e-1f77b50f91be\") " pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" Apr 16 20:03:12.021291 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:12.021250 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" Apr 16 20:03:12.048699 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:12.048668 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" Apr 16 20:03:12.307280 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:12.307250 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-64f548bc46-qnnrb"] Apr 16 20:03:12.310861 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:03:12.310818 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2cb60e7d_f6b6_49d5_b394_d8c5d9420bf4.slice/crio-77fc53b80c13c04b127c40193f7cd1f912bbe024f45b3f0036b2285581930c30 WatchSource:0}: Error finding container 77fc53b80c13c04b127c40193f7cd1f912bbe024f45b3f0036b2285581930c30: Status 404 returned error can't find the container with id 77fc53b80c13c04b127c40193f7cd1f912bbe024f45b3f0036b2285581930c30 Apr 16 20:03:12.315979 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:12.315946 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-j2rp7" event={"ID":"071e9ff3-ae9d-4c24-8f40-eb1290ff97ae","Type":"ContainerStarted","Data":"4f71eae92124217251767acb8b5e044c9b4048899ff6e2347ce960ac98a1b2b7"} Apr 16 20:03:12.333787 ip-10-0-138-142 kubenswrapper[2569]: I0416 
20:03:12.333761 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-pnt5q"] Apr 16 20:03:12.337299 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:03:12.337267 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72600703_706e_4085_821e_1f77b50f91be.slice/crio-4344f4fbdc8593282b32ce9e36957c81078020b1818d4e1ec07193aa955fa6c8 WatchSource:0}: Error finding container 4344f4fbdc8593282b32ce9e36957c81078020b1818d4e1ec07193aa955fa6c8: Status 404 returned error can't find the container with id 4344f4fbdc8593282b32ce9e36957c81078020b1818d4e1ec07193aa955fa6c8 Apr 16 20:03:13.323181 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:13.323112 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" event={"ID":"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4","Type":"ContainerStarted","Data":"77fc53b80c13c04b127c40193f7cd1f912bbe024f45b3f0036b2285581930c30"} Apr 16 20:03:13.325402 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:13.325323 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" event={"ID":"72600703-706e-4085-821e-1f77b50f91be","Type":"ContainerStarted","Data":"4344f4fbdc8593282b32ce9e36957c81078020b1818d4e1ec07193aa955fa6c8"} Apr 16 20:03:17.342607 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:17.342574 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" event={"ID":"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4","Type":"ContainerStarted","Data":"35ce0f7fed4599c45a69f32701f7247033ced41e23016d483740067fae8664a7"} Apr 16 20:03:17.343150 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:17.342645 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" Apr 16 20:03:17.343883 ip-10-0-138-142 kubenswrapper[2569]: 
I0416 20:03:17.343860 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" event={"ID":"72600703-706e-4085-821e-1f77b50f91be","Type":"ContainerStarted","Data":"383af2127d896733e177c6be839e6579e9b711adec7901acaf6c863493237409"} Apr 16 20:03:17.343987 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:17.343971 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" Apr 16 20:03:17.345046 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:17.345024 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-j2rp7" event={"ID":"071e9ff3-ae9d-4c24-8f40-eb1290ff97ae","Type":"ContainerStarted","Data":"dcf39cdffab81bcfb7cff449db025200e5b3038ec94e3e23bdf7346cc5e37b26"} Apr 16 20:03:17.345167 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:17.345153 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-j2rp7" Apr 16 20:03:17.357888 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:17.357850 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" podStartSLOduration=2.07882402 podStartE2EDuration="6.357838582s" podCreationTimestamp="2026-04-16 20:03:11 +0000 UTC" firstStartedPulling="2026-04-16 20:03:12.313714078 +0000 UTC m=+556.514424137" lastFinishedPulling="2026-04-16 20:03:16.59272864 +0000 UTC m=+560.793438699" observedRunningTime="2026-04-16 20:03:17.356402566 +0000 UTC m=+561.557112638" watchObservedRunningTime="2026-04-16 20:03:17.357838582 +0000 UTC m=+561.558548707" Apr 16 20:03:17.371703 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:17.371662 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" podStartSLOduration=2.816184669 podStartE2EDuration="6.371650889s" podCreationTimestamp="2026-04-16 20:03:11 
+0000 UTC" firstStartedPulling="2026-04-16 20:03:12.338771719 +0000 UTC m=+556.539481775" lastFinishedPulling="2026-04-16 20:03:15.894237935 +0000 UTC m=+560.094947995" observedRunningTime="2026-04-16 20:03:17.370233285 +0000 UTC m=+561.570943363" watchObservedRunningTime="2026-04-16 20:03:17.371650889 +0000 UTC m=+561.572360967" Apr 16 20:03:17.386047 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:17.385999 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-j2rp7" podStartSLOduration=1.410363443 podStartE2EDuration="6.385988356s" podCreationTimestamp="2026-04-16 20:03:11 +0000 UTC" firstStartedPulling="2026-04-16 20:03:11.619535757 +0000 UTC m=+555.820245816" lastFinishedPulling="2026-04-16 20:03:16.595160657 +0000 UTC m=+560.795870729" observedRunningTime="2026-04-16 20:03:17.383509743 +0000 UTC m=+561.584219821" watchObservedRunningTime="2026-04-16 20:03:17.385988356 +0000 UTC m=+561.586698451" Apr 16 20:03:23.354789 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:23.354760 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-j2rp7" Apr 16 20:03:48.354301 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:48.354269 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" Apr 16 20:03:48.357203 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:48.357180 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" Apr 16 20:03:49.538460 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.538427 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-pnt5q"] Apr 16 20:03:49.538857 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.538646 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" 
podUID="72600703-706e-4085-821e-1f77b50f91be" containerName="manager" containerID="cri-o://383af2127d896733e177c6be839e6579e9b711adec7901acaf6c863493237409" gracePeriod=10 Apr 16 20:03:49.561247 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.561222 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-z5xll"] Apr 16 20:03:49.564533 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.564516 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-z5xll" Apr 16 20:03:49.579676 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.579651 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-z5xll"] Apr 16 20:03:49.641561 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.641537 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw8lf\" (UniqueName: \"kubernetes.io/projected/88c60637-16c9-442b-badf-a5f053bc0cdf-kube-api-access-qw8lf\") pod \"kserve-controller-manager-659c8cbdc-z5xll\" (UID: \"88c60637-16c9-442b-badf-a5f053bc0cdf\") " pod="kserve/kserve-controller-manager-659c8cbdc-z5xll" Apr 16 20:03:49.641676 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.641592 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88c60637-16c9-442b-badf-a5f053bc0cdf-cert\") pod \"kserve-controller-manager-659c8cbdc-z5xll\" (UID: \"88c60637-16c9-442b-badf-a5f053bc0cdf\") " pod="kserve/kserve-controller-manager-659c8cbdc-z5xll" Apr 16 20:03:49.742873 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.742842 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw8lf\" (UniqueName: \"kubernetes.io/projected/88c60637-16c9-442b-badf-a5f053bc0cdf-kube-api-access-qw8lf\") pod \"kserve-controller-manager-659c8cbdc-z5xll\" 
(UID: \"88c60637-16c9-442b-badf-a5f053bc0cdf\") " pod="kserve/kserve-controller-manager-659c8cbdc-z5xll" Apr 16 20:03:49.743044 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.742905 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88c60637-16c9-442b-badf-a5f053bc0cdf-cert\") pod \"kserve-controller-manager-659c8cbdc-z5xll\" (UID: \"88c60637-16c9-442b-badf-a5f053bc0cdf\") " pod="kserve/kserve-controller-manager-659c8cbdc-z5xll" Apr 16 20:03:49.745521 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.745495 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88c60637-16c9-442b-badf-a5f053bc0cdf-cert\") pod \"kserve-controller-manager-659c8cbdc-z5xll\" (UID: \"88c60637-16c9-442b-badf-a5f053bc0cdf\") " pod="kserve/kserve-controller-manager-659c8cbdc-z5xll" Apr 16 20:03:49.750999 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.750969 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw8lf\" (UniqueName: \"kubernetes.io/projected/88c60637-16c9-442b-badf-a5f053bc0cdf-kube-api-access-qw8lf\") pod \"kserve-controller-manager-659c8cbdc-z5xll\" (UID: \"88c60637-16c9-442b-badf-a5f053bc0cdf\") " pod="kserve/kserve-controller-manager-659c8cbdc-z5xll" Apr 16 20:03:49.777533 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.777514 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" Apr 16 20:03:49.843784 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.843719 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72600703-706e-4085-821e-1f77b50f91be-cert\") pod \"72600703-706e-4085-821e-1f77b50f91be\" (UID: \"72600703-706e-4085-821e-1f77b50f91be\") " Apr 16 20:03:49.843910 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.843789 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtvml\" (UniqueName: \"kubernetes.io/projected/72600703-706e-4085-821e-1f77b50f91be-kube-api-access-vtvml\") pod \"72600703-706e-4085-821e-1f77b50f91be\" (UID: \"72600703-706e-4085-821e-1f77b50f91be\") " Apr 16 20:03:49.845887 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.845861 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72600703-706e-4085-821e-1f77b50f91be-cert" (OuterVolumeSpecName: "cert") pod "72600703-706e-4085-821e-1f77b50f91be" (UID: "72600703-706e-4085-821e-1f77b50f91be"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:03:49.845991 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.845917 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72600703-706e-4085-821e-1f77b50f91be-kube-api-access-vtvml" (OuterVolumeSpecName: "kube-api-access-vtvml") pod "72600703-706e-4085-821e-1f77b50f91be" (UID: "72600703-706e-4085-821e-1f77b50f91be"). InnerVolumeSpecName "kube-api-access-vtvml". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:03:49.920214 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.920177 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-z5xll" Apr 16 20:03:49.945121 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.945097 2569 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72600703-706e-4085-821e-1f77b50f91be-cert\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:03:49.945121 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:49.945121 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vtvml\" (UniqueName: \"kubernetes.io/projected/72600703-706e-4085-821e-1f77b50f91be-kube-api-access-vtvml\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:03:50.037280 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:50.037252 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-z5xll"] Apr 16 20:03:50.038813 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:03:50.038783 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88c60637_16c9_442b_badf_a5f053bc0cdf.slice/crio-0c3b2a64f81f918f6e4396e239f969231924f4efc66bc097d6682d6f000301a7 WatchSource:0}: Error finding container 0c3b2a64f81f918f6e4396e239f969231924f4efc66bc097d6682d6f000301a7: Status 404 returned error can't find the container with id 0c3b2a64f81f918f6e4396e239f969231924f4efc66bc097d6682d6f000301a7 Apr 16 20:03:50.464358 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:50.464318 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-z5xll" event={"ID":"88c60637-16c9-442b-badf-a5f053bc0cdf","Type":"ContainerStarted","Data":"bc03f415e4b801a3e7fe9e65487b11c64bc609a632fdc0952dcc54d443377ccf"} Apr 16 20:03:50.464358 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:50.464358 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-z5xll" 
event={"ID":"88c60637-16c9-442b-badf-a5f053bc0cdf","Type":"ContainerStarted","Data":"0c3b2a64f81f918f6e4396e239f969231924f4efc66bc097d6682d6f000301a7"} Apr 16 20:03:50.464678 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:50.464448 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-659c8cbdc-z5xll" Apr 16 20:03:50.465564 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:50.465541 2569 generic.go:358] "Generic (PLEG): container finished" podID="72600703-706e-4085-821e-1f77b50f91be" containerID="383af2127d896733e177c6be839e6579e9b711adec7901acaf6c863493237409" exitCode=0 Apr 16 20:03:50.465666 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:50.465589 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" Apr 16 20:03:50.465666 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:50.465602 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" event={"ID":"72600703-706e-4085-821e-1f77b50f91be","Type":"ContainerDied","Data":"383af2127d896733e177c6be839e6579e9b711adec7901acaf6c863493237409"} Apr 16 20:03:50.465666 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:50.465625 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-pnt5q" event={"ID":"72600703-706e-4085-821e-1f77b50f91be","Type":"ContainerDied","Data":"4344f4fbdc8593282b32ce9e36957c81078020b1818d4e1ec07193aa955fa6c8"} Apr 16 20:03:50.465666 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:50.465645 2569 scope.go:117] "RemoveContainer" containerID="383af2127d896733e177c6be839e6579e9b711adec7901acaf6c863493237409" Apr 16 20:03:50.473986 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:50.473965 2569 scope.go:117] "RemoveContainer" containerID="383af2127d896733e177c6be839e6579e9b711adec7901acaf6c863493237409" Apr 16 20:03:50.474268 ip-10-0-138-142 kubenswrapper[2569]: 
E0416 20:03:50.474249 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383af2127d896733e177c6be839e6579e9b711adec7901acaf6c863493237409\": container with ID starting with 383af2127d896733e177c6be839e6579e9b711adec7901acaf6c863493237409 not found: ID does not exist" containerID="383af2127d896733e177c6be839e6579e9b711adec7901acaf6c863493237409" Apr 16 20:03:50.474343 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:50.474275 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383af2127d896733e177c6be839e6579e9b711adec7901acaf6c863493237409"} err="failed to get container status \"383af2127d896733e177c6be839e6579e9b711adec7901acaf6c863493237409\": rpc error: code = NotFound desc = could not find container \"383af2127d896733e177c6be839e6579e9b711adec7901acaf6c863493237409\": container with ID starting with 383af2127d896733e177c6be839e6579e9b711adec7901acaf6c863493237409 not found: ID does not exist" Apr 16 20:03:50.479755 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:50.479715 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-659c8cbdc-z5xll" podStartSLOduration=1.1578350020000001 podStartE2EDuration="1.479700303s" podCreationTimestamp="2026-04-16 20:03:49 +0000 UTC" firstStartedPulling="2026-04-16 20:03:50.040096545 +0000 UTC m=+594.240806601" lastFinishedPulling="2026-04-16 20:03:50.361961835 +0000 UTC m=+594.562671902" observedRunningTime="2026-04-16 20:03:50.47836205 +0000 UTC m=+594.679072128" watchObservedRunningTime="2026-04-16 20:03:50.479700303 +0000 UTC m=+594.680410385" Apr 16 20:03:50.493135 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:50.493110 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-pnt5q"] Apr 16 20:03:50.498774 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:50.498752 2569 kubelet.go:2547] "SyncLoop 
REMOVE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-pnt5q"] Apr 16 20:03:52.370976 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:52.370943 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72600703-706e-4085-821e-1f77b50f91be" path="/var/lib/kubelet/pods/72600703-706e-4085-821e-1f77b50f91be/volumes" Apr 16 20:03:56.291513 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:56.291483 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 20:03:56.292067 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:03:56.292050 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 20:04:21.475757 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:21.475729 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-659c8cbdc-z5xll" Apr 16 20:04:22.410817 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.410780 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-46vnv"] Apr 16 20:04:22.411184 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.411168 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72600703-706e-4085-821e-1f77b50f91be" containerName="manager" Apr 16 20:04:22.411256 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.411185 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="72600703-706e-4085-821e-1f77b50f91be" containerName="manager" Apr 16 20:04:22.411256 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.411254 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="72600703-706e-4085-821e-1f77b50f91be" containerName="manager" Apr 16 20:04:22.414269 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.414243 2569 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-46vnv" Apr 16 20:04:22.416643 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.416618 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 20:04:22.416774 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.416689 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-8trbt\"" Apr 16 20:04:22.425786 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.425763 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-46vnv"] Apr 16 20:04:22.428357 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.428333 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-66fmq"] Apr 16 20:04:22.431542 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.431522 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-66fmq" Apr 16 20:04:22.433690 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.433667 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 20:04:22.433977 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.433961 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-k48mk\"" Apr 16 20:04:22.444105 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.444077 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-66fmq"] Apr 16 20:04:22.515188 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.515147 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gchlk\" (UniqueName: \"kubernetes.io/projected/f12ecef8-38fa-40e9-8bc9-14c24a45629e-kube-api-access-gchlk\") pod \"model-serving-api-86f7b4b499-46vnv\" (UID: \"f12ecef8-38fa-40e9-8bc9-14c24a45629e\") " pod="kserve/model-serving-api-86f7b4b499-46vnv" Apr 16 20:04:22.515627 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.515217 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f12ecef8-38fa-40e9-8bc9-14c24a45629e-tls-certs\") pod \"model-serving-api-86f7b4b499-46vnv\" (UID: \"f12ecef8-38fa-40e9-8bc9-14c24a45629e\") " pod="kserve/model-serving-api-86f7b4b499-46vnv" Apr 16 20:04:22.616128 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.616091 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cdece77-36f4-4852-8e16-243f458ee895-cert\") pod \"odh-model-controller-696fc77849-66fmq\" (UID: \"9cdece77-36f4-4852-8e16-243f458ee895\") " pod="kserve/odh-model-controller-696fc77849-66fmq" Apr 16 
20:04:22.616311 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.616142 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gchlk\" (UniqueName: \"kubernetes.io/projected/f12ecef8-38fa-40e9-8bc9-14c24a45629e-kube-api-access-gchlk\") pod \"model-serving-api-86f7b4b499-46vnv\" (UID: \"f12ecef8-38fa-40e9-8bc9-14c24a45629e\") " pod="kserve/model-serving-api-86f7b4b499-46vnv" Apr 16 20:04:22.616311 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.616200 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srm2c\" (UniqueName: \"kubernetes.io/projected/9cdece77-36f4-4852-8e16-243f458ee895-kube-api-access-srm2c\") pod \"odh-model-controller-696fc77849-66fmq\" (UID: \"9cdece77-36f4-4852-8e16-243f458ee895\") " pod="kserve/odh-model-controller-696fc77849-66fmq" Apr 16 20:04:22.616311 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.616244 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f12ecef8-38fa-40e9-8bc9-14c24a45629e-tls-certs\") pod \"model-serving-api-86f7b4b499-46vnv\" (UID: \"f12ecef8-38fa-40e9-8bc9-14c24a45629e\") " pod="kserve/model-serving-api-86f7b4b499-46vnv" Apr 16 20:04:22.616466 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:04:22.616376 2569 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 16 20:04:22.616466 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:04:22.616437 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f12ecef8-38fa-40e9-8bc9-14c24a45629e-tls-certs podName:f12ecef8-38fa-40e9-8bc9-14c24a45629e nodeName:}" failed. No retries permitted until 2026-04-16 20:04:23.116421561 +0000 UTC m=+627.317131617 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/f12ecef8-38fa-40e9-8bc9-14c24a45629e-tls-certs") pod "model-serving-api-86f7b4b499-46vnv" (UID: "f12ecef8-38fa-40e9-8bc9-14c24a45629e") : secret "model-serving-api-tls" not found Apr 16 20:04:22.625753 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.625730 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gchlk\" (UniqueName: \"kubernetes.io/projected/f12ecef8-38fa-40e9-8bc9-14c24a45629e-kube-api-access-gchlk\") pod \"model-serving-api-86f7b4b499-46vnv\" (UID: \"f12ecef8-38fa-40e9-8bc9-14c24a45629e\") " pod="kserve/model-serving-api-86f7b4b499-46vnv" Apr 16 20:04:22.717165 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.717078 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srm2c\" (UniqueName: \"kubernetes.io/projected/9cdece77-36f4-4852-8e16-243f458ee895-kube-api-access-srm2c\") pod \"odh-model-controller-696fc77849-66fmq\" (UID: \"9cdece77-36f4-4852-8e16-243f458ee895\") " pod="kserve/odh-model-controller-696fc77849-66fmq" Apr 16 20:04:22.717317 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.717213 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cdece77-36f4-4852-8e16-243f458ee895-cert\") pod \"odh-model-controller-696fc77849-66fmq\" (UID: \"9cdece77-36f4-4852-8e16-243f458ee895\") " pod="kserve/odh-model-controller-696fc77849-66fmq" Apr 16 20:04:22.717417 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:04:22.717325 2569 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 20:04:22.717417 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:04:22.717380 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cdece77-36f4-4852-8e16-243f458ee895-cert podName:9cdece77-36f4-4852-8e16-243f458ee895 
nodeName:}" failed. No retries permitted until 2026-04-16 20:04:23.217362964 +0000 UTC m=+627.418073026 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cdece77-36f4-4852-8e16-243f458ee895-cert") pod "odh-model-controller-696fc77849-66fmq" (UID: "9cdece77-36f4-4852-8e16-243f458ee895") : secret "odh-model-controller-webhook-cert" not found Apr 16 20:04:22.728739 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:22.728712 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srm2c\" (UniqueName: \"kubernetes.io/projected/9cdece77-36f4-4852-8e16-243f458ee895-kube-api-access-srm2c\") pod \"odh-model-controller-696fc77849-66fmq\" (UID: \"9cdece77-36f4-4852-8e16-243f458ee895\") " pod="kserve/odh-model-controller-696fc77849-66fmq" Apr 16 20:04:23.121707 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:23.121627 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f12ecef8-38fa-40e9-8bc9-14c24a45629e-tls-certs\") pod \"model-serving-api-86f7b4b499-46vnv\" (UID: \"f12ecef8-38fa-40e9-8bc9-14c24a45629e\") " pod="kserve/model-serving-api-86f7b4b499-46vnv" Apr 16 20:04:23.124030 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:23.123985 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f12ecef8-38fa-40e9-8bc9-14c24a45629e-tls-certs\") pod \"model-serving-api-86f7b4b499-46vnv\" (UID: \"f12ecef8-38fa-40e9-8bc9-14c24a45629e\") " pod="kserve/model-serving-api-86f7b4b499-46vnv" Apr 16 20:04:23.222673 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:23.222646 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cdece77-36f4-4852-8e16-243f458ee895-cert\") pod \"odh-model-controller-696fc77849-66fmq\" (UID: \"9cdece77-36f4-4852-8e16-243f458ee895\") " 
pod="kserve/odh-model-controller-696fc77849-66fmq" Apr 16 20:04:23.224933 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:23.224905 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cdece77-36f4-4852-8e16-243f458ee895-cert\") pod \"odh-model-controller-696fc77849-66fmq\" (UID: \"9cdece77-36f4-4852-8e16-243f458ee895\") " pod="kserve/odh-model-controller-696fc77849-66fmq" Apr 16 20:04:23.326270 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:23.326243 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-46vnv" Apr 16 20:04:23.344945 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:23.344921 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-66fmq" Apr 16 20:04:23.458604 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:23.458542 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-46vnv"] Apr 16 20:04:23.458968 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:04:23.458943 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf12ecef8_38fa_40e9_8bc9_14c24a45629e.slice/crio-a9adf17daf0753bd14b5482510aff51a5ca57f379b0f503a01f2552cffdf87d4 WatchSource:0}: Error finding container a9adf17daf0753bd14b5482510aff51a5ca57f379b0f503a01f2552cffdf87d4: Status 404 returned error can't find the container with id a9adf17daf0753bd14b5482510aff51a5ca57f379b0f503a01f2552cffdf87d4 Apr 16 20:04:23.480618 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:23.480592 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-66fmq"] Apr 16 20:04:23.481412 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:04:23.481374 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cdece77_36f4_4852_8e16_243f458ee895.slice/crio-ac4803f2026644c7bd1707740532845bcfe9fc53be27dad92bfa22fa6694a135 WatchSource:0}: Error finding container ac4803f2026644c7bd1707740532845bcfe9fc53be27dad92bfa22fa6694a135: Status 404 returned error can't find the container with id ac4803f2026644c7bd1707740532845bcfe9fc53be27dad92bfa22fa6694a135 Apr 16 20:04:23.586035 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:23.585993 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-66fmq" event={"ID":"9cdece77-36f4-4852-8e16-243f458ee895","Type":"ContainerStarted","Data":"ac4803f2026644c7bd1707740532845bcfe9fc53be27dad92bfa22fa6694a135"} Apr 16 20:04:23.586980 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:23.586960 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-46vnv" event={"ID":"f12ecef8-38fa-40e9-8bc9-14c24a45629e","Type":"ContainerStarted","Data":"a9adf17daf0753bd14b5482510aff51a5ca57f379b0f503a01f2552cffdf87d4"} Apr 16 20:04:26.601632 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:26.601530 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-66fmq" event={"ID":"9cdece77-36f4-4852-8e16-243f458ee895","Type":"ContainerStarted","Data":"8780f788deda59b94e8bc696da73294beace1e5c03650b3a0ce61b866e92a105"} Apr 16 20:04:26.602104 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:26.601633 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-66fmq" Apr 16 20:04:26.602912 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:26.602882 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-46vnv" event={"ID":"f12ecef8-38fa-40e9-8bc9-14c24a45629e","Type":"ContainerStarted","Data":"7e860ac8baceaecf368de818eb85e879cd43015fbc7eea3ba5e4f810fc171241"} Apr 16 20:04:26.603056 
ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:26.602999 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-46vnv" Apr 16 20:04:26.618985 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:26.618937 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-66fmq" podStartSLOduration=1.698472261 podStartE2EDuration="4.618926112s" podCreationTimestamp="2026-04-16 20:04:22 +0000 UTC" firstStartedPulling="2026-04-16 20:04:23.48260225 +0000 UTC m=+627.683312307" lastFinishedPulling="2026-04-16 20:04:26.4030561 +0000 UTC m=+630.603766158" observedRunningTime="2026-04-16 20:04:26.617534261 +0000 UTC m=+630.818244339" watchObservedRunningTime="2026-04-16 20:04:26.618926112 +0000 UTC m=+630.819636189" Apr 16 20:04:26.634892 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:26.634837 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-46vnv" podStartSLOduration=1.747369011 podStartE2EDuration="4.634825744s" podCreationTimestamp="2026-04-16 20:04:22 +0000 UTC" firstStartedPulling="2026-04-16 20:04:23.461354002 +0000 UTC m=+627.662064059" lastFinishedPulling="2026-04-16 20:04:26.348810728 +0000 UTC m=+630.549520792" observedRunningTime="2026-04-16 20:04:26.63284195 +0000 UTC m=+630.833552027" watchObservedRunningTime="2026-04-16 20:04:26.634825744 +0000 UTC m=+630.835535823" Apr 16 20:04:37.609518 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:37.609485 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-66fmq" Apr 16 20:04:37.611401 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:37.611379 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-46vnv" Apr 16 20:04:38.454928 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:38.454892 2569 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["kserve/s3-init-jd2c5"] Apr 16 20:04:38.458313 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:38.458296 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jd2c5" Apr 16 20:04:38.464681 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:38.464649 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jd2c5"] Apr 16 20:04:38.540641 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:38.540608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx9hd\" (UniqueName: \"kubernetes.io/projected/e074c26e-4516-4e8d-81a2-476b027fe153-kube-api-access-tx9hd\") pod \"s3-init-jd2c5\" (UID: \"e074c26e-4516-4e8d-81a2-476b027fe153\") " pod="kserve/s3-init-jd2c5" Apr 16 20:04:38.641447 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:38.641400 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx9hd\" (UniqueName: \"kubernetes.io/projected/e074c26e-4516-4e8d-81a2-476b027fe153-kube-api-access-tx9hd\") pod \"s3-init-jd2c5\" (UID: \"e074c26e-4516-4e8d-81a2-476b027fe153\") " pod="kserve/s3-init-jd2c5" Apr 16 20:04:38.648997 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:38.648970 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx9hd\" (UniqueName: \"kubernetes.io/projected/e074c26e-4516-4e8d-81a2-476b027fe153-kube-api-access-tx9hd\") pod \"s3-init-jd2c5\" (UID: \"e074c26e-4516-4e8d-81a2-476b027fe153\") " pod="kserve/s3-init-jd2c5" Apr 16 20:04:38.769263 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:38.769178 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-jd2c5" Apr 16 20:04:38.891817 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:38.891749 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jd2c5"] Apr 16 20:04:38.893973 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:04:38.893943 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode074c26e_4516_4e8d_81a2_476b027fe153.slice/crio-8d96cf052c1036e5bfdc502510514d0e804723ce2281722e5afa74bc8cf5b8ef WatchSource:0}: Error finding container 8d96cf052c1036e5bfdc502510514d0e804723ce2281722e5afa74bc8cf5b8ef: Status 404 returned error can't find the container with id 8d96cf052c1036e5bfdc502510514d0e804723ce2281722e5afa74bc8cf5b8ef Apr 16 20:04:39.653282 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:39.653237 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jd2c5" event={"ID":"e074c26e-4516-4e8d-81a2-476b027fe153","Type":"ContainerStarted","Data":"8d96cf052c1036e5bfdc502510514d0e804723ce2281722e5afa74bc8cf5b8ef"} Apr 16 20:04:43.673413 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:43.673374 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jd2c5" event={"ID":"e074c26e-4516-4e8d-81a2-476b027fe153","Type":"ContainerStarted","Data":"4e635949b7675e39043583566bc90bf2b7c887884cbdd67dc9c3ff1cfb8f9a3b"} Apr 16 20:04:43.687899 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:43.687810 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-jd2c5" podStartSLOduration=1.2987681850000001 podStartE2EDuration="5.687796315s" podCreationTimestamp="2026-04-16 20:04:38 +0000 UTC" firstStartedPulling="2026-04-16 20:04:38.895632909 +0000 UTC m=+643.096342966" lastFinishedPulling="2026-04-16 20:04:43.284661038 +0000 UTC m=+647.485371096" observedRunningTime="2026-04-16 20:04:43.687748365 +0000 UTC m=+647.888458444" watchObservedRunningTime="2026-04-16 
20:04:43.687796315 +0000 UTC m=+647.888506392" Apr 16 20:04:46.686051 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:46.685995 2569 generic.go:358] "Generic (PLEG): container finished" podID="e074c26e-4516-4e8d-81a2-476b027fe153" containerID="4e635949b7675e39043583566bc90bf2b7c887884cbdd67dc9c3ff1cfb8f9a3b" exitCode=0 Apr 16 20:04:46.686399 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:46.686068 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jd2c5" event={"ID":"e074c26e-4516-4e8d-81a2-476b027fe153","Type":"ContainerDied","Data":"4e635949b7675e39043583566bc90bf2b7c887884cbdd67dc9c3ff1cfb8f9a3b"} Apr 16 20:04:47.825862 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:47.825837 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jd2c5" Apr 16 20:04:47.923914 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:47.923887 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx9hd\" (UniqueName: \"kubernetes.io/projected/e074c26e-4516-4e8d-81a2-476b027fe153-kube-api-access-tx9hd\") pod \"e074c26e-4516-4e8d-81a2-476b027fe153\" (UID: \"e074c26e-4516-4e8d-81a2-476b027fe153\") " Apr 16 20:04:47.925863 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:47.925830 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e074c26e-4516-4e8d-81a2-476b027fe153-kube-api-access-tx9hd" (OuterVolumeSpecName: "kube-api-access-tx9hd") pod "e074c26e-4516-4e8d-81a2-476b027fe153" (UID: "e074c26e-4516-4e8d-81a2-476b027fe153"). InnerVolumeSpecName "kube-api-access-tx9hd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:04:48.025341 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:48.025277 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tx9hd\" (UniqueName: \"kubernetes.io/projected/e074c26e-4516-4e8d-81a2-476b027fe153-kube-api-access-tx9hd\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:04:48.694914 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:48.694880 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jd2c5" Apr 16 20:04:48.695104 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:48.694918 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jd2c5" event={"ID":"e074c26e-4516-4e8d-81a2-476b027fe153","Type":"ContainerDied","Data":"8d96cf052c1036e5bfdc502510514d0e804723ce2281722e5afa74bc8cf5b8ef"} Apr 16 20:04:48.695104 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:48.694957 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d96cf052c1036e5bfdc502510514d0e804723ce2281722e5afa74bc8cf5b8ef" Apr 16 20:04:58.909322 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:58.909283 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"] Apr 16 20:04:58.909872 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:58.909753 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e074c26e-4516-4e8d-81a2-476b027fe153" containerName="s3-init" Apr 16 20:04:58.909872 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:58.909772 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e074c26e-4516-4e8d-81a2-476b027fe153" containerName="s3-init" Apr 16 20:04:58.909872 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:58.909844 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e074c26e-4516-4e8d-81a2-476b027fe153" containerName="s3-init" Apr 16 
20:04:58.923819 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:58.923797 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99" Apr 16 20:04:58.926619 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:58.926591 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:04:58.926843 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:58.926612 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 20:04:58.927130 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:58.926976 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"] Apr 16 20:04:58.927130 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:58.926608 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 16 20:04:58.927130 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:58.926711 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-4k892\"" Apr 16 20:04:59.017316 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.017269 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99" Apr 16 20:04:59.017316 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.017323 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99" Apr 16 20:04:59.017567 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.017378 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99" Apr 16 20:04:59.017567 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.017415 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99" Apr 16 20:04:59.017567 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.017504 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99" Apr 16 20:04:59.017715 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.017576 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.017715 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.017651 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.017715 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.017676 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.017715 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.017704 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sszc6\" (UniqueName: \"kubernetes.io/projected/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-kube-api-access-sszc6\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.118291 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.118252 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.118565 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.118332 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.118565 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.118360 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.118565 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.118386 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sszc6\" (UniqueName: \"kubernetes.io/projected/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-kube-api-access-sszc6\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.118565 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.118421 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.118565 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.118450 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.118565 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.118488 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.118565 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.118518 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.118565 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.118553 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.119058 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.118840 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.119058 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.118893 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.119058 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.119044 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.119262 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.119244 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.119340 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.119316 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.121073 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.121056 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.121307 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.121283 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.128900 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.128869 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.131568 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.131540 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sszc6\" (UniqueName: \"kubernetes.io/projected/a29a5ff4-0cb8-4250-83fc-e9c21ac602db-kube-api-access-sszc6\") pod \"router-gateway-1-openshift-default-6c59fbf55c-sxm99\" (UID: \"a29a5ff4-0cb8-4250-83fc-e9c21ac602db\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.237323 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.237235 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:04:59.380421 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.380392 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"]
Apr 16 20:04:59.381749 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:04:59.381720 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda29a5ff4_0cb8_4250_83fc_e9c21ac602db.slice/crio-702117db3f5288b5d9a969ca6560ac404440ad00e8124f7713582d046206282f WatchSource:0}: Error finding container 702117db3f5288b5d9a969ca6560ac404440ad00e8124f7713582d046206282f: Status 404 returned error can't find the container with id 702117db3f5288b5d9a969ca6560ac404440ad00e8124f7713582d046206282f
Apr 16 20:04:59.383701 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.383683 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:04:59.735814 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:04:59.735778 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99" event={"ID":"a29a5ff4-0cb8-4250-83fc-e9c21ac602db","Type":"ContainerStarted","Data":"702117db3f5288b5d9a969ca6560ac404440ad00e8124f7713582d046206282f"}
Apr 16 20:05:03.902999 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:03.902962 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 20:05:03.903244 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:03.903053 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 20:05:03.903244 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:03.903085 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 20:05:04.757371 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:04.757335 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99" event={"ID":"a29a5ff4-0cb8-4250-83fc-e9c21ac602db","Type":"ContainerStarted","Data":"d883b296d6ddbe0332914d7e500a29af248ed22d53472968a00ecb213ae8d66f"}
Apr 16 20:05:04.776956 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:04.776896 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99" podStartSLOduration=2.257948851 podStartE2EDuration="6.776876386s" podCreationTimestamp="2026-04-16 20:04:58 +0000 UTC" firstStartedPulling="2026-04-16 20:04:59.383824361 +0000 UTC m=+663.584534420" lastFinishedPulling="2026-04-16 20:05:03.902751896 +0000 UTC m=+668.103461955" observedRunningTime="2026-04-16 20:05:04.775291362 +0000 UTC m=+668.976001440" watchObservedRunningTime="2026-04-16 20:05:04.776876386 +0000 UTC m=+668.977586465"
Apr 16 20:05:05.237807 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:05.237775 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:05:05.242259 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:05.242237 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:05:05.761928 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:05.761894 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:05:05.763099 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:05.763067 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-sxm99"
Apr 16 20:05:10.613837 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.613801 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"]
Apr 16 20:05:10.617821 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.617797 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.620258 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.620237 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 16 20:05:10.620333 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.620273 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pqxtj\""
Apr 16 20:05:10.628219 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.628195 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"]
Apr 16 20:05:10.722639 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.722600 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.722785 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.722647 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-model-cache\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.722785 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.722736 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c2nl\" (UniqueName: \"kubernetes.io/projected/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-kube-api-access-7c2nl\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.722785 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.722774 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-dshm\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.722921 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.722836 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.722921 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.722890 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-home\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.823705 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.823650 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-model-cache\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.823885 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.823746 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c2nl\" (UniqueName: \"kubernetes.io/projected/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-kube-api-access-7c2nl\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.823885 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.823788 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-dshm\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.823885 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.823841 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.824053 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.823900 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-home\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.824053 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.823930 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.824178 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.824155 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-model-cache\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.824239 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.824218 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-home\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.824382 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.824273 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.826258 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.826235 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-dshm\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.826405 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.826385 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.831999 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.831973 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c2nl\" (UniqueName: \"kubernetes.io/projected/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-kube-api-access-7c2nl\") pod \"scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:10.929949 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:10.929902 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:11.052741 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.052710 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"]
Apr 16 20:05:11.053888 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:05:11.053851 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ea34e91_d306_41b1_a697_1c7c6df2a2dd.slice/crio-62ec07ebe005448317701eede4d3badba65ce0d54717f7b6958aebfe8c410e19 WatchSource:0}: Error finding container 62ec07ebe005448317701eede4d3badba65ce0d54717f7b6958aebfe8c410e19: Status 404 returned error can't find the container with id 62ec07ebe005448317701eede4d3badba65ce0d54717f7b6958aebfe8c410e19
Apr 16 20:05:11.118627 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.118593 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"]
Apr 16 20:05:11.123528 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.123501 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.127053 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.126994 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-2b7vc\""
Apr 16 20:05:11.134691 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.134666 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"]
Apr 16 20:05:11.227540 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.227449 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s66bx\" (UniqueName: \"kubernetes.io/projected/7a8bb6ec-d022-43e4-92ee-b5aa59710619-kube-api-access-s66bx\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.227540 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.227497 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.227763 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.227633 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.227763 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.227676 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.227763 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.227734 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.227763 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.227753 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.328278 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.328242 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.328466 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.328415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.328538 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.328468 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.328600 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.328532 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.328600 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.328560 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.328696 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.328607 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s66bx\" (UniqueName: \"kubernetes.io/projected/7a8bb6ec-d022-43e4-92ee-b5aa59710619-kube-api-access-s66bx\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.328696 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.328670 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.328794 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.328732 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.328865 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.328846 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.328920 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.328907 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.330935 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.330915 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.336938 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.336913 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s66bx\" (UniqueName: \"kubernetes.io/projected/7a8bb6ec-d022-43e4-92ee-b5aa59710619-kube-api-access-s66bx\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.434561 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.434530 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:05:11.571981 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.571952 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"]
Apr 16 20:05:11.572770 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:05:11.572738 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a8bb6ec_d022_43e4_92ee_b5aa59710619.slice/crio-1889b55b7a3f85689d127001af5514ecb63ec062efbb7b70e092ade84640e9b6 WatchSource:0}: Error finding container 1889b55b7a3f85689d127001af5514ecb63ec062efbb7b70e092ade84640e9b6: Status 404 returned error can't find the container with id 1889b55b7a3f85689d127001af5514ecb63ec062efbb7b70e092ade84640e9b6
Apr 16 20:05:11.785717 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.785638 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" event={"ID":"7a8bb6ec-d022-43e4-92ee-b5aa59710619","Type":"ContainerStarted","Data":"1889b55b7a3f85689d127001af5514ecb63ec062efbb7b70e092ade84640e9b6"}
Apr 16 20:05:11.786842 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:11.786818 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh" event={"ID":"8ea34e91-d306-41b1-a697-1c7c6df2a2dd","Type":"ContainerStarted","Data":"62ec07ebe005448317701eede4d3badba65ce0d54717f7b6958aebfe8c410e19"}
Apr 16 20:05:15.804888 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:15.804847 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh" event={"ID":"8ea34e91-d306-41b1-a697-1c7c6df2a2dd","Type":"ContainerStarted","Data":"2961d6f83368a8ad2abd53ff9eefea86371cf3979ee21ffd730d6420bff1ebd9"}
Apr 16 20:05:15.806194 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:15.806166 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" event={"ID":"7a8bb6ec-d022-43e4-92ee-b5aa59710619","Type":"ContainerStarted","Data":"e111d465d48f654f9449885111d2f1e5941da3afba3f07b73e4c5a4300a10efa"}
Apr 16 20:05:16.811765 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:16.811724 2569 generic.go:358] "Generic (PLEG): container finished" podID="7a8bb6ec-d022-43e4-92ee-b5aa59710619" containerID="e111d465d48f654f9449885111d2f1e5941da3afba3f07b73e4c5a4300a10efa" exitCode=0
Apr 16 20:05:16.812169 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:16.811821 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" event={"ID":"7a8bb6ec-d022-43e4-92ee-b5aa59710619","Type":"ContainerDied","Data":"e111d465d48f654f9449885111d2f1e5941da3afba3f07b73e4c5a4300a10efa"}
Apr 16 20:05:18.825856 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:18.825811 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" event={"ID":"7a8bb6ec-d022-43e4-92ee-b5aa59710619","Type":"ContainerStarted","Data":"f1b3fbe5bcb34bd052b4dfa7028ec1d690b8c22a2dc2a168cea68a909ecc46e1"}
Apr 16 20:05:20.836908 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:20.836868 2569 generic.go:358] "Generic (PLEG): container finished" podID="8ea34e91-d306-41b1-a697-1c7c6df2a2dd" containerID="2961d6f83368a8ad2abd53ff9eefea86371cf3979ee21ffd730d6420bff1ebd9" exitCode=0
Apr 16 20:05:20.837383 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:20.836902 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh" event={"ID":"8ea34e91-d306-41b1-a697-1c7c6df2a2dd","Type":"ContainerDied","Data":"2961d6f83368a8ad2abd53ff9eefea86371cf3979ee21ffd730d6420bff1ebd9"}
Apr 16 20:05:22.848911 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:22.848875 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh" event={"ID":"8ea34e91-d306-41b1-a697-1c7c6df2a2dd","Type":"ContainerStarted","Data":"8f25965c1f1f358ec29ee5937c85c454b1874363e23d6241163deb15968b02ed"}
Apr 16 20:05:22.870946 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:22.870860 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh" podStartSLOduration=1.848914699 podStartE2EDuration="12.87084077s" podCreationTimestamp="2026-04-16 20:05:10 +0000 UTC" firstStartedPulling="2026-04-16 20:05:11.055825747 +0000 UTC m=+675.256535805" lastFinishedPulling="2026-04-16 20:05:22.077751811 +0000 UTC m=+686.278461876" observedRunningTime="2026-04-16 20:05:22.86655806 +0000 UTC m=+687.067268138" watchObservedRunningTime="2026-04-16 20:05:22.87084077 +0000 UTC m=+687.071550882"
Apr 16 20:05:30.930475 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:30.930430 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:30.930475 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:30.930486 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:30.948195 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:30.948166 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"
Apr 16 20:05:31.901999
ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:31.901964 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh" Apr 16 20:05:48.965762 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:48.965711 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" event={"ID":"7a8bb6ec-d022-43e4-92ee-b5aa59710619","Type":"ContainerStarted","Data":"39bea455ecf5bc23c57f1fbf82a5d16f35b653a6bb4a160b416b05ccd6564493"} Apr 16 20:05:48.966190 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:48.965957 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" Apr 16 20:05:48.968434 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:48.968412 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" Apr 16 20:05:48.990783 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:48.990723 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" podStartSLOduration=1.016821462 podStartE2EDuration="37.990709318s" podCreationTimestamp="2026-04-16 20:05:11 +0000 UTC" firstStartedPulling="2026-04-16 20:05:11.574732719 +0000 UTC m=+675.775442775" lastFinishedPulling="2026-04-16 20:05:48.548620575 +0000 UTC m=+712.749330631" observedRunningTime="2026-04-16 20:05:48.987201149 +0000 UTC m=+713.187911230" watchObservedRunningTime="2026-04-16 20:05:48.990709318 +0000 UTC m=+713.191419396" Apr 16 20:05:51.435387 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:51.435355 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" Apr 16 20:05:51.435387 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:05:51.435390 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" Apr 16 20:06:01.436321 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:06:01.436288 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" Apr 16 20:06:01.437552 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:06:01.437532 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" Apr 16 20:07:15.926733 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:15.926702 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6"] Apr 16 20:07:15.929509 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:15.929482 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:15.932385 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:15.932350 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 16 20:07:15.932492 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:15.932468 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-mb7qk\"" Apr 16 20:07:15.941852 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:15.941829 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6"] Apr 16 20:07:16.075593 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.075557 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.075768 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.075602 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klbpt\" (UniqueName: \"kubernetes.io/projected/524d7681-e08f-4c42-9129-8bac7ade9924-kube-api-access-klbpt\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.075768 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.075716 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.075846 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.075769 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.075846 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.075806 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/524d7681-e08f-4c42-9129-8bac7ade9924-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.075846 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.075833 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.178569 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.177412 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.178569 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.177496 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klbpt\" (UniqueName: \"kubernetes.io/projected/524d7681-e08f-4c42-9129-8bac7ade9924-kube-api-access-klbpt\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.178569 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.177574 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.178569 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.177641 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.178569 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.177697 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/524d7681-e08f-4c42-9129-8bac7ade9924-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.178569 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.177732 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.178569 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.177763 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.178569 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.178144 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.179522 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.179489 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.179754 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.179733 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.184859 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.184835 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/524d7681-e08f-4c42-9129-8bac7ade9924-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.188406 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.188382 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klbpt\" (UniqueName: \"kubernetes.io/projected/524d7681-e08f-4c42-9129-8bac7ade9924-kube-api-access-klbpt\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.239838 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.239815 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:16.366764 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:16.366737 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6"] Apr 16 20:07:16.368698 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:07:16.368663 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod524d7681_e08f_4c42_9129_8bac7ade9924.slice/crio-f7e595c0a894fb09ee33aeb955664f0637e3ba42b22a29f3d870ad653dc7fa35 WatchSource:0}: Error finding container f7e595c0a894fb09ee33aeb955664f0637e3ba42b22a29f3d870ad653dc7fa35: Status 404 returned error can't find the container with id f7e595c0a894fb09ee33aeb955664f0637e3ba42b22a29f3d870ad653dc7fa35 Apr 16 20:07:17.300555 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:17.300523 2569 generic.go:358] "Generic (PLEG): container finished" podID="524d7681-e08f-4c42-9129-8bac7ade9924" containerID="b18f2258a9666a46397cd1db45a09bb98189bc85f7be3aaa0752eea17a2a2195" exitCode=0 Apr 16 20:07:17.300901 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:17.300613 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" event={"ID":"524d7681-e08f-4c42-9129-8bac7ade9924","Type":"ContainerDied","Data":"b18f2258a9666a46397cd1db45a09bb98189bc85f7be3aaa0752eea17a2a2195"} Apr 16 20:07:17.300901 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:17.300652 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" event={"ID":"524d7681-e08f-4c42-9129-8bac7ade9924","Type":"ContainerStarted","Data":"f7e595c0a894fb09ee33aeb955664f0637e3ba42b22a29f3d870ad653dc7fa35"} Apr 16 20:07:18.306406 ip-10-0-138-142 kubenswrapper[2569]: I0416 
20:07:18.306372 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" event={"ID":"524d7681-e08f-4c42-9129-8bac7ade9924","Type":"ContainerStarted","Data":"94cc855fbaf19ee9f1d36a1349f6e1ea8132f68977eaf5d9c8c7744ea56dc860"} Apr 16 20:07:18.306406 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:18.306407 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" event={"ID":"524d7681-e08f-4c42-9129-8bac7ade9924","Type":"ContainerStarted","Data":"93704e47e60de3394d2baa503ae8a1f601ecc30a7ae730bd626adcf599b4c3c2"} Apr 16 20:07:18.306869 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:18.306496 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:18.327058 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:18.326996 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" podStartSLOduration=3.326981054 podStartE2EDuration="3.326981054s" podCreationTimestamp="2026-04-16 20:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:07:18.325114126 +0000 UTC m=+802.525824203" watchObservedRunningTime="2026-04-16 20:07:18.326981054 +0000 UTC m=+802.527691131" Apr 16 20:07:26.240949 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:26.240903 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:26.241445 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:26.241032 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:26.243831 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:26.243806 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:26.339445 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:26.339416 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:07:48.354177 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:07:48.354147 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:08:56.324671 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:08:56.324583 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 20:08:56.328438 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:08:56.328417 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 20:10:21.624946 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:21.624914 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6"] Apr 16 20:10:21.625507 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:21.625239 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" podUID="524d7681-e08f-4c42-9129-8bac7ade9924" containerName="main" containerID="cri-o://93704e47e60de3394d2baa503ae8a1f601ecc30a7ae730bd626adcf599b4c3c2" gracePeriod=30 Apr 16 20:10:21.625507 
ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:21.625298 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" podUID="524d7681-e08f-4c42-9129-8bac7ade9924" containerName="tokenizer" containerID="cri-o://94cc855fbaf19ee9f1d36a1349f6e1ea8132f68977eaf5d9c8c7744ea56dc860" gracePeriod=30 Apr 16 20:10:22.047569 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.047532 2569 generic.go:358] "Generic (PLEG): container finished" podID="524d7681-e08f-4c42-9129-8bac7ade9924" containerID="93704e47e60de3394d2baa503ae8a1f601ecc30a7ae730bd626adcf599b4c3c2" exitCode=0 Apr 16 20:10:22.047738 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.047607 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" event={"ID":"524d7681-e08f-4c42-9129-8bac7ade9924","Type":"ContainerDied","Data":"93704e47e60de3394d2baa503ae8a1f601ecc30a7ae730bd626adcf599b4c3c2"} Apr 16 20:10:22.887774 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.887750 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" Apr 16 20:10:22.947737 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.947636 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-tmp\") pod \"524d7681-e08f-4c42-9129-8bac7ade9924\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " Apr 16 20:10:22.947737 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.947692 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/524d7681-e08f-4c42-9129-8bac7ade9924-tls-certs\") pod \"524d7681-e08f-4c42-9129-8bac7ade9924\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " Apr 16 20:10:22.947967 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.947750 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-kserve-provision-location\") pod \"524d7681-e08f-4c42-9129-8bac7ade9924\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " Apr 16 20:10:22.947967 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.947788 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klbpt\" (UniqueName: \"kubernetes.io/projected/524d7681-e08f-4c42-9129-8bac7ade9924-kube-api-access-klbpt\") pod \"524d7681-e08f-4c42-9129-8bac7ade9924\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " Apr 16 20:10:22.947967 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.947831 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-cache\") pod \"524d7681-e08f-4c42-9129-8bac7ade9924\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") 
" Apr 16 20:10:22.947967 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.947886 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-uds\") pod \"524d7681-e08f-4c42-9129-8bac7ade9924\" (UID: \"524d7681-e08f-4c42-9129-8bac7ade9924\") " Apr 16 20:10:22.948198 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.948072 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "524d7681-e08f-4c42-9129-8bac7ade9924" (UID: "524d7681-e08f-4c42-9129-8bac7ade9924"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:10:22.948198 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.948154 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "524d7681-e08f-4c42-9129-8bac7ade9924" (UID: "524d7681-e08f-4c42-9129-8bac7ade9924"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:10:22.948309 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.948249 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-tmp\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:10:22.948309 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.948270 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:10:22.948404 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.948352 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "524d7681-e08f-4c42-9129-8bac7ade9924" (UID: "524d7681-e08f-4c42-9129-8bac7ade9924"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:10:22.948529 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.948507 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "524d7681-e08f-4c42-9129-8bac7ade9924" (UID: "524d7681-e08f-4c42-9129-8bac7ade9924"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:10:22.950134 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.950097 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/524d7681-e08f-4c42-9129-8bac7ade9924-kube-api-access-klbpt" (OuterVolumeSpecName: "kube-api-access-klbpt") pod "524d7681-e08f-4c42-9129-8bac7ade9924" (UID: "524d7681-e08f-4c42-9129-8bac7ade9924"). InnerVolumeSpecName "kube-api-access-klbpt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:10:22.950238 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:22.950148 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524d7681-e08f-4c42-9129-8bac7ade9924-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "524d7681-e08f-4c42-9129-8bac7ade9924" (UID: "524d7681-e08f-4c42-9129-8bac7ade9924"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:10:23.049299 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.049271 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:10:23.049299 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.049298 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-klbpt\" (UniqueName: \"kubernetes.io/projected/524d7681-e08f-4c42-9129-8bac7ade9924-kube-api-access-klbpt\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:10:23.049513 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.049310 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/524d7681-e08f-4c42-9129-8bac7ade9924-tokenizer-uds\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:10:23.049513 ip-10-0-138-142 
kubenswrapper[2569]: I0416 20:10:23.049321 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/524d7681-e08f-4c42-9129-8bac7ade9924-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:10:23.052673 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.052645 2569 generic.go:358] "Generic (PLEG): container finished" podID="524d7681-e08f-4c42-9129-8bac7ade9924" containerID="94cc855fbaf19ee9f1d36a1349f6e1ea8132f68977eaf5d9c8c7744ea56dc860" exitCode=0
Apr 16 20:10:23.052809 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.052711 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6"
Apr 16 20:10:23.052809 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.052763 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" event={"ID":"524d7681-e08f-4c42-9129-8bac7ade9924","Type":"ContainerDied","Data":"94cc855fbaf19ee9f1d36a1349f6e1ea8132f68977eaf5d9c8c7744ea56dc860"}
Apr 16 20:10:23.052809 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.052802 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6" event={"ID":"524d7681-e08f-4c42-9129-8bac7ade9924","Type":"ContainerDied","Data":"f7e595c0a894fb09ee33aeb955664f0637e3ba42b22a29f3d870ad653dc7fa35"}
Apr 16 20:10:23.052990 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.052819 2569 scope.go:117] "RemoveContainer" containerID="94cc855fbaf19ee9f1d36a1349f6e1ea8132f68977eaf5d9c8c7744ea56dc860"
Apr 16 20:10:23.062416 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.062398 2569 scope.go:117] "RemoveContainer" containerID="93704e47e60de3394d2baa503ae8a1f601ecc30a7ae730bd626adcf599b4c3c2"
Apr 16 20:10:23.070245 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.070226 2569 scope.go:117] "RemoveContainer" containerID="b18f2258a9666a46397cd1db45a09bb98189bc85f7be3aaa0752eea17a2a2195"
Apr 16 20:10:23.075946 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.075926 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6"]
Apr 16 20:10:23.078897 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.078872 2569 scope.go:117] "RemoveContainer" containerID="94cc855fbaf19ee9f1d36a1349f6e1ea8132f68977eaf5d9c8c7744ea56dc860"
Apr 16 20:10:23.079201 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:10:23.079173 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94cc855fbaf19ee9f1d36a1349f6e1ea8132f68977eaf5d9c8c7744ea56dc860\": container with ID starting with 94cc855fbaf19ee9f1d36a1349f6e1ea8132f68977eaf5d9c8c7744ea56dc860 not found: ID does not exist" containerID="94cc855fbaf19ee9f1d36a1349f6e1ea8132f68977eaf5d9c8c7744ea56dc860"
Apr 16 20:10:23.079275 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.079212 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94cc855fbaf19ee9f1d36a1349f6e1ea8132f68977eaf5d9c8c7744ea56dc860"} err="failed to get container status \"94cc855fbaf19ee9f1d36a1349f6e1ea8132f68977eaf5d9c8c7744ea56dc860\": rpc error: code = NotFound desc = could not find container \"94cc855fbaf19ee9f1d36a1349f6e1ea8132f68977eaf5d9c8c7744ea56dc860\": container with ID starting with 94cc855fbaf19ee9f1d36a1349f6e1ea8132f68977eaf5d9c8c7744ea56dc860 not found: ID does not exist"
Apr 16 20:10:23.079275 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.079231 2569 scope.go:117] "RemoveContainer" containerID="93704e47e60de3394d2baa503ae8a1f601ecc30a7ae730bd626adcf599b4c3c2"
Apr 16 20:10:23.079491 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:10:23.079475 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93704e47e60de3394d2baa503ae8a1f601ecc30a7ae730bd626adcf599b4c3c2\": container with ID starting with 93704e47e60de3394d2baa503ae8a1f601ecc30a7ae730bd626adcf599b4c3c2 not found: ID does not exist" containerID="93704e47e60de3394d2baa503ae8a1f601ecc30a7ae730bd626adcf599b4c3c2"
Apr 16 20:10:23.079536 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.079495 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93704e47e60de3394d2baa503ae8a1f601ecc30a7ae730bd626adcf599b4c3c2"} err="failed to get container status \"93704e47e60de3394d2baa503ae8a1f601ecc30a7ae730bd626adcf599b4c3c2\": rpc error: code = NotFound desc = could not find container \"93704e47e60de3394d2baa503ae8a1f601ecc30a7ae730bd626adcf599b4c3c2\": container with ID starting with 93704e47e60de3394d2baa503ae8a1f601ecc30a7ae730bd626adcf599b4c3c2 not found: ID does not exist"
Apr 16 20:10:23.079536 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.079509 2569 scope.go:117] "RemoveContainer" containerID="b18f2258a9666a46397cd1db45a09bb98189bc85f7be3aaa0752eea17a2a2195"
Apr 16 20:10:23.079726 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:10:23.079711 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b18f2258a9666a46397cd1db45a09bb98189bc85f7be3aaa0752eea17a2a2195\": container with ID starting with b18f2258a9666a46397cd1db45a09bb98189bc85f7be3aaa0752eea17a2a2195 not found: ID does not exist" containerID="b18f2258a9666a46397cd1db45a09bb98189bc85f7be3aaa0752eea17a2a2195"
Apr 16 20:10:23.079770 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.079730 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18f2258a9666a46397cd1db45a09bb98189bc85f7be3aaa0752eea17a2a2195"} err="failed to get container status \"b18f2258a9666a46397cd1db45a09bb98189bc85f7be3aaa0752eea17a2a2195\": rpc error: code = NotFound desc = could not find container \"b18f2258a9666a46397cd1db45a09bb98189bc85f7be3aaa0752eea17a2a2195\": container with ID starting with b18f2258a9666a46397cd1db45a09bb98189bc85f7be3aaa0752eea17a2a2195 not found: ID does not exist"
Apr 16 20:10:23.080206 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:23.080187 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegkpz6"]
Apr 16 20:10:24.371822 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:24.371789 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="524d7681-e08f-4c42-9129-8bac7ade9924" path="/var/lib/kubelet/pods/524d7681-e08f-4c42-9129-8bac7ade9924/volumes"
Apr 16 20:10:25.478183 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.478143 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"]
Apr 16 20:10:25.478558 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.478516 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="524d7681-e08f-4c42-9129-8bac7ade9924" containerName="storage-initializer"
Apr 16 20:10:25.478558 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.478527 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="524d7681-e08f-4c42-9129-8bac7ade9924" containerName="storage-initializer"
Apr 16 20:10:25.478558 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.478540 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="524d7681-e08f-4c42-9129-8bac7ade9924" containerName="main"
Apr 16 20:10:25.478558 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.478546 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="524d7681-e08f-4c42-9129-8bac7ade9924" containerName="main"
Apr 16 20:10:25.478558 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.478554 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="524d7681-e08f-4c42-9129-8bac7ade9924" containerName="tokenizer"
Apr 16 20:10:25.478558 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.478561 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="524d7681-e08f-4c42-9129-8bac7ade9924" containerName="tokenizer"
Apr 16 20:10:25.478749 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.478617 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="524d7681-e08f-4c42-9129-8bac7ade9924" containerName="tokenizer"
Apr 16 20:10:25.478749 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.478626 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="524d7681-e08f-4c42-9129-8bac7ade9924" containerName="main"
Apr 16 20:10:25.484545 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.484522 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.486786 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.486760 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 16 20:10:25.486923 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.486791 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-b5pwn\""
Apr 16 20:10:25.496535 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.496509 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"]
Apr 16 20:10:25.570168 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.570091 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.570402 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.570287 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.570497 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.570461 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4fbg\" (UniqueName: \"kubernetes.io/projected/d2903c12-fe50-4525-b536-2a8053d8d81f-kube-api-access-w4fbg\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.570556 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.570509 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d2903c12-fe50-4525-b536-2a8053d8d81f-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.570609 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.570559 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.570667 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.570607 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.671763 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.671725 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4fbg\" (UniqueName: \"kubernetes.io/projected/d2903c12-fe50-4525-b536-2a8053d8d81f-kube-api-access-w4fbg\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.671763 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.671770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d2903c12-fe50-4525-b536-2a8053d8d81f-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.672067 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.672039 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.672162 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.672103 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.672162 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.672145 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.672268 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.672204 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.672448 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.672425 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.672530 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.672507 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.672588 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.672539 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.672588 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.672565 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.674528 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.674506 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d2903c12-fe50-4525-b536-2a8053d8d81f-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.680143 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.680126 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4fbg\" (UniqueName: \"kubernetes.io/projected/d2903c12-fe50-4525-b536-2a8053d8d81f-kube-api-access-w4fbg\") pod \"custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.797080 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.796987 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:25.929872 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.929824 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"]
Apr 16 20:10:25.932342 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:10:25.932312 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2903c12_fe50_4525_b536_2a8053d8d81f.slice/crio-cf140a7eca504c3874d626d1dcaefcab904d7ffe10ecbb5b7678cc7b414a943c WatchSource:0}: Error finding container cf140a7eca504c3874d626d1dcaefcab904d7ffe10ecbb5b7678cc7b414a943c: Status 404 returned error can't find the container with id cf140a7eca504c3874d626d1dcaefcab904d7ffe10ecbb5b7678cc7b414a943c
Apr 16 20:10:25.934310 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:25.934293 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:10:26.067399 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:26.067304 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d" event={"ID":"d2903c12-fe50-4525-b536-2a8053d8d81f","Type":"ContainerStarted","Data":"9b04a9837a6ed252ac6fa1d5f240a4d3ff6b2b2c3be9bd475349a59f0b305b23"}
Apr 16 20:10:26.067399 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:26.067348 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d" event={"ID":"d2903c12-fe50-4525-b536-2a8053d8d81f","Type":"ContainerStarted","Data":"cf140a7eca504c3874d626d1dcaefcab904d7ffe10ecbb5b7678cc7b414a943c"}
Apr 16 20:10:27.072750 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:27.072663 2569 generic.go:358] "Generic (PLEG): container finished" podID="d2903c12-fe50-4525-b536-2a8053d8d81f" containerID="9b04a9837a6ed252ac6fa1d5f240a4d3ff6b2b2c3be9bd475349a59f0b305b23" exitCode=0
Apr 16 20:10:27.073211 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:27.072748 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d" event={"ID":"d2903c12-fe50-4525-b536-2a8053d8d81f","Type":"ContainerDied","Data":"9b04a9837a6ed252ac6fa1d5f240a4d3ff6b2b2c3be9bd475349a59f0b305b23"}
Apr 16 20:10:28.079377 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:28.079341 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d" event={"ID":"d2903c12-fe50-4525-b536-2a8053d8d81f","Type":"ContainerStarted","Data":"33a619c200c229f271ee638f1a60a04a81c0a95b5cb5393b69f228a1b4989d77"}
Apr 16 20:10:28.079732 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:28.079385 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d" event={"ID":"d2903c12-fe50-4525-b536-2a8053d8d81f","Type":"ContainerStarted","Data":"858e78cc3dcf089e6f3ff879a71a9b718a4b7908108b63dee6acb3ffdcfbf48d"}
Apr 16 20:10:28.079732 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:28.079427 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:28.101608 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:28.101559 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d" podStartSLOduration=3.101546538 podStartE2EDuration="3.101546538s" podCreationTimestamp="2026-04-16 20:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:10:28.099573429 +0000 UTC m=+992.300283524" watchObservedRunningTime="2026-04-16 20:10:28.101546538 +0000 UTC m=+992.302256616"
Apr 16 20:10:35.797720 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:35.797680 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:35.798223 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:35.797736 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:35.800315 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:35.800287 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:36.118557 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:36.118465 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:10:57.123204 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:10:57.123171 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:12:11.669879 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:11.669799 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"]
Apr 16 20:12:11.670405 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:11.670130 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d" podUID="d2903c12-fe50-4525-b536-2a8053d8d81f" containerName="main" containerID="cri-o://858e78cc3dcf089e6f3ff879a71a9b718a4b7908108b63dee6acb3ffdcfbf48d" gracePeriod=30
Apr 16 20:12:11.670405 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:11.670155 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d" podUID="d2903c12-fe50-4525-b536-2a8053d8d81f" containerName="tokenizer" containerID="cri-o://33a619c200c229f271ee638f1a60a04a81c0a95b5cb5393b69f228a1b4989d77" gracePeriod=30
Apr 16 20:12:12.494922 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:12.494888 2569 generic.go:358] "Generic (PLEG): container finished" podID="d2903c12-fe50-4525-b536-2a8053d8d81f" containerID="858e78cc3dcf089e6f3ff879a71a9b718a4b7908108b63dee6acb3ffdcfbf48d" exitCode=0
Apr 16 20:12:12.495111 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:12.494931 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d" event={"ID":"d2903c12-fe50-4525-b536-2a8053d8d81f","Type":"ContainerDied","Data":"858e78cc3dcf089e6f3ff879a71a9b718a4b7908108b63dee6acb3ffdcfbf48d"}
Apr 16 20:12:12.927849 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:12.927825 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:12:13.015167 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.015133 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-cache\") pod \"d2903c12-fe50-4525-b536-2a8053d8d81f\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") "
Apr 16 20:12:13.015323 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.015176 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d2903c12-fe50-4525-b536-2a8053d8d81f-tls-certs\") pod \"d2903c12-fe50-4525-b536-2a8053d8d81f\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") "
Apr 16 20:12:13.015323 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.015223 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-uds\") pod \"d2903c12-fe50-4525-b536-2a8053d8d81f\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") "
Apr 16 20:12:13.015323 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.015256 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4fbg\" (UniqueName: \"kubernetes.io/projected/d2903c12-fe50-4525-b536-2a8053d8d81f-kube-api-access-w4fbg\") pod \"d2903c12-fe50-4525-b536-2a8053d8d81f\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") "
Apr 16 20:12:13.015323 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.015315 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-kserve-provision-location\") pod \"d2903c12-fe50-4525-b536-2a8053d8d81f\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") "
Apr 16 20:12:13.015525 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.015342 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-tmp\") pod \"d2903c12-fe50-4525-b536-2a8053d8d81f\" (UID: \"d2903c12-fe50-4525-b536-2a8053d8d81f\") "
Apr 16 20:12:13.015525 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.015401 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d2903c12-fe50-4525-b536-2a8053d8d81f" (UID: "d2903c12-fe50-4525-b536-2a8053d8d81f"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:12:13.015525 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.015471 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d2903c12-fe50-4525-b536-2a8053d8d81f" (UID: "d2903c12-fe50-4525-b536-2a8053d8d81f"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:12:13.017775 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.016235 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d2903c12-fe50-4525-b536-2a8053d8d81f" (UID: "d2903c12-fe50-4525-b536-2a8053d8d81f"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:12:13.017775 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.016298 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:13.017775 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.016327 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-uds\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:13.017775 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.016506 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d2903c12-fe50-4525-b536-2a8053d8d81f" (UID: "d2903c12-fe50-4525-b536-2a8053d8d81f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:12:13.021064 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.018258 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2903c12-fe50-4525-b536-2a8053d8d81f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d2903c12-fe50-4525-b536-2a8053d8d81f" (UID: "d2903c12-fe50-4525-b536-2a8053d8d81f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:12:13.023812 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.023785 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2903c12-fe50-4525-b536-2a8053d8d81f-kube-api-access-w4fbg" (OuterVolumeSpecName: "kube-api-access-w4fbg") pod "d2903c12-fe50-4525-b536-2a8053d8d81f" (UID: "d2903c12-fe50-4525-b536-2a8053d8d81f"). InnerVolumeSpecName "kube-api-access-w4fbg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:12:13.117450 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.117355 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:13.117450 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.117388 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d2903c12-fe50-4525-b536-2a8053d8d81f-tokenizer-tmp\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:13.117450 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.117407 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d2903c12-fe50-4525-b536-2a8053d8d81f-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:13.117450 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.117417 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w4fbg\" (UniqueName: \"kubernetes.io/projected/d2903c12-fe50-4525-b536-2a8053d8d81f-kube-api-access-w4fbg\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:13.500240 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.500196 2569 generic.go:358] "Generic (PLEG): container finished" podID="d2903c12-fe50-4525-b536-2a8053d8d81f" containerID="33a619c200c229f271ee638f1a60a04a81c0a95b5cb5393b69f228a1b4989d77" exitCode=0
Apr 16 20:12:13.500411 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.500285 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d" event={"ID":"d2903c12-fe50-4525-b536-2a8053d8d81f","Type":"ContainerDied","Data":"33a619c200c229f271ee638f1a60a04a81c0a95b5cb5393b69f228a1b4989d77"}
Apr 16 20:12:13.500411 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.500325 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d" event={"ID":"d2903c12-fe50-4525-b536-2a8053d8d81f","Type":"ContainerDied","Data":"cf140a7eca504c3874d626d1dcaefcab904d7ffe10ecbb5b7678cc7b414a943c"}
Apr 16 20:12:13.500411 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.500294 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"
Apr 16 20:12:13.500411 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.500346 2569 scope.go:117] "RemoveContainer" containerID="33a619c200c229f271ee638f1a60a04a81c0a95b5cb5393b69f228a1b4989d77"
Apr 16 20:12:13.509650 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.509628 2569 scope.go:117] "RemoveContainer" containerID="858e78cc3dcf089e6f3ff879a71a9b718a4b7908108b63dee6acb3ffdcfbf48d"
Apr 16 20:12:13.517610 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.517591 2569 scope.go:117] "RemoveContainer" containerID="9b04a9837a6ed252ac6fa1d5f240a4d3ff6b2b2c3be9bd475349a59f0b305b23"
Apr 16 20:12:13.523186 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.523159 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"]
Apr 16 20:12:13.526642 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.526618 2569 scope.go:117] "RemoveContainer" containerID="33a619c200c229f271ee638f1a60a04a81c0a95b5cb5393b69f228a1b4989d77"
Apr 16 20:12:13.526718 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.526669 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-f4f9b7b-v8q4d"]
Apr 16 20:12:13.526907 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:12:13.526891 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33a619c200c229f271ee638f1a60a04a81c0a95b5cb5393b69f228a1b4989d77\": container with ID starting with 33a619c200c229f271ee638f1a60a04a81c0a95b5cb5393b69f228a1b4989d77 not found: ID does not exist" containerID="33a619c200c229f271ee638f1a60a04a81c0a95b5cb5393b69f228a1b4989d77"
Apr 16 20:12:13.526973 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.526919 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33a619c200c229f271ee638f1a60a04a81c0a95b5cb5393b69f228a1b4989d77"} err="failed to get container status \"33a619c200c229f271ee638f1a60a04a81c0a95b5cb5393b69f228a1b4989d77\": rpc error: code = NotFound desc = could not find container \"33a619c200c229f271ee638f1a60a04a81c0a95b5cb5393b69f228a1b4989d77\": container with ID starting with 33a619c200c229f271ee638f1a60a04a81c0a95b5cb5393b69f228a1b4989d77 not found: ID does not exist"
Apr 16 20:12:13.526973 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.526946 2569 scope.go:117] "RemoveContainer" containerID="858e78cc3dcf089e6f3ff879a71a9b718a4b7908108b63dee6acb3ffdcfbf48d"
Apr 16 20:12:13.527240 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:12:13.527219 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"858e78cc3dcf089e6f3ff879a71a9b718a4b7908108b63dee6acb3ffdcfbf48d\": container with ID starting with 858e78cc3dcf089e6f3ff879a71a9b718a4b7908108b63dee6acb3ffdcfbf48d not found: ID does not exist" containerID="858e78cc3dcf089e6f3ff879a71a9b718a4b7908108b63dee6acb3ffdcfbf48d"
Apr 16 20:12:13.527282 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.527246 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"858e78cc3dcf089e6f3ff879a71a9b718a4b7908108b63dee6acb3ffdcfbf48d"} err="failed to get container status \"858e78cc3dcf089e6f3ff879a71a9b718a4b7908108b63dee6acb3ffdcfbf48d\": rpc error: code = NotFound desc = could not find container \"858e78cc3dcf089e6f3ff879a71a9b718a4b7908108b63dee6acb3ffdcfbf48d\": container with ID starting with 858e78cc3dcf089e6f3ff879a71a9b718a4b7908108b63dee6acb3ffdcfbf48d not found: ID does not exist"
Apr 16 20:12:13.527282 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.527262 2569 scope.go:117] "RemoveContainer" containerID="9b04a9837a6ed252ac6fa1d5f240a4d3ff6b2b2c3be9bd475349a59f0b305b23"
Apr 16 20:12:13.527456 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:12:13.527437 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b04a9837a6ed252ac6fa1d5f240a4d3ff6b2b2c3be9bd475349a59f0b305b23\": container with ID starting with 9b04a9837a6ed252ac6fa1d5f240a4d3ff6b2b2c3be9bd475349a59f0b305b23 not found: ID does not exist" containerID="9b04a9837a6ed252ac6fa1d5f240a4d3ff6b2b2c3be9bd475349a59f0b305b23"
Apr 16 20:12:13.527499 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:13.527460 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b04a9837a6ed252ac6fa1d5f240a4d3ff6b2b2c3be9bd475349a59f0b305b23"} err="failed to get container status \"9b04a9837a6ed252ac6fa1d5f240a4d3ff6b2b2c3be9bd475349a59f0b305b23\": rpc error: code = NotFound desc = could not find container \"9b04a9837a6ed252ac6fa1d5f240a4d3ff6b2b2c3be9bd475349a59f0b305b23\": container with ID starting with 
9b04a9837a6ed252ac6fa1d5f240a4d3ff6b2b2c3be9bd475349a59f0b305b23 not found: ID does not exist" Apr 16 20:12:14.372675 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:14.372645 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2903c12-fe50-4525-b536-2a8053d8d81f" path="/var/lib/kubelet/pods/d2903c12-fe50-4525-b536-2a8053d8d81f/volumes" Apr 16 20:12:17.794538 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.794499 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp"] Apr 16 20:12:17.794907 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.794876 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2903c12-fe50-4525-b536-2a8053d8d81f" containerName="tokenizer" Apr 16 20:12:17.794907 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.794888 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2903c12-fe50-4525-b536-2a8053d8d81f" containerName="tokenizer" Apr 16 20:12:17.794907 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.794898 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2903c12-fe50-4525-b536-2a8053d8d81f" containerName="storage-initializer" Apr 16 20:12:17.794907 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.794904 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2903c12-fe50-4525-b536-2a8053d8d81f" containerName="storage-initializer" Apr 16 20:12:17.795084 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.794923 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2903c12-fe50-4525-b536-2a8053d8d81f" containerName="main" Apr 16 20:12:17.795084 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.794929 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2903c12-fe50-4525-b536-2a8053d8d81f" containerName="main" Apr 16 20:12:17.795084 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.794986 2569 
memory_manager.go:356] "RemoveStaleState removing state" podUID="d2903c12-fe50-4525-b536-2a8053d8d81f" containerName="tokenizer" Apr 16 20:12:17.795084 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.794995 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2903c12-fe50-4525-b536-2a8053d8d81f" containerName="main" Apr 16 20:12:17.800067 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.800040 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.802449 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.802426 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-lkptg\"" Apr 16 20:12:17.802562 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.802464 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 20:12:17.809615 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.809587 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp"] Apr 16 20:12:17.862027 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.861982 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.862189 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.862044 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdgj7\" (UniqueName: 
\"kubernetes.io/projected/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-kube-api-access-bdgj7\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.862189 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.862070 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.862189 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.862099 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.862189 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.862170 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.862325 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.862211 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.962872 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.962829 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.963059 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.962914 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdgj7\" (UniqueName: \"kubernetes.io/projected/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-kube-api-access-bdgj7\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.963059 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.962962 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.963059 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.962998 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.963059 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.963053 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.963279 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.963096 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.963489 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.963466 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.963600 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.963488 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.963600 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.963547 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.963699 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.963624 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.965575 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.965555 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:17.971552 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:17.971519 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdgj7\" (UniqueName: \"kubernetes.io/projected/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-kube-api-access-bdgj7\") pod 
\"router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:18.111065 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:18.110968 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:18.246796 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:18.246766 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp"] Apr 16 20:12:18.248378 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:12:18.248347 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0705857_9508_4fa5_9ae2_8d6dbffa13ea.slice/crio-0b1804bdd418e1a9f3a501205fda15f465cdeb99e71caa34ff653bc5f21c843e WatchSource:0}: Error finding container 0b1804bdd418e1a9f3a501205fda15f465cdeb99e71caa34ff653bc5f21c843e: Status 404 returned error can't find the container with id 0b1804bdd418e1a9f3a501205fda15f465cdeb99e71caa34ff653bc5f21c843e Apr 16 20:12:18.525467 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:18.525431 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" event={"ID":"b0705857-9508-4fa5-9ae2-8d6dbffa13ea","Type":"ContainerStarted","Data":"818f2c8aa5a7218d90d624db7b05ecebebf08474b34b77d3936b7602eeb5723f"} Apr 16 20:12:18.525467 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:18.525473 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" event={"ID":"b0705857-9508-4fa5-9ae2-8d6dbffa13ea","Type":"ContainerStarted","Data":"0b1804bdd418e1a9f3a501205fda15f465cdeb99e71caa34ff653bc5f21c843e"} Apr 16 
20:12:19.231731 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.231692 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"] Apr 16 20:12:19.232203 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.232146 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" podUID="7a8bb6ec-d022-43e4-92ee-b5aa59710619" containerName="main" containerID="cri-o://f1b3fbe5bcb34bd052b4dfa7028ec1d690b8c22a2dc2a168cea68a909ecc46e1" gracePeriod=30 Apr 16 20:12:19.232267 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.232199 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" podUID="7a8bb6ec-d022-43e4-92ee-b5aa59710619" containerName="tokenizer" containerID="cri-o://39bea455ecf5bc23c57f1fbf82a5d16f35b653a6bb4a160b416b05ccd6564493" gracePeriod=30 Apr 16 20:12:19.234629 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.234603 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"] Apr 16 20:12:19.234904 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.234880 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh" podUID="8ea34e91-d306-41b1-a697-1c7c6df2a2dd" containerName="main" containerID="cri-o://8f25965c1f1f358ec29ee5937c85c454b1874363e23d6241163deb15968b02ed" gracePeriod=30 Apr 16 20:12:19.488668 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.488644 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh" Apr 16 20:12:19.532506 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.532464 2569 generic.go:358] "Generic (PLEG): container finished" podID="8ea34e91-d306-41b1-a697-1c7c6df2a2dd" containerID="8f25965c1f1f358ec29ee5937c85c454b1874363e23d6241163deb15968b02ed" exitCode=0 Apr 16 20:12:19.532705 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.532523 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh" event={"ID":"8ea34e91-d306-41b1-a697-1c7c6df2a2dd","Type":"ContainerDied","Data":"8f25965c1f1f358ec29ee5937c85c454b1874363e23d6241163deb15968b02ed"} Apr 16 20:12:19.532705 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.532558 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh" Apr 16 20:12:19.532705 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.532578 2569 scope.go:117] "RemoveContainer" containerID="8f25965c1f1f358ec29ee5937c85c454b1874363e23d6241163deb15968b02ed" Apr 16 20:12:19.532705 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.532565 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh" event={"ID":"8ea34e91-d306-41b1-a697-1c7c6df2a2dd","Type":"ContainerDied","Data":"62ec07ebe005448317701eede4d3badba65ce0d54717f7b6958aebfe8c410e19"} Apr 16 20:12:19.534801 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.534773 2569 generic.go:358] "Generic (PLEG): container finished" podID="7a8bb6ec-d022-43e4-92ee-b5aa59710619" containerID="f1b3fbe5bcb34bd052b4dfa7028ec1d690b8c22a2dc2a168cea68a909ecc46e1" exitCode=0 Apr 16 20:12:19.534925 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.534846 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" event={"ID":"7a8bb6ec-d022-43e4-92ee-b5aa59710619","Type":"ContainerDied","Data":"f1b3fbe5bcb34bd052b4dfa7028ec1d690b8c22a2dc2a168cea68a909ecc46e1"} Apr 16 20:12:19.536289 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.536267 2569 generic.go:358] "Generic (PLEG): container finished" podID="b0705857-9508-4fa5-9ae2-8d6dbffa13ea" containerID="818f2c8aa5a7218d90d624db7b05ecebebf08474b34b77d3936b7602eeb5723f" exitCode=0 Apr 16 20:12:19.536394 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.536355 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" event={"ID":"b0705857-9508-4fa5-9ae2-8d6dbffa13ea","Type":"ContainerDied","Data":"818f2c8aa5a7218d90d624db7b05ecebebf08474b34b77d3936b7602eeb5723f"} Apr 16 20:12:19.546144 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.546109 2569 scope.go:117] "RemoveContainer" containerID="2961d6f83368a8ad2abd53ff9eefea86371cf3979ee21ffd730d6420bff1ebd9" Apr 16 20:12:19.578089 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.578064 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-model-cache\") pod \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " Apr 16 20:12:19.578251 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.578163 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-dshm\") pod \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " Apr 16 20:12:19.578251 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.578207 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-kserve-provision-location\") pod \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " Apr 16 20:12:19.578357 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.578248 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-tls-certs\") pod \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " Apr 16 20:12:19.578357 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.578269 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-home\") pod \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " Apr 16 20:12:19.578357 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.578298 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c2nl\" (UniqueName: \"kubernetes.io/projected/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-kube-api-access-7c2nl\") pod \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\" (UID: \"8ea34e91-d306-41b1-a697-1c7c6df2a2dd\") " Apr 16 20:12:19.578357 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.578335 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-model-cache" (OuterVolumeSpecName: "model-cache") pod "8ea34e91-d306-41b1-a697-1c7c6df2a2dd" (UID: "8ea34e91-d306-41b1-a697-1c7c6df2a2dd"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:12:19.578689 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.578556 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-model-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:12:19.578689 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.578618 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-home" (OuterVolumeSpecName: "home") pod "8ea34e91-d306-41b1-a697-1c7c6df2a2dd" (UID: "8ea34e91-d306-41b1-a697-1c7c6df2a2dd"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:12:19.580470 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.580438 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-kube-api-access-7c2nl" (OuterVolumeSpecName: "kube-api-access-7c2nl") pod "8ea34e91-d306-41b1-a697-1c7c6df2a2dd" (UID: "8ea34e91-d306-41b1-a697-1c7c6df2a2dd"). InnerVolumeSpecName "kube-api-access-7c2nl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:12:19.580616 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.580600 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-dshm" (OuterVolumeSpecName: "dshm") pod "8ea34e91-d306-41b1-a697-1c7c6df2a2dd" (UID: "8ea34e91-d306-41b1-a697-1c7c6df2a2dd"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:12:19.580689 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.580669 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8ea34e91-d306-41b1-a697-1c7c6df2a2dd" (UID: "8ea34e91-d306-41b1-a697-1c7c6df2a2dd"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:12:19.612769 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.612742 2569 scope.go:117] "RemoveContainer" containerID="8f25965c1f1f358ec29ee5937c85c454b1874363e23d6241163deb15968b02ed" Apr 16 20:12:19.613151 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:12:19.613123 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f25965c1f1f358ec29ee5937c85c454b1874363e23d6241163deb15968b02ed\": container with ID starting with 8f25965c1f1f358ec29ee5937c85c454b1874363e23d6241163deb15968b02ed not found: ID does not exist" containerID="8f25965c1f1f358ec29ee5937c85c454b1874363e23d6241163deb15968b02ed" Apr 16 20:12:19.613291 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.613163 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f25965c1f1f358ec29ee5937c85c454b1874363e23d6241163deb15968b02ed"} err="failed to get container status \"8f25965c1f1f358ec29ee5937c85c454b1874363e23d6241163deb15968b02ed\": rpc error: code = NotFound desc = could not find container \"8f25965c1f1f358ec29ee5937c85c454b1874363e23d6241163deb15968b02ed\": container with ID starting with 8f25965c1f1f358ec29ee5937c85c454b1874363e23d6241163deb15968b02ed not found: ID does not exist" Apr 16 20:12:19.613291 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.613190 2569 scope.go:117] "RemoveContainer" containerID="2961d6f83368a8ad2abd53ff9eefea86371cf3979ee21ffd730d6420bff1ebd9" Apr 16 
20:12:19.613538 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:12:19.613501 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2961d6f83368a8ad2abd53ff9eefea86371cf3979ee21ffd730d6420bff1ebd9\": container with ID starting with 2961d6f83368a8ad2abd53ff9eefea86371cf3979ee21ffd730d6420bff1ebd9 not found: ID does not exist" containerID="2961d6f83368a8ad2abd53ff9eefea86371cf3979ee21ffd730d6420bff1ebd9"
Apr 16 20:12:19.613595 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.613547 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2961d6f83368a8ad2abd53ff9eefea86371cf3979ee21ffd730d6420bff1ebd9"} err="failed to get container status \"2961d6f83368a8ad2abd53ff9eefea86371cf3979ee21ffd730d6420bff1ebd9\": rpc error: code = NotFound desc = could not find container \"2961d6f83368a8ad2abd53ff9eefea86371cf3979ee21ffd730d6420bff1ebd9\": container with ID starting with 2961d6f83368a8ad2abd53ff9eefea86371cf3979ee21ffd730d6420bff1ebd9 not found: ID does not exist"
Apr 16 20:12:19.642510 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.642472 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8ea34e91-d306-41b1-a697-1c7c6df2a2dd" (UID: "8ea34e91-d306-41b1-a697-1c7c6df2a2dd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:12:19.679456 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.679430 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-dshm\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:19.679577 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.679460 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:19.679577 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.679474 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:19.679577 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.679491 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-home\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:19.679577 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.679506 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7c2nl\" (UniqueName: \"kubernetes.io/projected/8ea34e91-d306-41b1-a697-1c7c6df2a2dd-kube-api-access-7c2nl\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:19.873152 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.873005 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"]
Apr 16 20:12:19.876366 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:19.876336 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-55f9bc6cf5-hf2wh"]
Apr 16 20:12:20.372973 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.372937 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ea34e91-d306-41b1-a697-1c7c6df2a2dd" path="/var/lib/kubelet/pods/8ea34e91-d306-41b1-a697-1c7c6df2a2dd/volumes"
Apr 16 20:12:20.545349 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.545316 2569 generic.go:358] "Generic (PLEG): container finished" podID="7a8bb6ec-d022-43e4-92ee-b5aa59710619" containerID="39bea455ecf5bc23c57f1fbf82a5d16f35b653a6bb4a160b416b05ccd6564493" exitCode=0
Apr 16 20:12:20.545526 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.545412 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" event={"ID":"7a8bb6ec-d022-43e4-92ee-b5aa59710619","Type":"ContainerDied","Data":"39bea455ecf5bc23c57f1fbf82a5d16f35b653a6bb4a160b416b05ccd6564493"}
Apr 16 20:12:20.547630 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.547608 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" event={"ID":"b0705857-9508-4fa5-9ae2-8d6dbffa13ea","Type":"ContainerStarted","Data":"27932ae6ca71eee5f990bec293dc6ee6d251e68b31d7acbd29ae2d7f6a1985ab"}
Apr 16 20:12:20.547751 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.547637 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" event={"ID":"b0705857-9508-4fa5-9ae2-8d6dbffa13ea","Type":"ContainerStarted","Data":"fe96683206caef629bbac41822f8ac710eb62d4338c61a50cff1246a706e947c"}
Apr 16 20:12:20.547751 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.547701 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp"
Apr 16 20:12:20.570602 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.570551 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" podStartSLOduration=3.570538288 podStartE2EDuration="3.570538288s" podCreationTimestamp="2026-04-16 20:12:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:20.567517407 +0000 UTC m=+1104.768227486" watchObservedRunningTime="2026-04-16 20:12:20.570538288 +0000 UTC m=+1104.771248388"
Apr 16 20:12:20.595449 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.595425 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:12:20.688999 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.688964 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-kserve-provision-location\") pod \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") "
Apr 16 20:12:20.689222 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.689078 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-tmp\") pod \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") "
Apr 16 20:12:20.689222 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.689103 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s66bx\" (UniqueName: \"kubernetes.io/projected/7a8bb6ec-d022-43e4-92ee-b5aa59710619-kube-api-access-s66bx\") pod \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") "
Apr 16 20:12:20.689222 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.689119 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tls-certs\") pod \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") "
Apr 16 20:12:20.689222 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.689156 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-uds\") pod \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") "
Apr 16 20:12:20.689222 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.689179 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-cache\") pod \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\" (UID: \"7a8bb6ec-d022-43e4-92ee-b5aa59710619\") "
Apr 16 20:12:20.689501 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.689384 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "7a8bb6ec-d022-43e4-92ee-b5aa59710619" (UID: "7a8bb6ec-d022-43e4-92ee-b5aa59710619"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:12:20.689501 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.689393 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "7a8bb6ec-d022-43e4-92ee-b5aa59710619" (UID: "7a8bb6ec-d022-43e4-92ee-b5aa59710619"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:12:20.689613 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.689520 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-tmp\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:20.689613 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.689538 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-uds\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:20.689613 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.689553 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "7a8bb6ec-d022-43e4-92ee-b5aa59710619" (UID: "7a8bb6ec-d022-43e4-92ee-b5aa59710619"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:12:20.689936 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.689912 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7a8bb6ec-d022-43e4-92ee-b5aa59710619" (UID: "7a8bb6ec-d022-43e4-92ee-b5aa59710619"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:12:20.691362 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.691339 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8bb6ec-d022-43e4-92ee-b5aa59710619-kube-api-access-s66bx" (OuterVolumeSpecName: "kube-api-access-s66bx") pod "7a8bb6ec-d022-43e4-92ee-b5aa59710619" (UID: "7a8bb6ec-d022-43e4-92ee-b5aa59710619"). InnerVolumeSpecName "kube-api-access-s66bx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:12:20.691622 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.691603 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7a8bb6ec-d022-43e4-92ee-b5aa59710619" (UID: "7a8bb6ec-d022-43e4-92ee-b5aa59710619"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:12:20.790563 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.790519 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tokenizer-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:20.790563 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.790551 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a8bb6ec-d022-43e4-92ee-b5aa59710619-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:20.790563 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.790562 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s66bx\" (UniqueName: \"kubernetes.io/projected/7a8bb6ec-d022-43e4-92ee-b5aa59710619-kube-api-access-s66bx\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:20.790563 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:20.790572 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7a8bb6ec-d022-43e4-92ee-b5aa59710619-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:12:21.555274 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:21.555239 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6" event={"ID":"7a8bb6ec-d022-43e4-92ee-b5aa59710619","Type":"ContainerDied","Data":"1889b55b7a3f85689d127001af5514ecb63ec062efbb7b70e092ade84640e9b6"}
Apr 16 20:12:21.555274 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:21.555257 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"
Apr 16 20:12:21.555762 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:21.555292 2569 scope.go:117] "RemoveContainer" containerID="39bea455ecf5bc23c57f1fbf82a5d16f35b653a6bb4a160b416b05ccd6564493"
Apr 16 20:12:21.564617 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:21.564598 2569 scope.go:117] "RemoveContainer" containerID="f1b3fbe5bcb34bd052b4dfa7028ec1d690b8c22a2dc2a168cea68a909ecc46e1"
Apr 16 20:12:21.572652 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:21.572629 2569 scope.go:117] "RemoveContainer" containerID="e111d465d48f654f9449885111d2f1e5941da3afba3f07b73e4c5a4300a10efa"
Apr 16 20:12:21.584577 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:21.584554 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"]
Apr 16 20:12:21.588343 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:21.588320 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-694b44cgf6"]
Apr 16 20:12:22.376241 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:22.374055 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8bb6ec-d022-43e4-92ee-b5aa59710619" path="/var/lib/kubelet/pods/7a8bb6ec-d022-43e4-92ee-b5aa59710619/volumes"
Apr 16 20:12:28.112181 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:28.112145 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp"
Apr 16 20:12:28.112649 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:28.112197 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp"
Apr 16 20:12:28.114956 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:28.114933 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp"
Apr 16 20:12:28.584984 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:28.584955 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp"
Apr 16 20:12:41.197038 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.196992 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"]
Apr 16 20:12:41.197501 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.197475 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ea34e91-d306-41b1-a697-1c7c6df2a2dd" containerName="main"
Apr 16 20:12:41.197501 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.197497 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea34e91-d306-41b1-a697-1c7c6df2a2dd" containerName="main"
Apr 16 20:12:41.197639 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.197513 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ea34e91-d306-41b1-a697-1c7c6df2a2dd" containerName="storage-initializer"
Apr 16 20:12:41.197639 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.197522 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea34e91-d306-41b1-a697-1c7c6df2a2dd" containerName="storage-initializer"
Apr 16 20:12:41.197639 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.197541 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a8bb6ec-d022-43e4-92ee-b5aa59710619" containerName="tokenizer"
Apr 16 20:12:41.197639 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.197549 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8bb6ec-d022-43e4-92ee-b5aa59710619" containerName="tokenizer"
Apr 16 20:12:41.197639 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.197567 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a8bb6ec-d022-43e4-92ee-b5aa59710619" containerName="main"
Apr 16 20:12:41.197639 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.197575 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8bb6ec-d022-43e4-92ee-b5aa59710619" containerName="main"
Apr 16 20:12:41.197639 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.197587 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a8bb6ec-d022-43e4-92ee-b5aa59710619" containerName="storage-initializer"
Apr 16 20:12:41.197639 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.197592 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8bb6ec-d022-43e4-92ee-b5aa59710619" containerName="storage-initializer"
Apr 16 20:12:41.197941 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.197661 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a8bb6ec-d022-43e4-92ee-b5aa59710619" containerName="main"
Apr 16 20:12:41.197941 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.197670 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ea34e91-d306-41b1-a697-1c7c6df2a2dd" containerName="main"
Apr 16 20:12:41.197941 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.197677 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a8bb6ec-d022-43e4-92ee-b5aa59710619" containerName="tokenizer"
Apr 16 20:12:41.202828 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.202806 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.207683 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.207664 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 16 20:12:41.217687 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.217667 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"]
Apr 16 20:12:41.270133 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.270103 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.270237 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.270138 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sdj4\" (UniqueName: \"kubernetes.io/projected/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-kube-api-access-4sdj4\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.270237 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.270162 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-home\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.270237 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.270178 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-dshm\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.270237 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.270196 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.270374 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.270249 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-model-cache\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.371355 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.371320 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.371355 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.371358 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sdj4\" (UniqueName: \"kubernetes.io/projected/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-kube-api-access-4sdj4\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.371543 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.371382 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-home\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.371543 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.371404 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-dshm\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.371543 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.371428 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.371543 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.371502 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-model-cache\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.371872 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.371848 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-home\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.371872 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.371860 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-model-cache\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.372001 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.371903 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.373652 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.373633 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-dshm\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.373924 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.373906 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.379322 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.379300 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sdj4\" (UniqueName: \"kubernetes.io/projected/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-kube-api-access-4sdj4\") pod \"scheduler-ha-replicas-test-kserve-78799b957f-w8s54\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.512858 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.512773 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"
Apr 16 20:12:41.519177 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.519143 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"]
Apr 16 20:12:41.524639 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.524618 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.526979 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.526959 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-2n5s5\""
Apr 16 20:12:41.536888 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.536862 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"]
Apr 16 20:12:41.574109 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.573141 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a3c86f60-b331-4276-b335-7841ba990022-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.574109 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.573197 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.574109 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.573245 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6ndk\" (UniqueName: \"kubernetes.io/projected/a3c86f60-b331-4276-b335-7841ba990022-kube-api-access-z6ndk\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.574109 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.573330 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.574109 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.573360 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.574109 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.573389 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.667612 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.667579 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"]
Apr 16 20:12:41.668607 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:12:41.668583 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebf19ac9_c0f0_4589_b86c_ca0efc7c77de.slice/crio-7f26450a574e36d35fa45b352aed2360f7606a9ea3c129da60f5c8bb663396b3 WatchSource:0}: Error finding container 7f26450a574e36d35fa45b352aed2360f7606a9ea3c129da60f5c8bb663396b3: Status 404 returned error can't find the container with id 7f26450a574e36d35fa45b352aed2360f7606a9ea3c129da60f5c8bb663396b3
Apr 16 20:12:41.673752 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.673732 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.673858 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.673761 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.673858 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.673784 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.673858 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.673828 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a3c86f60-b331-4276-b335-7841ba990022-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.673858 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.673847 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.674099 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.673884 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6ndk\" (UniqueName: \"kubernetes.io/projected/a3c86f60-b331-4276-b335-7841ba990022-kube-api-access-z6ndk\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.674177 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.674151 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.674343 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.674311 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.674434 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.674344 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.674490 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.674463 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.676756 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.676709 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a3c86f60-b331-4276-b335-7841ba990022-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.681409 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.681385 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6ndk\" (UniqueName: \"kubernetes.io/projected/a3c86f60-b331-4276-b335-7841ba990022-kube-api-access-z6ndk\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.848759 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.848672 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"
Apr 16 20:12:41.989893 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:41.983541 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"]
Apr 16 20:12:42.643413 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:42.643374 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" event={"ID":"a3c86f60-b331-4276-b335-7841ba990022","Type":"ContainerStarted","Data":"f43d3733f69e1378c749e6c27a7fc334b444c408da52ace0428bc97edb99694d"}
Apr 16 20:12:42.643413 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:42.643417 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" event={"ID":"a3c86f60-b331-4276-b335-7841ba990022","Type":"ContainerStarted","Data":"efe972d35d65b1aab093ac384223e171c1c0b7f980701761e81070e7f0fb3388"}
Apr 16 20:12:42.644926 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:42.644897 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54" event={"ID":"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de","Type":"ContainerStarted","Data":"d74ebaba4c7447e8a3e7b805dbc3a9de6874c444b0dfefae6189166ed759f243"}
Apr 16 20:12:42.645066 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:42.644930 2569
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54" event={"ID":"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de","Type":"ContainerStarted","Data":"7f26450a574e36d35fa45b352aed2360f7606a9ea3c129da60f5c8bb663396b3"} Apr 16 20:12:43.650524 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:43.650490 2569 generic.go:358] "Generic (PLEG): container finished" podID="a3c86f60-b331-4276-b335-7841ba990022" containerID="f43d3733f69e1378c749e6c27a7fc334b444c408da52ace0428bc97edb99694d" exitCode=0 Apr 16 20:12:43.650930 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:43.650577 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" event={"ID":"a3c86f60-b331-4276-b335-7841ba990022","Type":"ContainerDied","Data":"f43d3733f69e1378c749e6c27a7fc334b444c408da52ace0428bc97edb99694d"} Apr 16 20:12:44.657083 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:44.657035 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" event={"ID":"a3c86f60-b331-4276-b335-7841ba990022","Type":"ContainerStarted","Data":"394098c793ad3f537fb1df6773b3a43f15ca1f09c2f2f46fab3c54bfb1e6710f"} Apr 16 20:12:44.657083 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:44.657072 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" event={"ID":"a3c86f60-b331-4276-b335-7841ba990022","Type":"ContainerStarted","Data":"afe3095784c713b09c70164b5de0636acd94d481964f1d7c48d53e4501a258a3"} Apr 16 20:12:44.657774 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:44.657201 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" Apr 16 20:12:44.679693 ip-10-0-138-142 kubenswrapper[2569]: I0416 
20:12:44.679633 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" podStartSLOduration=3.679614416 podStartE2EDuration="3.679614416s" podCreationTimestamp="2026-04-16 20:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:44.677923657 +0000 UTC m=+1128.878633760" watchObservedRunningTime="2026-04-16 20:12:44.679614416 +0000 UTC m=+1128.880324488" Apr 16 20:12:46.666608 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:46.666572 2569 generic.go:358] "Generic (PLEG): container finished" podID="ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" containerID="d74ebaba4c7447e8a3e7b805dbc3a9de6874c444b0dfefae6189166ed759f243" exitCode=0 Apr 16 20:12:46.667056 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:46.666646 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54" event={"ID":"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de","Type":"ContainerDied","Data":"d74ebaba4c7447e8a3e7b805dbc3a9de6874c444b0dfefae6189166ed759f243"} Apr 16 20:12:47.672862 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:47.672827 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54" event={"ID":"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de","Type":"ContainerStarted","Data":"4c768ee39c173f237dc74300afaed1a895032b483b4bb4c9ec14ffa421098a7d"} Apr 16 20:12:47.694865 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:47.694803 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54" podStartSLOduration=6.69478299 podStartE2EDuration="6.69478299s" podCreationTimestamp="2026-04-16 20:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:47.690448937 +0000 UTC m=+1131.891159014" watchObservedRunningTime="2026-04-16 20:12:47.69478299 +0000 UTC m=+1131.895493081" Apr 16 20:12:49.593306 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:49.593257 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:12:51.512937 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:51.512907 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54" Apr 16 20:12:51.513356 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:51.512989 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54" Apr 16 20:12:51.525968 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:51.525937 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54" Apr 16 20:12:51.700080 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:51.700049 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54" Apr 16 20:12:51.849143 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:51.849057 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" Apr 16 20:12:51.849143 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:51.849103 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" Apr 16 20:12:51.851966 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:51.851940 2569 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" Apr 16 20:12:52.694223 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:12:52.694197 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" Apr 16 20:13:13.700264 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:13.700231 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" Apr 16 20:13:14.702222 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:14.702185 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"] Apr 16 20:13:14.702619 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:14.702471 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54" podUID="ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" containerName="main" containerID="cri-o://4c768ee39c173f237dc74300afaed1a895032b483b4bb4c9ec14ffa421098a7d" gracePeriod=30 Apr 16 20:13:14.711522 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:14.711495 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"] Apr 16 20:13:14.711902 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:14.711877 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" podUID="a3c86f60-b331-4276-b335-7841ba990022" containerName="main" containerID="cri-o://afe3095784c713b09c70164b5de0636acd94d481964f1d7c48d53e4501a258a3" gracePeriod=30 Apr 16 20:13:14.711987 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:14.711917 2569 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" podUID="a3c86f60-b331-4276-b335-7841ba990022" containerName="tokenizer" containerID="cri-o://394098c793ad3f537fb1df6773b3a43f15ca1f09c2f2f46fab3c54bfb1e6710f" gracePeriod=30 Apr 16 20:13:14.966537 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:14.966514 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54" Apr 16 20:13:15.087691 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.087608 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-kserve-provision-location\") pod \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " Apr 16 20:13:15.087862 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.087710 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-tls-certs\") pod \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " Apr 16 20:13:15.087862 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.087758 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sdj4\" (UniqueName: \"kubernetes.io/projected/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-kube-api-access-4sdj4\") pod \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " Apr 16 20:13:15.087862 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.087795 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-home\") pod 
\"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " Apr 16 20:13:15.087862 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.087820 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-dshm\") pod \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " Apr 16 20:13:15.087862 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.087843 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-model-cache\") pod \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\" (UID: \"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de\") " Apr 16 20:13:15.088177 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.088095 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-home" (OuterVolumeSpecName: "home") pod "ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" (UID: "ebf19ac9-c0f0-4589-b86c-ca0efc7c77de"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:13:15.088245 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.088190 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-home\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:13:15.088317 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.088293 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-model-cache" (OuterVolumeSpecName: "model-cache") pod "ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" (UID: "ebf19ac9-c0f0-4589-b86c-ca0efc7c77de"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:13:15.090105 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.090075 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" (UID: "ebf19ac9-c0f0-4589-b86c-ca0efc7c77de"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:13:15.090105 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.090100 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-kube-api-access-4sdj4" (OuterVolumeSpecName: "kube-api-access-4sdj4") pod "ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" (UID: "ebf19ac9-c0f0-4589-b86c-ca0efc7c77de"). InnerVolumeSpecName "kube-api-access-4sdj4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:13:15.090281 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.090227 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-dshm" (OuterVolumeSpecName: "dshm") pod "ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" (UID: "ebf19ac9-c0f0-4589-b86c-ca0efc7c77de"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:13:15.143338 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.143289 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" (UID: "ebf19ac9-c0f0-4589-b86c-ca0efc7c77de"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:13:15.189055 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.189006 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4sdj4\" (UniqueName: \"kubernetes.io/projected/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-kube-api-access-4sdj4\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:13:15.189055 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.189049 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-dshm\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:13:15.189269 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.189064 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-model-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:13:15.189269 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.189077 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:13:15.189269 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.189086 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:13:15.789072 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.789041 2569 generic.go:358] "Generic (PLEG): container finished" podID="a3c86f60-b331-4276-b335-7841ba990022" containerID="afe3095784c713b09c70164b5de0636acd94d481964f1d7c48d53e4501a258a3" exitCode=0 Apr 16 20:13:15.789441 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.789121 2569 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" event={"ID":"a3c86f60-b331-4276-b335-7841ba990022","Type":"ContainerDied","Data":"afe3095784c713b09c70164b5de0636acd94d481964f1d7c48d53e4501a258a3"} Apr 16 20:13:15.790563 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.790540 2569 generic.go:358] "Generic (PLEG): container finished" podID="ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" containerID="4c768ee39c173f237dc74300afaed1a895032b483b4bb4c9ec14ffa421098a7d" exitCode=0 Apr 16 20:13:15.790666 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.790616 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54" Apr 16 20:13:15.790712 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.790618 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54" event={"ID":"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de","Type":"ContainerDied","Data":"4c768ee39c173f237dc74300afaed1a895032b483b4bb4c9ec14ffa421098a7d"} Apr 16 20:13:15.790749 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.790710 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54" event={"ID":"ebf19ac9-c0f0-4589-b86c-ca0efc7c77de","Type":"ContainerDied","Data":"7f26450a574e36d35fa45b352aed2360f7606a9ea3c129da60f5c8bb663396b3"} Apr 16 20:13:15.790749 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.790725 2569 scope.go:117] "RemoveContainer" containerID="4c768ee39c173f237dc74300afaed1a895032b483b4bb4c9ec14ffa421098a7d" Apr 16 20:13:15.824610 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.824593 2569 scope.go:117] "RemoveContainer" containerID="d74ebaba4c7447e8a3e7b805dbc3a9de6874c444b0dfefae6189166ed759f243" Apr 16 20:13:15.834704 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.834683 2569 
scope.go:117] "RemoveContainer" containerID="4c768ee39c173f237dc74300afaed1a895032b483b4bb4c9ec14ffa421098a7d" Apr 16 20:13:15.834971 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:13:15.834942 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c768ee39c173f237dc74300afaed1a895032b483b4bb4c9ec14ffa421098a7d\": container with ID starting with 4c768ee39c173f237dc74300afaed1a895032b483b4bb4c9ec14ffa421098a7d not found: ID does not exist" containerID="4c768ee39c173f237dc74300afaed1a895032b483b4bb4c9ec14ffa421098a7d" Apr 16 20:13:15.835090 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.834983 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c768ee39c173f237dc74300afaed1a895032b483b4bb4c9ec14ffa421098a7d"} err="failed to get container status \"4c768ee39c173f237dc74300afaed1a895032b483b4bb4c9ec14ffa421098a7d\": rpc error: code = NotFound desc = could not find container \"4c768ee39c173f237dc74300afaed1a895032b483b4bb4c9ec14ffa421098a7d\": container with ID starting with 4c768ee39c173f237dc74300afaed1a895032b483b4bb4c9ec14ffa421098a7d not found: ID does not exist" Apr 16 20:13:15.835146 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.835126 2569 scope.go:117] "RemoveContainer" containerID="d74ebaba4c7447e8a3e7b805dbc3a9de6874c444b0dfefae6189166ed759f243" Apr 16 20:13:15.835397 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:13:15.835373 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74ebaba4c7447e8a3e7b805dbc3a9de6874c444b0dfefae6189166ed759f243\": container with ID starting with d74ebaba4c7447e8a3e7b805dbc3a9de6874c444b0dfefae6189166ed759f243 not found: ID does not exist" containerID="d74ebaba4c7447e8a3e7b805dbc3a9de6874c444b0dfefae6189166ed759f243" Apr 16 20:13:15.835496 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.835403 2569 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74ebaba4c7447e8a3e7b805dbc3a9de6874c444b0dfefae6189166ed759f243"} err="failed to get container status \"d74ebaba4c7447e8a3e7b805dbc3a9de6874c444b0dfefae6189166ed759f243\": rpc error: code = NotFound desc = could not find container \"d74ebaba4c7447e8a3e7b805dbc3a9de6874c444b0dfefae6189166ed759f243\": container with ID starting with d74ebaba4c7447e8a3e7b805dbc3a9de6874c444b0dfefae6189166ed759f243 not found: ID does not exist" Apr 16 20:13:15.835905 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.835884 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"] Apr 16 20:13:15.839257 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.839235 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-78799b957f-w8s54"] Apr 16 20:13:15.969756 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:15.969734 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" Apr 16 20:13:16.097349 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.097253 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-kserve-provision-location\") pod \"a3c86f60-b331-4276-b335-7841ba990022\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " Apr 16 20:13:16.097349 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.097302 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-uds\") pod \"a3c86f60-b331-4276-b335-7841ba990022\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " Apr 16 20:13:16.097349 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.097349 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-tmp\") pod \"a3c86f60-b331-4276-b335-7841ba990022\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " Apr 16 20:13:16.097624 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.097367 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-cache\") pod \"a3c86f60-b331-4276-b335-7841ba990022\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " Apr 16 20:13:16.097624 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.097391 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6ndk\" (UniqueName: \"kubernetes.io/projected/a3c86f60-b331-4276-b335-7841ba990022-kube-api-access-z6ndk\") pod \"a3c86f60-b331-4276-b335-7841ba990022\" (UID: 
\"a3c86f60-b331-4276-b335-7841ba990022\") " Apr 16 20:13:16.097624 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.097439 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a3c86f60-b331-4276-b335-7841ba990022-tls-certs\") pod \"a3c86f60-b331-4276-b335-7841ba990022\" (UID: \"a3c86f60-b331-4276-b335-7841ba990022\") " Apr 16 20:13:16.097790 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.097598 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "a3c86f60-b331-4276-b335-7841ba990022" (UID: "a3c86f60-b331-4276-b335-7841ba990022"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:13:16.097790 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.097613 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "a3c86f60-b331-4276-b335-7841ba990022" (UID: "a3c86f60-b331-4276-b335-7841ba990022"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:13:16.097790 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.097723 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "a3c86f60-b331-4276-b335-7841ba990022" (UID: "a3c86f60-b331-4276-b335-7841ba990022"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:13:16.097790 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.097737 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:13:16.097790 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.097758 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-uds\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:13:16.098004 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.097985 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a3c86f60-b331-4276-b335-7841ba990022" (UID: "a3c86f60-b331-4276-b335-7841ba990022"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:13:16.099518 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.099490 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c86f60-b331-4276-b335-7841ba990022-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a3c86f60-b331-4276-b335-7841ba990022" (UID: "a3c86f60-b331-4276-b335-7841ba990022"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:13:16.099575 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.099558 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c86f60-b331-4276-b335-7841ba990022-kube-api-access-z6ndk" (OuterVolumeSpecName: "kube-api-access-z6ndk") pod "a3c86f60-b331-4276-b335-7841ba990022" (UID: "a3c86f60-b331-4276-b335-7841ba990022"). 
InnerVolumeSpecName "kube-api-access-z6ndk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:13:16.198919 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.198877 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:13:16.198919 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.198910 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3c86f60-b331-4276-b335-7841ba990022-tokenizer-tmp\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:13:16.198919 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.198924 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z6ndk\" (UniqueName: \"kubernetes.io/projected/a3c86f60-b331-4276-b335-7841ba990022-kube-api-access-z6ndk\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:13:16.199199 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.198938 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a3c86f60-b331-4276-b335-7841ba990022-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:13:16.373842 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.373761 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" path="/var/lib/kubelet/pods/ebf19ac9-c0f0-4589-b86c-ca0efc7c77de/volumes" Apr 16 20:13:16.797134 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.797090 2569 generic.go:358] "Generic (PLEG): container finished" podID="a3c86f60-b331-4276-b335-7841ba990022" containerID="394098c793ad3f537fb1df6773b3a43f15ca1f09c2f2f46fab3c54bfb1e6710f" exitCode=0 Apr 16 20:13:16.797607 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.797154 
2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" event={"ID":"a3c86f60-b331-4276-b335-7841ba990022","Type":"ContainerDied","Data":"394098c793ad3f537fb1df6773b3a43f15ca1f09c2f2f46fab3c54bfb1e6710f"} Apr 16 20:13:16.797607 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.797195 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" event={"ID":"a3c86f60-b331-4276-b335-7841ba990022","Type":"ContainerDied","Data":"efe972d35d65b1aab093ac384223e171c1c0b7f980701761e81070e7f0fb3388"} Apr 16 20:13:16.797607 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.797167 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb" Apr 16 20:13:16.797607 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.797213 2569 scope.go:117] "RemoveContainer" containerID="394098c793ad3f537fb1df6773b3a43f15ca1f09c2f2f46fab3c54bfb1e6710f" Apr 16 20:13:16.805792 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.805771 2569 scope.go:117] "RemoveContainer" containerID="afe3095784c713b09c70164b5de0636acd94d481964f1d7c48d53e4501a258a3" Apr 16 20:13:16.813635 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.813613 2569 scope.go:117] "RemoveContainer" containerID="f43d3733f69e1378c749e6c27a7fc334b444c408da52ace0428bc97edb99694d" Apr 16 20:13:16.818230 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.818202 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"] Apr 16 20:13:16.823222 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.823199 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7qzmcb"] Apr 16 20:13:16.823798 
ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.823775 2569 scope.go:117] "RemoveContainer" containerID="394098c793ad3f537fb1df6773b3a43f15ca1f09c2f2f46fab3c54bfb1e6710f" Apr 16 20:13:16.824162 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:13:16.824131 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"394098c793ad3f537fb1df6773b3a43f15ca1f09c2f2f46fab3c54bfb1e6710f\": container with ID starting with 394098c793ad3f537fb1df6773b3a43f15ca1f09c2f2f46fab3c54bfb1e6710f not found: ID does not exist" containerID="394098c793ad3f537fb1df6773b3a43f15ca1f09c2f2f46fab3c54bfb1e6710f" Apr 16 20:13:16.824218 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.824171 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"394098c793ad3f537fb1df6773b3a43f15ca1f09c2f2f46fab3c54bfb1e6710f"} err="failed to get container status \"394098c793ad3f537fb1df6773b3a43f15ca1f09c2f2f46fab3c54bfb1e6710f\": rpc error: code = NotFound desc = could not find container \"394098c793ad3f537fb1df6773b3a43f15ca1f09c2f2f46fab3c54bfb1e6710f\": container with ID starting with 394098c793ad3f537fb1df6773b3a43f15ca1f09c2f2f46fab3c54bfb1e6710f not found: ID does not exist" Apr 16 20:13:16.824218 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.824192 2569 scope.go:117] "RemoveContainer" containerID="afe3095784c713b09c70164b5de0636acd94d481964f1d7c48d53e4501a258a3" Apr 16 20:13:16.824439 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:13:16.824419 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afe3095784c713b09c70164b5de0636acd94d481964f1d7c48d53e4501a258a3\": container with ID starting with afe3095784c713b09c70164b5de0636acd94d481964f1d7c48d53e4501a258a3 not found: ID does not exist" containerID="afe3095784c713b09c70164b5de0636acd94d481964f1d7c48d53e4501a258a3" Apr 16 20:13:16.824489 ip-10-0-138-142 
kubenswrapper[2569]: I0416 20:13:16.824445 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afe3095784c713b09c70164b5de0636acd94d481964f1d7c48d53e4501a258a3"} err="failed to get container status \"afe3095784c713b09c70164b5de0636acd94d481964f1d7c48d53e4501a258a3\": rpc error: code = NotFound desc = could not find container \"afe3095784c713b09c70164b5de0636acd94d481964f1d7c48d53e4501a258a3\": container with ID starting with afe3095784c713b09c70164b5de0636acd94d481964f1d7c48d53e4501a258a3 not found: ID does not exist" Apr 16 20:13:16.824489 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.824466 2569 scope.go:117] "RemoveContainer" containerID="f43d3733f69e1378c749e6c27a7fc334b444c408da52ace0428bc97edb99694d" Apr 16 20:13:16.824678 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:13:16.824662 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f43d3733f69e1378c749e6c27a7fc334b444c408da52ace0428bc97edb99694d\": container with ID starting with f43d3733f69e1378c749e6c27a7fc334b444c408da52ace0428bc97edb99694d not found: ID does not exist" containerID="f43d3733f69e1378c749e6c27a7fc334b444c408da52ace0428bc97edb99694d" Apr 16 20:13:16.824719 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:16.824682 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f43d3733f69e1378c749e6c27a7fc334b444c408da52ace0428bc97edb99694d"} err="failed to get container status \"f43d3733f69e1378c749e6c27a7fc334b444c408da52ace0428bc97edb99694d\": rpc error: code = NotFound desc = could not find container \"f43d3733f69e1378c749e6c27a7fc334b444c408da52ace0428bc97edb99694d\": container with ID starting with f43d3733f69e1378c749e6c27a7fc334b444c408da52ace0428bc97edb99694d not found: ID does not exist" Apr 16 20:13:18.372069 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:18.372033 2569 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="a3c86f60-b331-4276-b335-7841ba990022" path="/var/lib/kubelet/pods/a3c86f60-b331-4276-b335-7841ba990022/volumes" Apr 16 20:13:30.818108 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.818073 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k"] Apr 16 20:13:30.818584 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.818439 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3c86f60-b331-4276-b335-7841ba990022" containerName="storage-initializer" Apr 16 20:13:30.818584 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.818450 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c86f60-b331-4276-b335-7841ba990022" containerName="storage-initializer" Apr 16 20:13:30.818584 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.818463 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3c86f60-b331-4276-b335-7841ba990022" containerName="main" Apr 16 20:13:30.818584 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.818469 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c86f60-b331-4276-b335-7841ba990022" containerName="main" Apr 16 20:13:30.818584 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.818476 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3c86f60-b331-4276-b335-7841ba990022" containerName="tokenizer" Apr 16 20:13:30.818584 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.818482 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c86f60-b331-4276-b335-7841ba990022" containerName="tokenizer" Apr 16 20:13:30.818584 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.818493 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" containerName="main" Apr 16 20:13:30.818584 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.818499 2569 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" containerName="main" Apr 16 20:13:30.818584 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.818509 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" containerName="storage-initializer" Apr 16 20:13:30.818584 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.818514 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" containerName="storage-initializer" Apr 16 20:13:30.818584 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.818576 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3c86f60-b331-4276-b335-7841ba990022" containerName="main" Apr 16 20:13:30.818584 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.818585 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebf19ac9-c0f0-4589-b86c-ca0efc7c77de" containerName="main" Apr 16 20:13:30.818584 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.818592 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3c86f60-b331-4276-b335-7841ba990022" containerName="tokenizer" Apr 16 20:13:30.821242 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.821224 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:30.824285 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.824262 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 20:13:30.833699 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.833675 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k"] Apr 16 20:13:30.933077 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.933043 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-home\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:30.933288 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.933086 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:30.933288 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.933108 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-dshm\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:30.933288 ip-10-0-138-142 
kubenswrapper[2569]: I0416 20:13:30.933194 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkc6m\" (UniqueName: \"kubernetes.io/projected/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-kube-api-access-fkc6m\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:30.933288 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.933243 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-tls-certs\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:30.933288 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:30.933276 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-model-cache\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:31.033827 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.033788 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-home\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:31.034052 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.033839 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:31.034052 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.033857 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-dshm\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:31.034052 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.033891 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkc6m\" (UniqueName: \"kubernetes.io/projected/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-kube-api-access-fkc6m\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:31.034052 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.033913 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-tls-certs\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:31.034052 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.033934 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-model-cache\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" 
(UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:31.034646 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.034567 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-home\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:31.034646 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.034625 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-model-cache\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:31.034812 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.034693 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:31.036441 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.036416 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-dshm\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:31.036579 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.036559 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-tls-certs\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:31.042725 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.042700 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkc6m\" (UniqueName: \"kubernetes.io/projected/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-kube-api-access-fkc6m\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-v5r9k\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:31.098348 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.098254 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"] Apr 16 20:13:31.102372 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.102348 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.105406 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.105380 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-qgf2l\"" Apr 16 20:13:31.117409 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.117377 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"] Apr 16 20:13:31.132210 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.132182 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" Apr 16 20:13:31.134316 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.134299 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.134418 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.134359 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.134484 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.134442 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.134545 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.134485 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-uds\") pod 
\"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.134599 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.134575 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mllgq\" (UniqueName: \"kubernetes.io/projected/4f2da5bd-9a57-430a-bd75-88dce7df91b4-kube-api-access-mllgq\") pod \"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.134656 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.134613 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.236207 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.236168 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.236390 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.236305 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-cache\") pod 
\"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.236390 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.236345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.236390 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.236367 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.236583 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.236438 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mllgq\" (UniqueName: \"kubernetes.io/projected/4f2da5bd-9a57-430a-bd75-88dce7df91b4-kube-api-access-mllgq\") pod \"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.236583 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.236469 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tls-certs\") pod 
\"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.236702 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.236601 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.236702 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.236613 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.236815 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.236799 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.236924 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.236903 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-uds\") pod 
\"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.239097 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.239069 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.245281 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.245252 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mllgq\" (UniqueName: \"kubernetes.io/projected/4f2da5bd-9a57-430a-bd75-88dce7df91b4-kube-api-access-mllgq\") pod \"precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" Apr 16 20:13:31.263490 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.263463 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k"] Apr 16 20:13:31.265950 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:13:31.265918 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ea9ee45_249c_4b4a_abb8_eb6d8ae704dc.slice/crio-d9e138fb355f126756bc661dbff6460029149269d1b1d2192a655ac15cf5186e WatchSource:0}: Error finding container d9e138fb355f126756bc661dbff6460029149269d1b1d2192a655ac15cf5186e: Status 404 returned error can't find the container with id d9e138fb355f126756bc661dbff6460029149269d1b1d2192a655ac15cf5186e Apr 16 20:13:31.413176 ip-10-0-138-142 kubenswrapper[2569]: 
I0416 20:13:31.413136 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"
Apr 16 20:13:31.541548 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.541521 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"]
Apr 16 20:13:31.543147 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:13:31.543116 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f2da5bd_9a57_430a_bd75_88dce7df91b4.slice/crio-acc2a2ec8c1677afb6c7ebb8340681f6a7647b11c0708da94ef6f18aab5ba079 WatchSource:0}: Error finding container acc2a2ec8c1677afb6c7ebb8340681f6a7647b11c0708da94ef6f18aab5ba079: Status 404 returned error can't find the container with id acc2a2ec8c1677afb6c7ebb8340681f6a7647b11c0708da94ef6f18aab5ba079
Apr 16 20:13:31.865793 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.865698 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" event={"ID":"4f2da5bd-9a57-430a-bd75-88dce7df91b4","Type":"ContainerStarted","Data":"afd133c086995382158ce570249f94b5258f84d690eeca505c78d04d0535ada2"}
Apr 16 20:13:31.865793 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.865773 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" event={"ID":"4f2da5bd-9a57-430a-bd75-88dce7df91b4","Type":"ContainerStarted","Data":"acc2a2ec8c1677afb6c7ebb8340681f6a7647b11c0708da94ef6f18aab5ba079"}
Apr 16 20:13:31.867391 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.867351 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" event={"ID":"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc","Type":"ContainerStarted","Data":"c1311159d97c3d0b6f224d95b3ce02426275b472caff4c4c69018accfd3a6704"}
Apr 16 20:13:31.867391 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:31.867392 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" event={"ID":"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc","Type":"ContainerStarted","Data":"d9e138fb355f126756bc661dbff6460029149269d1b1d2192a655ac15cf5186e"}
Apr 16 20:13:32.873517 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:32.873479 2569 generic.go:358] "Generic (PLEG): container finished" podID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerID="afd133c086995382158ce570249f94b5258f84d690eeca505c78d04d0535ada2" exitCode=0
Apr 16 20:13:32.874006 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:32.873569 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" event={"ID":"4f2da5bd-9a57-430a-bd75-88dce7df91b4","Type":"ContainerDied","Data":"afd133c086995382158ce570249f94b5258f84d690eeca505c78d04d0535ada2"}
Apr 16 20:13:33.880410 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:33.880358 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" event={"ID":"4f2da5bd-9a57-430a-bd75-88dce7df91b4","Type":"ContainerStarted","Data":"865f0943958c6f964c9d2c977bfd74c06e84cc87f1984f7271d169bcec3e3381"}
Apr 16 20:13:33.880804 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:33.880460 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" event={"ID":"4f2da5bd-9a57-430a-bd75-88dce7df91b4","Type":"ContainerStarted","Data":"92684193182ed9dc7405c6c13c9349617f7ef13ccd8084ab08dea186a384fab5"}
Apr 16 20:13:33.880804 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:33.880502 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"
Apr 16 20:13:33.903668 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:33.903614 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" podStartSLOduration=2.9035975609999998 podStartE2EDuration="2.903597561s" podCreationTimestamp="2026-04-16 20:13:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:13:33.902186727 +0000 UTC m=+1178.102896807" watchObservedRunningTime="2026-04-16 20:13:33.903597561 +0000 UTC m=+1178.104307669"
Apr 16 20:13:35.890544 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:35.890460 2569 generic.go:358] "Generic (PLEG): container finished" podID="4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" containerID="c1311159d97c3d0b6f224d95b3ce02426275b472caff4c4c69018accfd3a6704" exitCode=0
Apr 16 20:13:35.890933 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:35.890549 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" event={"ID":"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc","Type":"ContainerDied","Data":"c1311159d97c3d0b6f224d95b3ce02426275b472caff4c4c69018accfd3a6704"}
Apr 16 20:13:36.899198 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:36.899162 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" event={"ID":"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc","Type":"ContainerStarted","Data":"1de56de3afd8b97a7443d98fba7a47190e97033ef776a47e3c98f4d2ce243d23"}
Apr 16 20:13:36.929257 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:36.929209 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" podStartSLOduration=6.929191657 podStartE2EDuration="6.929191657s" podCreationTimestamp="2026-04-16 20:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:13:36.927809887 +0000 UTC m=+1181.128519987" watchObservedRunningTime="2026-04-16 20:13:36.929191657 +0000 UTC m=+1181.129901736"
Apr 16 20:13:41.132901 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:41.132864 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k"
Apr 16 20:13:41.133327 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:41.132971 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k"
Apr 16 20:13:41.145684 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:41.145651 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k"
Apr 16 20:13:41.413986 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:41.413954 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"
Apr 16 20:13:41.414259 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:41.414000 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"
Apr 16 20:13:41.415395 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:13:41.415368 2569 logging.go:55] [core] [Channel #464 SubChannel #465]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.54:9003", ServerName: "10.134.0.54:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.54:9003: connect: connection refused"
Apr 16 20:13:41.416668 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:41.416646 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"
Apr 16 20:13:41.923409 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:41.923381 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"
Apr 16 20:13:41.933504 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:41.933482 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k"
Apr 16 20:13:42.414957 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:42.414916 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" podUID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.54:9003\" within 1s: context deadline exceeded"
Apr 16 20:13:51.414541 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:13:51.414509 2569 logging.go:55] [core] [Channel #472 SubChannel #473]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.54:9003", ServerName: "10.134.0.54:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.54:9003: connect: connection refused"
Apr 16 20:13:52.414419 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:52.414376 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" podUID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.54:9003\" within 1s: context deadline exceeded"
Apr 16 20:13:56.356006 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:56.355967 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log"
Apr 16 20:13:56.361123 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:13:56.361097 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log"
Apr 16 20:14:02.927484 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:02.927450 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"
Apr 16 20:14:03.871965 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:03.871932 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k"]
Apr 16 20:14:03.872312 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:03.872261 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" podUID="4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" containerName="main" containerID="cri-o://1de56de3afd8b97a7443d98fba7a47190e97033ef776a47e3c98f4d2ce243d23" gracePeriod=30
Apr 16 20:14:03.874468 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:03.874444 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"]
Apr 16 20:14:03.874774 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:03.874720 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" podUID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerName="main" containerID="cri-o://92684193182ed9dc7405c6c13c9349617f7ef13ccd8084ab08dea186a384fab5" gracePeriod=30
Apr 16 20:14:03.874774 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:03.874736 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" podUID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerName="tokenizer" containerID="cri-o://865f0943958c6f964c9d2c977bfd74c06e84cc87f1984f7271d169bcec3e3381" gracePeriod=30
Apr 16 20:14:04.012843 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.012811 2569 generic.go:358] "Generic (PLEG): container finished" podID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerID="92684193182ed9dc7405c6c13c9349617f7ef13ccd8084ab08dea186a384fab5" exitCode=0
Apr 16 20:14:04.013262 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.012868 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" event={"ID":"4f2da5bd-9a57-430a-bd75-88dce7df91b4","Type":"ContainerDied","Data":"92684193182ed9dc7405c6c13c9349617f7ef13ccd8084ab08dea186a384fab5"}
Apr 16 20:14:04.014787 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.014761 2569 generic.go:358] "Generic (PLEG): container finished" podID="4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" containerID="1de56de3afd8b97a7443d98fba7a47190e97033ef776a47e3c98f4d2ce243d23" exitCode=0
Apr 16 20:14:04.014920 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.014834 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" event={"ID":"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc","Type":"ContainerDied","Data":"1de56de3afd8b97a7443d98fba7a47190e97033ef776a47e3c98f4d2ce243d23"}
Apr 16 20:14:04.130694 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.130628 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k"
Apr 16 20:14:04.250129 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.250079 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-model-cache\") pod \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") "
Apr 16 20:14:04.250318 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.250188 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-dshm\") pod \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") "
Apr 16 20:14:04.250318 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.250226 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-tls-certs\") pod \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") "
Apr 16 20:14:04.250318 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.250283 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-home\") pod \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") "
Apr 16 20:14:04.250318 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.250309 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-kserve-provision-location\") pod \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") "
Apr 16 20:14:04.250545 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.250339 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkc6m\" (UniqueName: \"kubernetes.io/projected/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-kube-api-access-fkc6m\") pod \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\" (UID: \"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc\") "
Apr 16 20:14:04.250545 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.250407 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-model-cache" (OuterVolumeSpecName: "model-cache") pod "4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" (UID: "4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:04.250704 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.250673 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-home" (OuterVolumeSpecName: "home") pod "4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" (UID: "4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:04.251211 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.251160 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-home\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:14:04.251322 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.251214 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-model-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:14:04.252823 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.252795 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-kube-api-access-fkc6m" (OuterVolumeSpecName: "kube-api-access-fkc6m") pod "4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" (UID: "4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc"). InnerVolumeSpecName "kube-api-access-fkc6m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:14:04.253124 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.253072 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" (UID: "4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:14:04.253207 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.253181 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-dshm" (OuterVolumeSpecName: "dshm") pod "4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" (UID: "4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:04.307290 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.307226 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" (UID: "4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:04.352838 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.352791 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-dshm\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:14:04.352838 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.352836 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:14:04.353122 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.352851 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:14:04.353122 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:04.352865 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fkc6m\" (UniqueName: \"kubernetes.io/projected/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc-kube-api-access-fkc6m\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:14:05.020538 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.020506 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k"
Apr 16 20:14:05.020942 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.020510 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k" event={"ID":"4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc","Type":"ContainerDied","Data":"d9e138fb355f126756bc661dbff6460029149269d1b1d2192a655ac15cf5186e"}
Apr 16 20:14:05.020942 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.020637 2569 scope.go:117] "RemoveContainer" containerID="1de56de3afd8b97a7443d98fba7a47190e97033ef776a47e3c98f4d2ce243d23"
Apr 16 20:14:05.029558 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.029534 2569 scope.go:117] "RemoveContainer" containerID="c1311159d97c3d0b6f224d95b3ce02426275b472caff4c4c69018accfd3a6704"
Apr 16 20:14:05.040232 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.040198 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k"]
Apr 16 20:14:05.046566 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.046539 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-v5r9k"]
Apr 16 20:14:05.438068 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.438047 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"
Apr 16 20:14:05.562546 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.562450 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-cache\") pod \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") "
Apr 16 20:14:05.562546 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.562506 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tls-certs\") pod \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") "
Apr 16 20:14:05.562546 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.562539 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mllgq\" (UniqueName: \"kubernetes.io/projected/4f2da5bd-9a57-430a-bd75-88dce7df91b4-kube-api-access-mllgq\") pod \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") "
Apr 16 20:14:05.562839 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.562617 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-uds\") pod \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") "
Apr 16 20:14:05.562839 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.562670 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-kserve-provision-location\") pod \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") "
Apr 16 20:14:05.562839 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.562700 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-tmp\") pod \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\" (UID: \"4f2da5bd-9a57-430a-bd75-88dce7df91b4\") "
Apr 16 20:14:05.562986 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.562868 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "4f2da5bd-9a57-430a-bd75-88dce7df91b4" (UID: "4f2da5bd-9a57-430a-bd75-88dce7df91b4"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:05.562986 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.562902 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "4f2da5bd-9a57-430a-bd75-88dce7df91b4" (UID: "4f2da5bd-9a57-430a-bd75-88dce7df91b4"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:05.563128 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.563087 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-uds\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:14:05.563128 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.563108 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:14:05.563238 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.563146 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "4f2da5bd-9a57-430a-bd75-88dce7df91b4" (UID: "4f2da5bd-9a57-430a-bd75-88dce7df91b4"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:05.563447 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.563426 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4f2da5bd-9a57-430a-bd75-88dce7df91b4" (UID: "4f2da5bd-9a57-430a-bd75-88dce7df91b4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:05.564669 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.564651 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4f2da5bd-9a57-430a-bd75-88dce7df91b4" (UID: "4f2da5bd-9a57-430a-bd75-88dce7df91b4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:14:05.564771 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.564697 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f2da5bd-9a57-430a-bd75-88dce7df91b4-kube-api-access-mllgq" (OuterVolumeSpecName: "kube-api-access-mllgq") pod "4f2da5bd-9a57-430a-bd75-88dce7df91b4" (UID: "4f2da5bd-9a57-430a-bd75-88dce7df91b4"). InnerVolumeSpecName "kube-api-access-mllgq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:14:05.663586 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.663547 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:14:05.663586 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.663580 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tokenizer-tmp\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:14:05.663586 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.663591 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2da5bd-9a57-430a-bd75-88dce7df91b4-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:14:05.663822 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:05.663600 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mllgq\" (UniqueName: \"kubernetes.io/projected/4f2da5bd-9a57-430a-bd75-88dce7df91b4-kube-api-access-mllgq\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:14:06.025909 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.025872 2569 generic.go:358] "Generic (PLEG): container finished" podID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerID="865f0943958c6f964c9d2c977bfd74c06e84cc87f1984f7271d169bcec3e3381" exitCode=0
Apr 16 20:14:06.026401 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.025955 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"
Apr 16 20:14:06.026401 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.025953 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" event={"ID":"4f2da5bd-9a57-430a-bd75-88dce7df91b4","Type":"ContainerDied","Data":"865f0943958c6f964c9d2c977bfd74c06e84cc87f1984f7271d169bcec3e3381"}
Apr 16 20:14:06.026401 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.025996 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x" event={"ID":"4f2da5bd-9a57-430a-bd75-88dce7df91b4","Type":"ContainerDied","Data":"acc2a2ec8c1677afb6c7ebb8340681f6a7647b11c0708da94ef6f18aab5ba079"}
Apr 16 20:14:06.026401 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.026030 2569 scope.go:117] "RemoveContainer" containerID="865f0943958c6f964c9d2c977bfd74c06e84cc87f1984f7271d169bcec3e3381"
Apr 16 20:14:06.034852 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.034833 2569 scope.go:117] "RemoveContainer" containerID="92684193182ed9dc7405c6c13c9349617f7ef13ccd8084ab08dea186a384fab5"
Apr 16 20:14:06.042875 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.042853 2569 scope.go:117] "RemoveContainer" containerID="afd133c086995382158ce570249f94b5258f84d690eeca505c78d04d0535ada2"
Apr 16 20:14:06.051360 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.051334 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"]
Apr 16 20:14:06.052737 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.052715 2569 scope.go:117] "RemoveContainer" containerID="865f0943958c6f964c9d2c977bfd74c06e84cc87f1984f7271d169bcec3e3381"
Apr 16 20:14:06.053050 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:14:06.053005 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865f0943958c6f964c9d2c977bfd74c06e84cc87f1984f7271d169bcec3e3381\": container with ID starting with 865f0943958c6f964c9d2c977bfd74c06e84cc87f1984f7271d169bcec3e3381 not found: ID does not exist" containerID="865f0943958c6f964c9d2c977bfd74c06e84cc87f1984f7271d169bcec3e3381"
Apr 16 20:14:06.053121 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.053059 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865f0943958c6f964c9d2c977bfd74c06e84cc87f1984f7271d169bcec3e3381"} err="failed to get container status \"865f0943958c6f964c9d2c977bfd74c06e84cc87f1984f7271d169bcec3e3381\": rpc error: code = NotFound desc = could not find container \"865f0943958c6f964c9d2c977bfd74c06e84cc87f1984f7271d169bcec3e3381\": container with ID starting with 865f0943958c6f964c9d2c977bfd74c06e84cc87f1984f7271d169bcec3e3381 not found: ID does not exist"
Apr 16 20:14:06.053121 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.053079 2569 scope.go:117] "RemoveContainer" containerID="92684193182ed9dc7405c6c13c9349617f7ef13ccd8084ab08dea186a384fab5"
Apr 16 20:14:06.053309 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:14:06.053295 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92684193182ed9dc7405c6c13c9349617f7ef13ccd8084ab08dea186a384fab5\": container with ID starting with 92684193182ed9dc7405c6c13c9349617f7ef13ccd8084ab08dea186a384fab5 not found: ID does not exist" containerID="92684193182ed9dc7405c6c13c9349617f7ef13ccd8084ab08dea186a384fab5"
Apr 16 20:14:06.053351 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.053314 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92684193182ed9dc7405c6c13c9349617f7ef13ccd8084ab08dea186a384fab5"} err="failed to get container status \"92684193182ed9dc7405c6c13c9349617f7ef13ccd8084ab08dea186a384fab5\": rpc error: code = NotFound desc = could not find container \"92684193182ed9dc7405c6c13c9349617f7ef13ccd8084ab08dea186a384fab5\": container with ID starting with 92684193182ed9dc7405c6c13c9349617f7ef13ccd8084ab08dea186a384fab5 not found: ID does not exist"
Apr 16 20:14:06.053351 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.053327 2569 scope.go:117] "RemoveContainer" containerID="afd133c086995382158ce570249f94b5258f84d690eeca505c78d04d0535ada2"
Apr 16 20:14:06.053555 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:14:06.053540 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afd133c086995382158ce570249f94b5258f84d690eeca505c78d04d0535ada2\": container with ID starting with afd133c086995382158ce570249f94b5258f84d690eeca505c78d04d0535ada2 not found: ID does not exist" containerID="afd133c086995382158ce570249f94b5258f84d690eeca505c78d04d0535ada2"
Apr 16 20:14:06.053600 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.053558 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd133c086995382158ce570249f94b5258f84d690eeca505c78d04d0535ada2"} err="failed to get container status \"afd133c086995382158ce570249f94b5258f84d690eeca505c78d04d0535ada2\": rpc error: code = NotFound desc = could not find container \"afd133c086995382158ce570249f94b5258f84d690eeca505c78d04d0535ada2\": container with ID starting with afd133c086995382158ce570249f94b5258f84d690eeca505c78d04d0535ada2 not found: ID does not exist"
Apr 16 20:14:06.057647 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.057623 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-f44f6455crj6x"]
Apr 16 20:14:06.377114 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.377033 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" path="/var/lib/kubelet/pods/4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc/volumes"
Apr 16 20:14:06.377976 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:06.377946 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" path="/var/lib/kubelet/pods/4f2da5bd-9a57-430a-bd75-88dce7df91b4/volumes"
Apr 16 20:14:14.067133 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.067095 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv"]
Apr 16 20:14:14.067706 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.067691 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerName="main"
Apr 16 20:14:14.067750 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.067711 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerName="main"
Apr 16 20:14:14.067750 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.067727 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" containerName="main"
Apr 16 20:14:14.067750 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.067736 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" containerName="main"
Apr 16 20:14:14.067843 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.067752 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerName="tokenizer"
Apr 16 20:14:14.067843 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.067761 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerName="tokenizer"
Apr 16 20:14:14.067843 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.067775 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" containerName="storage-initializer"
Apr 16 20:14:14.067843 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.067785 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" containerName="storage-initializer"
Apr 16 20:14:14.067843 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.067796 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerName="storage-initializer"
Apr 16 20:14:14.067843 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.067804 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerName="storage-initializer"
Apr 16 20:14:14.068041 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.067894 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerName="tokenizer"
Apr 16 20:14:14.068041 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.067907 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f2da5bd-9a57-430a-bd75-88dce7df91b4" containerName="main"
Apr 16 20:14:14.068041 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.067917 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ea9ee45-249c-4b4a-abb8-eb6d8ae704dc" containerName="main"
Apr 16 20:14:14.073361 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.073338 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv"
Apr 16 20:14:14.078627 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.078595 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-lktkr\""
Apr 16 20:14:14.078917 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.078899 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 16 20:14:14.084370 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.084345 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv"]
Apr 16 20:14:14.131561 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.131523 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv"
Apr 16 20:14:14.131727 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.131582 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv"
Apr 16 20:14:14.131727 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.131640 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/d586e6f7-4927-451f-9ea6-c80efb3ef358-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.131727 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.131672 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggnkv\" (UniqueName: \"kubernetes.io/projected/d586e6f7-4927-451f-9ea6-c80efb3ef358-kube-api-access-ggnkv\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.131727 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.131699 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.131872 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.131730 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.232695 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.232656 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.232695 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.232699 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d586e6f7-4927-451f-9ea6-c80efb3ef358-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.232929 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.232717 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggnkv\" (UniqueName: \"kubernetes.io/projected/d586e6f7-4927-451f-9ea6-c80efb3ef358-kube-api-access-ggnkv\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.232929 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.232849 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.233059 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.232984 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-uds\") pod 
\"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.233125 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.233101 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.233125 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.233108 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.233295 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.233226 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.233295 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.233283 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: 
\"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.233368 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.233336 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.235174 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.235154 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d586e6f7-4927-451f-9ea6-c80efb3ef358-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.241955 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.241924 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggnkv\" (UniqueName: \"kubernetes.io/projected/d586e6f7-4927-451f-9ea6-c80efb3ef358-kube-api-access-ggnkv\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.384692 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.384607 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:14.511293 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:14.511267 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv"] Apr 16 20:14:14.512957 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:14:14.512932 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd586e6f7_4927_451f_9ea6_c80efb3ef358.slice/crio-23d66c7364141a8937e995689396ce94b00fa64ce8be4d764a6ea666fd25ce51 WatchSource:0}: Error finding container 23d66c7364141a8937e995689396ce94b00fa64ce8be4d764a6ea666fd25ce51: Status 404 returned error can't find the container with id 23d66c7364141a8937e995689396ce94b00fa64ce8be4d764a6ea666fd25ce51 Apr 16 20:14:15.069156 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:15.069120 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" event={"ID":"d586e6f7-4927-451f-9ea6-c80efb3ef358","Type":"ContainerStarted","Data":"70a33d6e15bfea18f10efa7bbf34b96d3e9184e8a4704656094760a225d296fb"} Apr 16 20:14:15.069156 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:15.069158 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" event={"ID":"d586e6f7-4927-451f-9ea6-c80efb3ef358","Type":"ContainerStarted","Data":"23d66c7364141a8937e995689396ce94b00fa64ce8be4d764a6ea666fd25ce51"} Apr 16 20:14:16.074306 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:16.074267 2569 generic.go:358] "Generic (PLEG): container finished" podID="d586e6f7-4927-451f-9ea6-c80efb3ef358" containerID="70a33d6e15bfea18f10efa7bbf34b96d3e9184e8a4704656094760a225d296fb" exitCode=0 Apr 16 20:14:16.074741 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:16.074354 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" event={"ID":"d586e6f7-4927-451f-9ea6-c80efb3ef358","Type":"ContainerDied","Data":"70a33d6e15bfea18f10efa7bbf34b96d3e9184e8a4704656094760a225d296fb"} Apr 16 20:14:17.080845 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:17.080810 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" event={"ID":"d586e6f7-4927-451f-9ea6-c80efb3ef358","Type":"ContainerStarted","Data":"a5afe94a9615f64822ef2c955a3afc6f28ee62f7da0b249132ae172765d26157"} Apr 16 20:14:17.081249 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:17.080856 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" event={"ID":"d586e6f7-4927-451f-9ea6-c80efb3ef358","Type":"ContainerStarted","Data":"5fd8bfc419459c93e3775c1ce7ea48c6bb43944d27b891f1d59c369e7527bfe2"} Apr 16 20:14:17.081249 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:17.080980 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:17.109971 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:17.109910 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" podStartSLOduration=3.109892275 podStartE2EDuration="3.109892275s" podCreationTimestamp="2026-04-16 20:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:17.10620689 +0000 UTC m=+1221.306916969" watchObservedRunningTime="2026-04-16 20:14:17.109892275 +0000 UTC m=+1221.310602354" Apr 16 20:14:23.091703 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:23.091657 2569 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp"] Apr 16 20:14:23.092137 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:23.092003 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" podUID="b0705857-9508-4fa5-9ae2-8d6dbffa13ea" containerName="main" containerID="cri-o://fe96683206caef629bbac41822f8ac710eb62d4338c61a50cff1246a706e947c" gracePeriod=30 Apr 16 20:14:23.092137 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:23.092081 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" podUID="b0705857-9508-4fa5-9ae2-8d6dbffa13ea" containerName="tokenizer" containerID="cri-o://27932ae6ca71eee5f990bec293dc6ee6d251e68b31d7acbd29ae2d7f6a1985ab" gracePeriod=30 Apr 16 20:14:24.119517 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.119483 2569 generic.go:358] "Generic (PLEG): container finished" podID="b0705857-9508-4fa5-9ae2-8d6dbffa13ea" containerID="fe96683206caef629bbac41822f8ac710eb62d4338c61a50cff1246a706e947c" exitCode=0 Apr 16 20:14:24.119989 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.119551 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" event={"ID":"b0705857-9508-4fa5-9ae2-8d6dbffa13ea","Type":"ContainerDied","Data":"fe96683206caef629bbac41822f8ac710eb62d4338c61a50cff1246a706e947c"} Apr 16 20:14:24.360471 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.360449 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:14:24.385161 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.385082 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:24.385276 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.385185 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:24.388735 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.388711 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:24.429979 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.429949 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-kserve-provision-location\") pod \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " Apr 16 20:14:24.430146 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.429998 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tls-certs\") pod \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " Apr 16 20:14:24.430146 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.430044 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-uds\") pod \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " Apr 16 20:14:24.430235 
ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.430207 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-tmp\") pod \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " Apr 16 20:14:24.430297 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.430279 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b0705857-9508-4fa5-9ae2-8d6dbffa13ea" (UID: "b0705857-9508-4fa5-9ae2-8d6dbffa13ea"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:24.430363 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.430285 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-cache\") pod \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " Apr 16 20:14:24.430363 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.430330 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdgj7\" (UniqueName: \"kubernetes.io/projected/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-kube-api-access-bdgj7\") pod \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\" (UID: \"b0705857-9508-4fa5-9ae2-8d6dbffa13ea\") " Apr 16 20:14:24.430489 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.430458 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b0705857-9508-4fa5-9ae2-8d6dbffa13ea" (UID: "b0705857-9508-4fa5-9ae2-8d6dbffa13ea"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:24.430581 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.430561 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b0705857-9508-4fa5-9ae2-8d6dbffa13ea" (UID: "b0705857-9508-4fa5-9ae2-8d6dbffa13ea"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:24.430740 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.430724 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-uds\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:14:24.430798 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.430746 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-tmp\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:14:24.430798 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.430761 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tokenizer-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:14:24.430963 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.430933 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b0705857-9508-4fa5-9ae2-8d6dbffa13ea" (UID: "b0705857-9508-4fa5-9ae2-8d6dbffa13ea"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:24.432364 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.432340 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b0705857-9508-4fa5-9ae2-8d6dbffa13ea" (UID: "b0705857-9508-4fa5-9ae2-8d6dbffa13ea"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:24.432444 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.432384 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-kube-api-access-bdgj7" (OuterVolumeSpecName: "kube-api-access-bdgj7") pod "b0705857-9508-4fa5-9ae2-8d6dbffa13ea" (UID: "b0705857-9508-4fa5-9ae2-8d6dbffa13ea"). InnerVolumeSpecName "kube-api-access-bdgj7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:14:24.532212 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.532157 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bdgj7\" (UniqueName: \"kubernetes.io/projected/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-kube-api-access-bdgj7\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:14:24.532212 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.532205 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:14:24.532212 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:24.532216 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0705857-9508-4fa5-9ae2-8d6dbffa13ea-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:14:25.125564 ip-10-0-138-142 kubenswrapper[2569]: 
I0416 20:14:25.125532 2569 generic.go:358] "Generic (PLEG): container finished" podID="b0705857-9508-4fa5-9ae2-8d6dbffa13ea" containerID="27932ae6ca71eee5f990bec293dc6ee6d251e68b31d7acbd29ae2d7f6a1985ab" exitCode=0 Apr 16 20:14:25.125991 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.125609 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" Apr 16 20:14:25.125991 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.125617 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" event={"ID":"b0705857-9508-4fa5-9ae2-8d6dbffa13ea","Type":"ContainerDied","Data":"27932ae6ca71eee5f990bec293dc6ee6d251e68b31d7acbd29ae2d7f6a1985ab"} Apr 16 20:14:25.125991 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.125659 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp" event={"ID":"b0705857-9508-4fa5-9ae2-8d6dbffa13ea","Type":"ContainerDied","Data":"0b1804bdd418e1a9f3a501205fda15f465cdeb99e71caa34ff653bc5f21c843e"} Apr 16 20:14:25.125991 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.125678 2569 scope.go:117] "RemoveContainer" containerID="27932ae6ca71eee5f990bec293dc6ee6d251e68b31d7acbd29ae2d7f6a1985ab" Apr 16 20:14:25.127412 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.127385 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:14:25.134935 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.134923 2569 scope.go:117] "RemoveContainer" containerID="fe96683206caef629bbac41822f8ac710eb62d4338c61a50cff1246a706e947c" Apr 16 20:14:25.142736 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.142717 2569 scope.go:117] "RemoveContainer" 
containerID="818f2c8aa5a7218d90d624db7b05ecebebf08474b34b77d3936b7602eeb5723f" Apr 16 20:14:25.151101 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.151073 2569 scope.go:117] "RemoveContainer" containerID="27932ae6ca71eee5f990bec293dc6ee6d251e68b31d7acbd29ae2d7f6a1985ab" Apr 16 20:14:25.151404 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:14:25.151383 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27932ae6ca71eee5f990bec293dc6ee6d251e68b31d7acbd29ae2d7f6a1985ab\": container with ID starting with 27932ae6ca71eee5f990bec293dc6ee6d251e68b31d7acbd29ae2d7f6a1985ab not found: ID does not exist" containerID="27932ae6ca71eee5f990bec293dc6ee6d251e68b31d7acbd29ae2d7f6a1985ab" Apr 16 20:14:25.151481 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.151414 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27932ae6ca71eee5f990bec293dc6ee6d251e68b31d7acbd29ae2d7f6a1985ab"} err="failed to get container status \"27932ae6ca71eee5f990bec293dc6ee6d251e68b31d7acbd29ae2d7f6a1985ab\": rpc error: code = NotFound desc = could not find container \"27932ae6ca71eee5f990bec293dc6ee6d251e68b31d7acbd29ae2d7f6a1985ab\": container with ID starting with 27932ae6ca71eee5f990bec293dc6ee6d251e68b31d7acbd29ae2d7f6a1985ab not found: ID does not exist" Apr 16 20:14:25.151481 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.151433 2569 scope.go:117] "RemoveContainer" containerID="fe96683206caef629bbac41822f8ac710eb62d4338c61a50cff1246a706e947c" Apr 16 20:14:25.151722 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:14:25.151689 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe96683206caef629bbac41822f8ac710eb62d4338c61a50cff1246a706e947c\": container with ID starting with fe96683206caef629bbac41822f8ac710eb62d4338c61a50cff1246a706e947c not found: ID does not exist" 
containerID="fe96683206caef629bbac41822f8ac710eb62d4338c61a50cff1246a706e947c" Apr 16 20:14:25.151800 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.151730 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe96683206caef629bbac41822f8ac710eb62d4338c61a50cff1246a706e947c"} err="failed to get container status \"fe96683206caef629bbac41822f8ac710eb62d4338c61a50cff1246a706e947c\": rpc error: code = NotFound desc = could not find container \"fe96683206caef629bbac41822f8ac710eb62d4338c61a50cff1246a706e947c\": container with ID starting with fe96683206caef629bbac41822f8ac710eb62d4338c61a50cff1246a706e947c not found: ID does not exist" Apr 16 20:14:25.151800 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.151747 2569 scope.go:117] "RemoveContainer" containerID="818f2c8aa5a7218d90d624db7b05ecebebf08474b34b77d3936b7602eeb5723f" Apr 16 20:14:25.152024 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:14:25.151985 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"818f2c8aa5a7218d90d624db7b05ecebebf08474b34b77d3936b7602eeb5723f\": container with ID starting with 818f2c8aa5a7218d90d624db7b05ecebebf08474b34b77d3936b7602eeb5723f not found: ID does not exist" containerID="818f2c8aa5a7218d90d624db7b05ecebebf08474b34b77d3936b7602eeb5723f" Apr 16 20:14:25.152249 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.152220 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"818f2c8aa5a7218d90d624db7b05ecebebf08474b34b77d3936b7602eeb5723f"} err="failed to get container status \"818f2c8aa5a7218d90d624db7b05ecebebf08474b34b77d3936b7602eeb5723f\": rpc error: code = NotFound desc = could not find container \"818f2c8aa5a7218d90d624db7b05ecebebf08474b34b77d3936b7602eeb5723f\": container with ID starting with 818f2c8aa5a7218d90d624db7b05ecebebf08474b34b77d3936b7602eeb5723f not found: ID does not exist" Apr 16 
20:14:25.162640 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.162613 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp"] Apr 16 20:14:25.167000 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:25.166977 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5c9c487bd-kpdgp"] Apr 16 20:14:26.373177 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:26.373140 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0705857-9508-4fa5-9ae2-8d6dbffa13ea" path="/var/lib/kubelet/pods/b0705857-9508-4fa5-9ae2-8d6dbffa13ea/volumes" Apr 16 20:14:30.755671 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.755638 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w"] Apr 16 20:14:30.756275 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.756257 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0705857-9508-4fa5-9ae2-8d6dbffa13ea" containerName="tokenizer" Apr 16 20:14:30.756334 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.756279 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0705857-9508-4fa5-9ae2-8d6dbffa13ea" containerName="tokenizer" Apr 16 20:14:30.756334 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.756310 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0705857-9508-4fa5-9ae2-8d6dbffa13ea" containerName="main" Apr 16 20:14:30.756334 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.756319 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0705857-9508-4fa5-9ae2-8d6dbffa13ea" containerName="main" Apr 16 20:14:30.756430 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.756334 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b0705857-9508-4fa5-9ae2-8d6dbffa13ea" containerName="storage-initializer" Apr 16 20:14:30.756430 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.756345 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0705857-9508-4fa5-9ae2-8d6dbffa13ea" containerName="storage-initializer" Apr 16 20:14:30.756430 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.756425 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0705857-9508-4fa5-9ae2-8d6dbffa13ea" containerName="main" Apr 16 20:14:30.756526 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.756437 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0705857-9508-4fa5-9ae2-8d6dbffa13ea" containerName="tokenizer" Apr 16 20:14:30.761918 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.761892 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.764494 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.764471 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 20:14:30.764617 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.764472 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-xv4qr\"" Apr 16 20:14:30.771191 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.771079 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w"] Apr 16 20:14:30.883816 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.883776 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbl5d\" (UniqueName: 
\"kubernetes.io/projected/c247d0f0-c4f1-4731-b866-7b91723a1a54-kube-api-access-bbl5d\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.884118 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.883831 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.884118 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.883918 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c247d0f0-c4f1-4731-b866-7b91723a1a54-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.884118 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.883967 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.884118 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.884072 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.884118 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.884111 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.985286 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.985242 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.985286 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.985293 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.985559 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.985367 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbl5d\" 
(UniqueName: \"kubernetes.io/projected/c247d0f0-c4f1-4731-b866-7b91723a1a54-kube-api-access-bbl5d\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.985559 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.985412 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.985559 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.985446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c247d0f0-c4f1-4731-b866-7b91723a1a54-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.985559 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.985476 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.985775 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.985648 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.985775 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.985720 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.985775 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.985744 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.985912 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.985827 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.987963 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.987939 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c247d0f0-c4f1-4731-b866-7b91723a1a54-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:30.993086 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:30.993054 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbl5d\" (UniqueName: \"kubernetes.io/projected/c247d0f0-c4f1-4731-b866-7b91723a1a54-kube-api-access-bbl5d\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:31.073399 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:31.073303 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:31.207285 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:31.207253 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w"] Apr 16 20:14:31.208423 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:14:31.208390 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc247d0f0_c4f1_4731_b866_7b91723a1a54.slice/crio-4f05cb4a64a6476a1f6ac4cc7e7d1dc949a49c7201874d5d280ca1f256107e71 WatchSource:0}: Error finding container 4f05cb4a64a6476a1f6ac4cc7e7d1dc949a49c7201874d5d280ca1f256107e71: Status 404 returned error can't find the container with id 4f05cb4a64a6476a1f6ac4cc7e7d1dc949a49c7201874d5d280ca1f256107e71 Apr 16 20:14:32.157073 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:32.157038 2569 generic.go:358] "Generic (PLEG): container finished" podID="c247d0f0-c4f1-4731-b866-7b91723a1a54" 
containerID="38e3577f5466110c91c183551f13a4f8e548122dc4bf7c3d056a70552092028a" exitCode=0 Apr 16 20:14:32.157488 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:32.157082 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" event={"ID":"c247d0f0-c4f1-4731-b866-7b91723a1a54","Type":"ContainerDied","Data":"38e3577f5466110c91c183551f13a4f8e548122dc4bf7c3d056a70552092028a"} Apr 16 20:14:32.157488 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:32.157106 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" event={"ID":"c247d0f0-c4f1-4731-b866-7b91723a1a54","Type":"ContainerStarted","Data":"4f05cb4a64a6476a1f6ac4cc7e7d1dc949a49c7201874d5d280ca1f256107e71"} Apr 16 20:14:33.163142 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:33.163110 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" event={"ID":"c247d0f0-c4f1-4731-b866-7b91723a1a54","Type":"ContainerStarted","Data":"0ea678847683ba64cdd6fb4c34c8f2ae72aaea76858e17a269f9d44b55568a15"} Apr 16 20:14:33.163142 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:33.163145 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" event={"ID":"c247d0f0-c4f1-4731-b866-7b91723a1a54","Type":"ContainerStarted","Data":"7717bdc3fb575c022242097b7e3408544df68e9e5387db789348c38f34d262eb"} Apr 16 20:14:33.163578 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:33.163168 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:33.195221 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:33.195161 2569 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" podStartSLOduration=3.195144241 podStartE2EDuration="3.195144241s" podCreationTimestamp="2026-04-16 20:14:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:33.19432796 +0000 UTC m=+1237.395038039" watchObservedRunningTime="2026-04-16 20:14:33.195144241 +0000 UTC m=+1237.395854346" Apr 16 20:14:41.073627 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:41.073573 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:41.074130 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:41.073738 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:41.076452 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:41.076428 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:41.205138 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:41.205112 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:14:47.137082 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:14:47.136976 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:15:03.214278 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:15:03.214244 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" Apr 16 20:16:35.204653 ip-10-0-138-142 
kubenswrapper[2569]: I0416 20:16:35.204609 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv"] Apr 16 20:16:35.205168 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:35.205118 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" podUID="d586e6f7-4927-451f-9ea6-c80efb3ef358" containerName="main" containerID="cri-o://5fd8bfc419459c93e3775c1ce7ea48c6bb43944d27b891f1d59c369e7527bfe2" gracePeriod=30 Apr 16 20:16:35.205319 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:35.205266 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" podUID="d586e6f7-4927-451f-9ea6-c80efb3ef358" containerName="tokenizer" containerID="cri-o://a5afe94a9615f64822ef2c955a3afc6f28ee62f7da0b249132ae172765d26157" gracePeriod=30 Apr 16 20:16:35.644668 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:35.644572 2569 generic.go:358] "Generic (PLEG): container finished" podID="d586e6f7-4927-451f-9ea6-c80efb3ef358" containerID="5fd8bfc419459c93e3775c1ce7ea48c6bb43944d27b891f1d59c369e7527bfe2" exitCode=0 Apr 16 20:16:35.644813 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:35.644655 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" event={"ID":"d586e6f7-4927-451f-9ea6-c80efb3ef358","Type":"ContainerDied","Data":"5fd8bfc419459c93e3775c1ce7ea48c6bb43944d27b891f1d59c369e7527bfe2"} Apr 16 20:16:36.457136 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.457107 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:16:36.555809 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.555734 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggnkv\" (UniqueName: \"kubernetes.io/projected/d586e6f7-4927-451f-9ea6-c80efb3ef358-kube-api-access-ggnkv\") pod \"d586e6f7-4927-451f-9ea6-c80efb3ef358\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " Apr 16 20:16:36.555809 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.555780 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-cache\") pod \"d586e6f7-4927-451f-9ea6-c80efb3ef358\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " Apr 16 20:16:36.555809 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.555801 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d586e6f7-4927-451f-9ea6-c80efb3ef358-tls-certs\") pod \"d586e6f7-4927-451f-9ea6-c80efb3ef358\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " Apr 16 20:16:36.556153 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.555822 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-uds\") pod \"d586e6f7-4927-451f-9ea6-c80efb3ef358\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " Apr 16 20:16:36.556153 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.555856 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-tmp\") pod \"d586e6f7-4927-451f-9ea6-c80efb3ef358\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " Apr 16 20:16:36.556153 
ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.555894 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-kserve-provision-location\") pod \"d586e6f7-4927-451f-9ea6-c80efb3ef358\" (UID: \"d586e6f7-4927-451f-9ea6-c80efb3ef358\") " Apr 16 20:16:36.556153 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.556079 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d586e6f7-4927-451f-9ea6-c80efb3ef358" (UID: "d586e6f7-4927-451f-9ea6-c80efb3ef358"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:16:36.556324 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.556170 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d586e6f7-4927-451f-9ea6-c80efb3ef358" (UID: "d586e6f7-4927-451f-9ea6-c80efb3ef358"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:16:36.556324 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.556236 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d586e6f7-4927-451f-9ea6-c80efb3ef358" (UID: "d586e6f7-4927-451f-9ea6-c80efb3ef358"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:16:36.556757 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.556733 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d586e6f7-4927-451f-9ea6-c80efb3ef358" (UID: "d586e6f7-4927-451f-9ea6-c80efb3ef358"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:16:36.558034 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.558005 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d586e6f7-4927-451f-9ea6-c80efb3ef358-kube-api-access-ggnkv" (OuterVolumeSpecName: "kube-api-access-ggnkv") pod "d586e6f7-4927-451f-9ea6-c80efb3ef358" (UID: "d586e6f7-4927-451f-9ea6-c80efb3ef358"). InnerVolumeSpecName "kube-api-access-ggnkv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:16:36.558106 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.558078 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d586e6f7-4927-451f-9ea6-c80efb3ef358-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d586e6f7-4927-451f-9ea6-c80efb3ef358" (UID: "d586e6f7-4927-451f-9ea6-c80efb3ef358"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:36.650642 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.650603 2569 generic.go:358] "Generic (PLEG): container finished" podID="d586e6f7-4927-451f-9ea6-c80efb3ef358" containerID="a5afe94a9615f64822ef2c955a3afc6f28ee62f7da0b249132ae172765d26157" exitCode=0 Apr 16 20:16:36.650831 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.650691 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" Apr 16 20:16:36.650831 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.650692 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" event={"ID":"d586e6f7-4927-451f-9ea6-c80efb3ef358","Type":"ContainerDied","Data":"a5afe94a9615f64822ef2c955a3afc6f28ee62f7da0b249132ae172765d26157"} Apr 16 20:16:36.650831 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.650820 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv" event={"ID":"d586e6f7-4927-451f-9ea6-c80efb3ef358","Type":"ContainerDied","Data":"23d66c7364141a8937e995689396ce94b00fa64ce8be4d764a6ea666fd25ce51"} Apr 16 20:16:36.650984 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.650844 2569 scope.go:117] "RemoveContainer" containerID="a5afe94a9615f64822ef2c955a3afc6f28ee62f7da0b249132ae172765d26157" Apr 16 20:16:36.656832 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.656802 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-uds\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:36.656999 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.656836 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-tmp\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:36.656999 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.656851 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath 
\"\"" Apr 16 20:16:36.656999 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.656867 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ggnkv\" (UniqueName: \"kubernetes.io/projected/d586e6f7-4927-451f-9ea6-c80efb3ef358-kube-api-access-ggnkv\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:36.656999 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.656882 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d586e6f7-4927-451f-9ea6-c80efb3ef358-tokenizer-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:36.656999 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.656897 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d586e6f7-4927-451f-9ea6-c80efb3ef358-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:36.660177 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.660161 2569 scope.go:117] "RemoveContainer" containerID="5fd8bfc419459c93e3775c1ce7ea48c6bb43944d27b891f1d59c369e7527bfe2" Apr 16 20:16:36.668586 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.668566 2569 scope.go:117] "RemoveContainer" containerID="70a33d6e15bfea18f10efa7bbf34b96d3e9184e8a4704656094760a225d296fb" Apr 16 20:16:36.674586 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.674563 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv"] Apr 16 20:16:36.678494 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.678466 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-5czsv"] Apr 16 20:16:36.678812 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.678795 2569 scope.go:117] "RemoveContainer" containerID="a5afe94a9615f64822ef2c955a3afc6f28ee62f7da0b249132ae172765d26157" Apr 16 
20:16:36.679096 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:16:36.679076 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5afe94a9615f64822ef2c955a3afc6f28ee62f7da0b249132ae172765d26157\": container with ID starting with a5afe94a9615f64822ef2c955a3afc6f28ee62f7da0b249132ae172765d26157 not found: ID does not exist" containerID="a5afe94a9615f64822ef2c955a3afc6f28ee62f7da0b249132ae172765d26157" Apr 16 20:16:36.679188 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.679111 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5afe94a9615f64822ef2c955a3afc6f28ee62f7da0b249132ae172765d26157"} err="failed to get container status \"a5afe94a9615f64822ef2c955a3afc6f28ee62f7da0b249132ae172765d26157\": rpc error: code = NotFound desc = could not find container \"a5afe94a9615f64822ef2c955a3afc6f28ee62f7da0b249132ae172765d26157\": container with ID starting with a5afe94a9615f64822ef2c955a3afc6f28ee62f7da0b249132ae172765d26157 not found: ID does not exist" Apr 16 20:16:36.679188 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.679139 2569 scope.go:117] "RemoveContainer" containerID="5fd8bfc419459c93e3775c1ce7ea48c6bb43944d27b891f1d59c369e7527bfe2" Apr 16 20:16:36.679416 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:16:36.679396 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fd8bfc419459c93e3775c1ce7ea48c6bb43944d27b891f1d59c369e7527bfe2\": container with ID starting with 5fd8bfc419459c93e3775c1ce7ea48c6bb43944d27b891f1d59c369e7527bfe2 not found: ID does not exist" containerID="5fd8bfc419459c93e3775c1ce7ea48c6bb43944d27b891f1d59c369e7527bfe2" Apr 16 20:16:36.679455 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.679422 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5fd8bfc419459c93e3775c1ce7ea48c6bb43944d27b891f1d59c369e7527bfe2"} err="failed to get container status \"5fd8bfc419459c93e3775c1ce7ea48c6bb43944d27b891f1d59c369e7527bfe2\": rpc error: code = NotFound desc = could not find container \"5fd8bfc419459c93e3775c1ce7ea48c6bb43944d27b891f1d59c369e7527bfe2\": container with ID starting with 5fd8bfc419459c93e3775c1ce7ea48c6bb43944d27b891f1d59c369e7527bfe2 not found: ID does not exist"
Apr 16 20:16:36.679455 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.679439 2569 scope.go:117] "RemoveContainer" containerID="70a33d6e15bfea18f10efa7bbf34b96d3e9184e8a4704656094760a225d296fb"
Apr 16 20:16:36.679664 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:16:36.679647 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70a33d6e15bfea18f10efa7bbf34b96d3e9184e8a4704656094760a225d296fb\": container with ID starting with 70a33d6e15bfea18f10efa7bbf34b96d3e9184e8a4704656094760a225d296fb not found: ID does not exist" containerID="70a33d6e15bfea18f10efa7bbf34b96d3e9184e8a4704656094760a225d296fb"
Apr 16 20:16:36.679723 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:36.679673 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a33d6e15bfea18f10efa7bbf34b96d3e9184e8a4704656094760a225d296fb"} err="failed to get container status \"70a33d6e15bfea18f10efa7bbf34b96d3e9184e8a4704656094760a225d296fb\": rpc error: code = NotFound desc = could not find container \"70a33d6e15bfea18f10efa7bbf34b96d3e9184e8a4704656094760a225d296fb\": container with ID starting with 70a33d6e15bfea18f10efa7bbf34b96d3e9184e8a4704656094760a225d296fb not found: ID does not exist"
Apr 16 20:16:38.371683 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:38.371637 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d586e6f7-4927-451f-9ea6-c80efb3ef358" path="/var/lib/kubelet/pods/d586e6f7-4927-451f-9ea6-c80efb3ef358/volumes"
Apr 16 20:16:59.035905 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.035870 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"]
Apr 16 20:16:59.036273 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.036265 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d586e6f7-4927-451f-9ea6-c80efb3ef358" containerName="main"
Apr 16 20:16:59.036316 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.036277 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d586e6f7-4927-451f-9ea6-c80efb3ef358" containerName="main"
Apr 16 20:16:59.036316 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.036299 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d586e6f7-4927-451f-9ea6-c80efb3ef358" containerName="storage-initializer"
Apr 16 20:16:59.036316 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.036305 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d586e6f7-4927-451f-9ea6-c80efb3ef358" containerName="storage-initializer"
Apr 16 20:16:59.036316 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.036312 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d586e6f7-4927-451f-9ea6-c80efb3ef358" containerName="tokenizer"
Apr 16 20:16:59.036316 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.036318 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d586e6f7-4927-451f-9ea6-c80efb3ef358" containerName="tokenizer"
Apr 16 20:16:59.036469 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.036370 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d586e6f7-4927-451f-9ea6-c80efb3ef358" containerName="main"
Apr 16 20:16:59.036469 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.036379 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d586e6f7-4927-451f-9ea6-c80efb3ef358" containerName="tokenizer"
Apr 16 20:16:59.040833 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.040812 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.043337 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.043312 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-vwc2n\""
Apr 16 20:16:59.043469 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.043358 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 16 20:16:59.050919 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.050893 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"]
Apr 16 20:16:59.154184 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.154143 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.154360 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.154192 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.154360 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.154272 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.154472 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.154358 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.154472 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.154396 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.154472 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.154421 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv98b\" (UniqueName: \"kubernetes.io/projected/b5372ab0-f169-422f-a13a-43e9ff90ccc0-kube-api-access-wv98b\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.255669 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.255478 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wv98b\" (UniqueName: \"kubernetes.io/projected/b5372ab0-f169-422f-a13a-43e9ff90ccc0-kube-api-access-wv98b\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.255861 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.255710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.255861 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.255742 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.255861 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.255802 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.256060 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.255881 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.256060 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.255917 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.256267 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.256238 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.256344 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.256250 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.256550 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.256529 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.256625 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.256535 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.258377 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.258357 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.264660 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.264641 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv98b\" (UniqueName: \"kubernetes.io/projected/b5372ab0-f169-422f-a13a-43e9ff90ccc0-kube-api-access-wv98b\") pod \"stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.352001 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.351900 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:16:59.479419 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.479390 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"]
Apr 16 20:16:59.480206 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:16:59.480173 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5372ab0_f169_422f_a13a_43e9ff90ccc0.slice/crio-0721a549ab71bba0db840a5a7026048f273fd9d42a16daba7a7ab74867af579a WatchSource:0}: Error finding container 0721a549ab71bba0db840a5a7026048f273fd9d42a16daba7a7ab74867af579a: Status 404 returned error can't find the container with id 0721a549ab71bba0db840a5a7026048f273fd9d42a16daba7a7ab74867af579a
Apr 16 20:16:59.482546 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.482529 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:16:59.740283 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.740239 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65" event={"ID":"b5372ab0-f169-422f-a13a-43e9ff90ccc0","Type":"ContainerStarted","Data":"ee1ab6dc5da90fee1b80517304fb4bdca9111c7a187c587d0769b5cbe07cc804"}
Apr 16 20:16:59.740283 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:16:59.740289 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65" event={"ID":"b5372ab0-f169-422f-a13a-43e9ff90ccc0","Type":"ContainerStarted","Data":"0721a549ab71bba0db840a5a7026048f273fd9d42a16daba7a7ab74867af579a"}
Apr 16 20:17:00.746303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:00.746263 2569 generic.go:358] "Generic (PLEG): container finished" podID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" containerID="ee1ab6dc5da90fee1b80517304fb4bdca9111c7a187c587d0769b5cbe07cc804" exitCode=0
Apr 16 20:17:00.746790 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:00.746356 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65" event={"ID":"b5372ab0-f169-422f-a13a-43e9ff90ccc0","Type":"ContainerDied","Data":"ee1ab6dc5da90fee1b80517304fb4bdca9111c7a187c587d0769b5cbe07cc804"}
Apr 16 20:17:01.753436 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:01.753406 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65" event={"ID":"b5372ab0-f169-422f-a13a-43e9ff90ccc0","Type":"ContainerStarted","Data":"75c5f418ff217e3609b9f02597d3f5403fb87c30b6c446710f770cb6e3aae01e"}
Apr 16 20:17:01.753436 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:01.753440 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65" event={"ID":"b5372ab0-f169-422f-a13a-43e9ff90ccc0","Type":"ContainerStarted","Data":"30e1f7712325a776bc87ca8a2b31fcede259046613be9b14e231e8e0373d2e4a"}
Apr 16 20:17:01.753904 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:01.753506 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:17:01.775998 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:01.775944 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65" podStartSLOduration=2.77592772 podStartE2EDuration="2.77592772s" podCreationTimestamp="2026-04-16 20:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:17:01.772741891 +0000 UTC m=+1385.973451969" watchObservedRunningTime="2026-04-16 20:17:01.77592772 +0000 UTC m=+1385.976637797"
Apr 16 20:17:09.352715 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:09.352677 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:17:09.353306 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:09.352728 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:17:09.355503 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:09.355481 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:17:09.786156 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:09.786129 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:17:30.790914 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:30.790885 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"
Apr 16 20:17:32.307271 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:32.307239 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w"]
Apr 16 20:17:32.307744 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:32.307571 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" podUID="c247d0f0-c4f1-4731-b866-7b91723a1a54" containerName="main" containerID="cri-o://7717bdc3fb575c022242097b7e3408544df68e9e5387db789348c38f34d262eb" gracePeriod=30
Apr 16 20:17:32.307744 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:32.307618 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" podUID="c247d0f0-c4f1-4731-b866-7b91723a1a54" containerName="tokenizer" containerID="cri-o://0ea678847683ba64cdd6fb4c34c8f2ae72aaea76858e17a269f9d44b55568a15" gracePeriod=30
Apr 16 20:17:32.879613 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:32.879578 2569 generic.go:358] "Generic (PLEG): container finished" podID="c247d0f0-c4f1-4731-b866-7b91723a1a54" containerID="7717bdc3fb575c022242097b7e3408544df68e9e5387db789348c38f34d262eb" exitCode=0
Apr 16 20:17:32.879797 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:32.879647 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" event={"ID":"c247d0f0-c4f1-4731-b866-7b91723a1a54","Type":"ContainerDied","Data":"7717bdc3fb575c022242097b7e3408544df68e9e5387db789348c38f34d262eb"}
Apr 16 20:17:33.213094 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:17:33.213066 2569 logging.go:55] [core] [Channel #684 SubChannel #685]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.56:9003", ServerName: "10.134.0.56:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.56:9003: connect: connection refused"
Apr 16 20:17:33.565140 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.565116 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w"
Apr 16 20:17:33.662860 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.662819 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-kserve-provision-location\") pod \"c247d0f0-c4f1-4731-b866-7b91723a1a54\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") "
Apr 16 20:17:33.662860 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.662868 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbl5d\" (UniqueName: \"kubernetes.io/projected/c247d0f0-c4f1-4731-b866-7b91723a1a54-kube-api-access-bbl5d\") pod \"c247d0f0-c4f1-4731-b866-7b91723a1a54\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") "
Apr 16 20:17:33.663159 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.662891 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c247d0f0-c4f1-4731-b866-7b91723a1a54-tls-certs\") pod \"c247d0f0-c4f1-4731-b866-7b91723a1a54\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") "
Apr 16 20:17:33.663159 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.662931 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-cache\") pod \"c247d0f0-c4f1-4731-b866-7b91723a1a54\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") "
Apr 16 20:17:33.663159 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.662969 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-uds\") pod \"c247d0f0-c4f1-4731-b866-7b91723a1a54\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") "
Apr 16 20:17:33.663159 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.663044 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-tmp\") pod \"c247d0f0-c4f1-4731-b866-7b91723a1a54\" (UID: \"c247d0f0-c4f1-4731-b866-7b91723a1a54\") "
Apr 16 20:17:33.663378 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.663259 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c247d0f0-c4f1-4731-b866-7b91723a1a54" (UID: "c247d0f0-c4f1-4731-b866-7b91723a1a54"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:17:33.663378 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.663279 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c247d0f0-c4f1-4731-b866-7b91723a1a54" (UID: "c247d0f0-c4f1-4731-b866-7b91723a1a54"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:17:33.663489 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.663402 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-uds\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:17:33.663489 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.663417 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:17:33.663489 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.663414 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c247d0f0-c4f1-4731-b866-7b91723a1a54" (UID: "c247d0f0-c4f1-4731-b866-7b91723a1a54"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:17:33.663754 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.663730 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c247d0f0-c4f1-4731-b866-7b91723a1a54" (UID: "c247d0f0-c4f1-4731-b866-7b91723a1a54"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:17:33.665093 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.665070 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c247d0f0-c4f1-4731-b866-7b91723a1a54-kube-api-access-bbl5d" (OuterVolumeSpecName: "kube-api-access-bbl5d") pod "c247d0f0-c4f1-4731-b866-7b91723a1a54" (UID: "c247d0f0-c4f1-4731-b866-7b91723a1a54"). InnerVolumeSpecName "kube-api-access-bbl5d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:17:33.665171 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.665140 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c247d0f0-c4f1-4731-b866-7b91723a1a54-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c247d0f0-c4f1-4731-b866-7b91723a1a54" (UID: "c247d0f0-c4f1-4731-b866-7b91723a1a54"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:17:33.764827 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.764742 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-tokenizer-tmp\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:17:33.764827 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.764773 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c247d0f0-c4f1-4731-b866-7b91723a1a54-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:17:33.764827 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.764784 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bbl5d\" (UniqueName: \"kubernetes.io/projected/c247d0f0-c4f1-4731-b866-7b91723a1a54-kube-api-access-bbl5d\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:17:33.764827 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.764793 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c247d0f0-c4f1-4731-b866-7b91723a1a54-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\""
Apr 16 20:17:33.885307 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.885275 2569 generic.go:358] "Generic (PLEG): container finished" podID="c247d0f0-c4f1-4731-b866-7b91723a1a54" containerID="0ea678847683ba64cdd6fb4c34c8f2ae72aaea76858e17a269f9d44b55568a15" exitCode=0
Apr 16 20:17:33.885479 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.885350 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w"
Apr 16 20:17:33.885479 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.885363 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" event={"ID":"c247d0f0-c4f1-4731-b866-7b91723a1a54","Type":"ContainerDied","Data":"0ea678847683ba64cdd6fb4c34c8f2ae72aaea76858e17a269f9d44b55568a15"}
Apr 16 20:17:33.885479 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.885408 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" event={"ID":"c247d0f0-c4f1-4731-b866-7b91723a1a54","Type":"ContainerDied","Data":"4f05cb4a64a6476a1f6ac4cc7e7d1dc949a49c7201874d5d280ca1f256107e71"}
Apr 16 20:17:33.885479 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.885425 2569 scope.go:117] "RemoveContainer" containerID="0ea678847683ba64cdd6fb4c34c8f2ae72aaea76858e17a269f9d44b55568a15"
Apr 16 20:17:33.894074 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.894059 2569 scope.go:117] "RemoveContainer" containerID="7717bdc3fb575c022242097b7e3408544df68e9e5387db789348c38f34d262eb"
Apr 16 20:17:33.901629 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.901610 2569 scope.go:117] "RemoveContainer" containerID="38e3577f5466110c91c183551f13a4f8e548122dc4bf7c3d056a70552092028a"
Apr 16 20:17:33.907940 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.907918 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w"]
Apr 16 20:17:33.910583 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.910560 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w"]
Apr 16 20:17:33.910690 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.910562 2569 scope.go:117] "RemoveContainer" containerID="0ea678847683ba64cdd6fb4c34c8f2ae72aaea76858e17a269f9d44b55568a15"
Apr 16 20:17:33.910835 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:17:33.910819 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ea678847683ba64cdd6fb4c34c8f2ae72aaea76858e17a269f9d44b55568a15\": container with ID starting with 0ea678847683ba64cdd6fb4c34c8f2ae72aaea76858e17a269f9d44b55568a15 not found: ID does not exist" containerID="0ea678847683ba64cdd6fb4c34c8f2ae72aaea76858e17a269f9d44b55568a15"
Apr 16 20:17:33.910884 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.910841 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea678847683ba64cdd6fb4c34c8f2ae72aaea76858e17a269f9d44b55568a15"} err="failed to get container status \"0ea678847683ba64cdd6fb4c34c8f2ae72aaea76858e17a269f9d44b55568a15\": rpc error: code = NotFound desc = could not find container \"0ea678847683ba64cdd6fb4c34c8f2ae72aaea76858e17a269f9d44b55568a15\": container with ID starting with 0ea678847683ba64cdd6fb4c34c8f2ae72aaea76858e17a269f9d44b55568a15 not found: ID does not exist"
Apr 16 20:17:33.910884 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.910857 2569 scope.go:117] "RemoveContainer" containerID="7717bdc3fb575c022242097b7e3408544df68e9e5387db789348c38f34d262eb"
Apr 16 20:17:33.911118 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:17:33.911087 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7717bdc3fb575c022242097b7e3408544df68e9e5387db789348c38f34d262eb\": container with ID starting with 7717bdc3fb575c022242097b7e3408544df68e9e5387db789348c38f34d262eb not found: ID does not exist" containerID="7717bdc3fb575c022242097b7e3408544df68e9e5387db789348c38f34d262eb"
Apr 16 20:17:33.911185 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.911124 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7717bdc3fb575c022242097b7e3408544df68e9e5387db789348c38f34d262eb"} err="failed to get container status \"7717bdc3fb575c022242097b7e3408544df68e9e5387db789348c38f34d262eb\": rpc error: code = NotFound desc = could not find container \"7717bdc3fb575c022242097b7e3408544df68e9e5387db789348c38f34d262eb\": container with ID starting with 7717bdc3fb575c022242097b7e3408544df68e9e5387db789348c38f34d262eb not found: ID does not exist"
Apr 16 20:17:33.911185 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.911140 2569 scope.go:117] "RemoveContainer" containerID="38e3577f5466110c91c183551f13a4f8e548122dc4bf7c3d056a70552092028a"
Apr 16 20:17:33.911357 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:17:33.911336 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e3577f5466110c91c183551f13a4f8e548122dc4bf7c3d056a70552092028a\": container with ID starting with 38e3577f5466110c91c183551f13a4f8e548122dc4bf7c3d056a70552092028a not found: ID does not exist" containerID="38e3577f5466110c91c183551f13a4f8e548122dc4bf7c3d056a70552092028a"
Apr 16 20:17:33.911405 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:33.911359 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e3577f5466110c91c183551f13a4f8e548122dc4bf7c3d056a70552092028a"} err="failed to get container status \"38e3577f5466110c91c183551f13a4f8e548122dc4bf7c3d056a70552092028a\": rpc error: code = NotFound desc = could not find container \"38e3577f5466110c91c183551f13a4f8e548122dc4bf7c3d056a70552092028a\": container with ID starting with 38e3577f5466110c91c183551f13a4f8e548122dc4bf7c3d056a70552092028a not found: ID does not exist"
Apr 16 20:17:34.213805 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:34.213757 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schevm66w" podUID="c247d0f0-c4f1-4731-b866-7b91723a1a54" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.56:9003\" within 1s: context deadline exceeded"
Apr 16 20:17:34.375191 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:34.375155 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c247d0f0-c4f1-4731-b866-7b91723a1a54" path="/var/lib/kubelet/pods/c247d0f0-c4f1-4731-b866-7b91723a1a54/volumes"
Apr 16 20:17:35.815817 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.815779 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp"]
Apr 16 20:17:35.816258 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.816242 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c247d0f0-c4f1-4731-b866-7b91723a1a54" containerName="main"
Apr 16 20:17:35.816305 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.816260 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c247d0f0-c4f1-4731-b866-7b91723a1a54" containerName="main"
Apr 16 20:17:35.816305 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.816268 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c247d0f0-c4f1-4731-b866-7b91723a1a54" containerName="tokenizer"
Apr 16 20:17:35.816305 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.816274 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c247d0f0-c4f1-4731-b866-7b91723a1a54" containerName="tokenizer"
Apr 16 20:17:35.816305 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.816284 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c247d0f0-c4f1-4731-b866-7b91723a1a54" containerName="storage-initializer"
Apr 16 20:17:35.816305 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.816291 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c247d0f0-c4f1-4731-b866-7b91723a1a54" containerName="storage-initializer"
Apr 16 20:17:35.816455 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.816353 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c247d0f0-c4f1-4731-b866-7b91723a1a54" containerName="main"
Apr 16 20:17:35.816455 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.816363 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c247d0f0-c4f1-4731-b866-7b91723a1a54" containerName="tokenizer"
Apr 16 20:17:35.821417 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.821394 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp"
Apr 16 20:17:35.824537 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.824514 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-p8rsl\""
Apr 16 20:17:35.824954 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.824937 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 16 20:17:35.835035 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.834997 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp"]
Apr 16 20:17:35.887317 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.887265 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sgj2\" (UniqueName: \"kubernetes.io/projected/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-kube-api-access-2sgj2\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp"
Apr 16 20:17:35.887317 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.887321 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp"
Apr 16 20:17:35.887574 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.887438 2569 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.887574 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.887488 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.887574 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.887546 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.887675 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.887588 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.988726 ip-10-0-138-142 kubenswrapper[2569]: I0416 
20:17:35.988692 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sgj2\" (UniqueName: \"kubernetes.io/projected/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-kube-api-access-2sgj2\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.988974 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.988732 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.988974 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.988788 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.988974 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.988823 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.988974 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.988859 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.988974 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.988895 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.989303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.989203 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.989303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.989269 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.989303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.989269 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.989411 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.989319 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.991628 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.991602 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:35.996459 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:35.996433 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sgj2\" (UniqueName: \"kubernetes.io/projected/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-kube-api-access-2sgj2\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:36.130960 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:36.130877 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:36.256108 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:36.256073 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp"] Apr 16 20:17:36.256996 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:17:36.256967 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod131772a0_17e1_452b_a2e6_f3c19ae9a8f0.slice/crio-95598c565a7f4ed54cb0ff2134dbbb1b3693b1678cffef95784a72a636b72b74 WatchSource:0}: Error finding container 95598c565a7f4ed54cb0ff2134dbbb1b3693b1678cffef95784a72a636b72b74: Status 404 returned error can't find the container with id 95598c565a7f4ed54cb0ff2134dbbb1b3693b1678cffef95784a72a636b72b74 Apr 16 20:17:36.899825 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:36.899787 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" event={"ID":"131772a0-17e1-452b-a2e6-f3c19ae9a8f0","Type":"ContainerStarted","Data":"f22bfd9767bd095f3901750e9e1727c91792052adc73f220ca3574506fe1aadf"} Apr 16 20:17:36.899825 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:36.899825 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" event={"ID":"131772a0-17e1-452b-a2e6-f3c19ae9a8f0","Type":"ContainerStarted","Data":"95598c565a7f4ed54cb0ff2134dbbb1b3693b1678cffef95784a72a636b72b74"} Apr 16 20:17:37.905318 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:37.905284 2569 generic.go:358] "Generic (PLEG): container finished" podID="131772a0-17e1-452b-a2e6-f3c19ae9a8f0" containerID="f22bfd9767bd095f3901750e9e1727c91792052adc73f220ca3574506fe1aadf" exitCode=0 Apr 16 20:17:37.905738 ip-10-0-138-142 kubenswrapper[2569]: I0416 
20:17:37.905329 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" event={"ID":"131772a0-17e1-452b-a2e6-f3c19ae9a8f0","Type":"ContainerDied","Data":"f22bfd9767bd095f3901750e9e1727c91792052adc73f220ca3574506fe1aadf"} Apr 16 20:17:38.912262 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:38.912225 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" event={"ID":"131772a0-17e1-452b-a2e6-f3c19ae9a8f0","Type":"ContainerStarted","Data":"19a1c40fb9ff9281b7f8aa2c0c901c571ed403232ea22c4658608e07ef5ef2d6"} Apr 16 20:17:38.912262 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:38.912263 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" event={"ID":"131772a0-17e1-452b-a2e6-f3c19ae9a8f0","Type":"ContainerStarted","Data":"3ebf45669ab782556c7dcd5086330d2328ef6e777c7a07b5e973835e9795c186"} Apr 16 20:17:38.912685 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:38.912363 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:38.934519 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:38.934464 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" podStartSLOduration=3.934448348 podStartE2EDuration="3.934448348s" podCreationTimestamp="2026-04-16 20:17:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:17:38.93095954 +0000 UTC m=+1423.131669618" watchObservedRunningTime="2026-04-16 20:17:38.934448348 +0000 UTC m=+1423.135158426" Apr 16 20:17:46.131333 ip-10-0-138-142 
kubenswrapper[2569]: I0416 20:17:46.131238 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:46.131333 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:46.131299 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:46.134107 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:46.134074 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:17:46.945471 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:17:46.945439 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:18:07.951267 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:07.951233 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:18:50.047487 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:50.047444 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"] Apr 16 20:18:50.048082 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:50.047894 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65" podUID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" containerName="main" containerID="cri-o://30e1f7712325a776bc87ca8a2b31fcede259046613be9b14e231e8e0373d2e4a" gracePeriod=30 Apr 16 20:18:50.048082 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:50.047985 2569 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65" podUID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" containerName="tokenizer" containerID="cri-o://75c5f418ff217e3609b9f02597d3f5403fb87c30b6c446710f770cb6e3aae01e" gracePeriod=30 Apr 16 20:18:50.200413 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:50.200373 2569 generic.go:358] "Generic (PLEG): container finished" podID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" containerID="30e1f7712325a776bc87ca8a2b31fcede259046613be9b14e231e8e0373d2e4a" exitCode=0 Apr 16 20:18:50.200596 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:50.200439 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65" event={"ID":"b5372ab0-f169-422f-a13a-43e9ff90ccc0","Type":"ContainerDied","Data":"30e1f7712325a776bc87ca8a2b31fcede259046613be9b14e231e8e0373d2e4a"} Apr 16 20:18:50.789393 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:18:50.789360 2569 logging.go:55] [core] [Channel #767 SubChannel #768]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.57:9003", ServerName: "10.134.0.57:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.57:9003: connect: connection refused" Apr 16 20:18:51.197381 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.197360 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65" Apr 16 20:18:51.205994 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.205970 2569 generic.go:358] "Generic (PLEG): container finished" podID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" containerID="75c5f418ff217e3609b9f02597d3f5403fb87c30b6c446710f770cb6e3aae01e" exitCode=0 Apr 16 20:18:51.206141 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.206052 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65" event={"ID":"b5372ab0-f169-422f-a13a-43e9ff90ccc0","Type":"ContainerDied","Data":"75c5f418ff217e3609b9f02597d3f5403fb87c30b6c446710f770cb6e3aae01e"} Apr 16 20:18:51.206141 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.206069 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65" Apr 16 20:18:51.206141 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.206089 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65" event={"ID":"b5372ab0-f169-422f-a13a-43e9ff90ccc0","Type":"ContainerDied","Data":"0721a549ab71bba0db840a5a7026048f273fd9d42a16daba7a7ab74867af579a"} Apr 16 20:18:51.206141 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.206105 2569 scope.go:117] "RemoveContainer" containerID="75c5f418ff217e3609b9f02597d3f5403fb87c30b6c446710f770cb6e3aae01e" Apr 16 20:18:51.214802 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.214787 2569 scope.go:117] "RemoveContainer" containerID="30e1f7712325a776bc87ca8a2b31fcede259046613be9b14e231e8e0373d2e4a" Apr 16 20:18:51.224184 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.224162 2569 scope.go:117] "RemoveContainer" containerID="ee1ab6dc5da90fee1b80517304fb4bdca9111c7a187c587d0769b5cbe07cc804" Apr 16 20:18:51.235065 
ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.234982 2569 scope.go:117] "RemoveContainer" containerID="75c5f418ff217e3609b9f02597d3f5403fb87c30b6c446710f770cb6e3aae01e" Apr 16 20:18:51.235308 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:18:51.235288 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75c5f418ff217e3609b9f02597d3f5403fb87c30b6c446710f770cb6e3aae01e\": container with ID starting with 75c5f418ff217e3609b9f02597d3f5403fb87c30b6c446710f770cb6e3aae01e not found: ID does not exist" containerID="75c5f418ff217e3609b9f02597d3f5403fb87c30b6c446710f770cb6e3aae01e" Apr 16 20:18:51.235384 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.235315 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c5f418ff217e3609b9f02597d3f5403fb87c30b6c446710f770cb6e3aae01e"} err="failed to get container status \"75c5f418ff217e3609b9f02597d3f5403fb87c30b6c446710f770cb6e3aae01e\": rpc error: code = NotFound desc = could not find container \"75c5f418ff217e3609b9f02597d3f5403fb87c30b6c446710f770cb6e3aae01e\": container with ID starting with 75c5f418ff217e3609b9f02597d3f5403fb87c30b6c446710f770cb6e3aae01e not found: ID does not exist" Apr 16 20:18:51.235384 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.235332 2569 scope.go:117] "RemoveContainer" containerID="30e1f7712325a776bc87ca8a2b31fcede259046613be9b14e231e8e0373d2e4a" Apr 16 20:18:51.235585 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:18:51.235553 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30e1f7712325a776bc87ca8a2b31fcede259046613be9b14e231e8e0373d2e4a\": container with ID starting with 30e1f7712325a776bc87ca8a2b31fcede259046613be9b14e231e8e0373d2e4a not found: ID does not exist" containerID="30e1f7712325a776bc87ca8a2b31fcede259046613be9b14e231e8e0373d2e4a" Apr 16 20:18:51.235654 ip-10-0-138-142 
kubenswrapper[2569]: I0416 20:18:51.235581 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30e1f7712325a776bc87ca8a2b31fcede259046613be9b14e231e8e0373d2e4a"} err="failed to get container status \"30e1f7712325a776bc87ca8a2b31fcede259046613be9b14e231e8e0373d2e4a\": rpc error: code = NotFound desc = could not find container \"30e1f7712325a776bc87ca8a2b31fcede259046613be9b14e231e8e0373d2e4a\": container with ID starting with 30e1f7712325a776bc87ca8a2b31fcede259046613be9b14e231e8e0373d2e4a not found: ID does not exist" Apr 16 20:18:51.235654 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.235604 2569 scope.go:117] "RemoveContainer" containerID="ee1ab6dc5da90fee1b80517304fb4bdca9111c7a187c587d0769b5cbe07cc804" Apr 16 20:18:51.235840 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:18:51.235821 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee1ab6dc5da90fee1b80517304fb4bdca9111c7a187c587d0769b5cbe07cc804\": container with ID starting with ee1ab6dc5da90fee1b80517304fb4bdca9111c7a187c587d0769b5cbe07cc804 not found: ID does not exist" containerID="ee1ab6dc5da90fee1b80517304fb4bdca9111c7a187c587d0769b5cbe07cc804" Apr 16 20:18:51.235907 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.235842 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee1ab6dc5da90fee1b80517304fb4bdca9111c7a187c587d0769b5cbe07cc804"} err="failed to get container status \"ee1ab6dc5da90fee1b80517304fb4bdca9111c7a187c587d0769b5cbe07cc804\": rpc error: code = NotFound desc = could not find container \"ee1ab6dc5da90fee1b80517304fb4bdca9111c7a187c587d0769b5cbe07cc804\": container with ID starting with ee1ab6dc5da90fee1b80517304fb4bdca9111c7a187c587d0769b5cbe07cc804 not found: ID does not exist" Apr 16 20:18:51.333117 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.333063 2569 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wv98b\" (UniqueName: \"kubernetes.io/projected/b5372ab0-f169-422f-a13a-43e9ff90ccc0-kube-api-access-wv98b\") pod \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " Apr 16 20:18:51.333117 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.333098 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-cache\") pod \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " Apr 16 20:18:51.333258 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.333241 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-kserve-provision-location\") pod \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " Apr 16 20:18:51.333294 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.333279 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-uds\") pod \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " Apr 16 20:18:51.333332 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.333316 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-tmp\") pod \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " Apr 16 20:18:51.333366 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.333340 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b5372ab0-f169-422f-a13a-43e9ff90ccc0" (UID: "b5372ab0-f169-422f-a13a-43e9ff90ccc0"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:51.333366 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.333358 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tls-certs\") pod \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\" (UID: \"b5372ab0-f169-422f-a13a-43e9ff90ccc0\") " Apr 16 20:18:51.333548 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.333531 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b5372ab0-f169-422f-a13a-43e9ff90ccc0" (UID: "b5372ab0-f169-422f-a13a-43e9ff90ccc0"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:51.333622 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.333596 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:18:51.333622 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.333613 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-uds\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:18:51.333728 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.333624 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b5372ab0-f169-422f-a13a-43e9ff90ccc0" (UID: "b5372ab0-f169-422f-a13a-43e9ff90ccc0"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:51.333936 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.333918 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5372ab0-f169-422f-a13a-43e9ff90ccc0" (UID: "b5372ab0-f169-422f-a13a-43e9ff90ccc0"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:51.335138 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.335107 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5372ab0-f169-422f-a13a-43e9ff90ccc0-kube-api-access-wv98b" (OuterVolumeSpecName: "kube-api-access-wv98b") pod "b5372ab0-f169-422f-a13a-43e9ff90ccc0" (UID: "b5372ab0-f169-422f-a13a-43e9ff90ccc0"). InnerVolumeSpecName "kube-api-access-wv98b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:18:51.335372 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.335349 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b5372ab0-f169-422f-a13a-43e9ff90ccc0" (UID: "b5372ab0-f169-422f-a13a-43e9ff90ccc0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:18:51.434958 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.434924 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wv98b\" (UniqueName: \"kubernetes.io/projected/b5372ab0-f169-422f-a13a-43e9ff90ccc0-kube-api-access-wv98b\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:18:51.434958 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.434956 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:18:51.434958 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.434967 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tokenizer-tmp\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:18:51.435154 ip-10-0-138-142 
kubenswrapper[2569]: I0416 20:18:51.434979 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5372ab0-f169-422f-a13a-43e9ff90ccc0-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:18:51.532774 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.532739 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"] Apr 16 20:18:51.535136 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.535108 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65"] Apr 16 20:18:51.789530 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:51.789484 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c7999b6fd-8ph65" podUID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.57:9003\" within 1s: context deadline exceeded" Apr 16 20:18:52.372378 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:52.372344 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" path="/var/lib/kubelet/pods/b5372ab0-f169-422f-a13a-43e9ff90ccc0/volumes" Apr 16 20:18:56.390763 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:56.390731 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 20:18:56.396850 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:18:56.396824 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 20:19:27.311700 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:27.311660 2569 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-64f548bc46-qnnrb"] Apr 16 20:19:27.312289 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:27.311923 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" podUID="2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4" containerName="manager" containerID="cri-o://35ce0f7fed4599c45a69f32701f7247033ced41e23016d483740067fae8664a7" gracePeriod=30 Apr 16 20:19:27.555955 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:27.555930 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" Apr 16 20:19:27.631847 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:27.631761 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-cert\") pod \"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4\" (UID: \"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4\") " Apr 16 20:19:27.631847 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:27.631810 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7h6l\" (UniqueName: \"kubernetes.io/projected/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-kube-api-access-v7h6l\") pod \"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4\" (UID: \"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4\") " Apr 16 20:19:27.633908 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:27.633875 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-cert" (OuterVolumeSpecName: "cert") pod "2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4" (UID: "2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:19:27.634036 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:27.633941 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-kube-api-access-v7h6l" (OuterVolumeSpecName: "kube-api-access-v7h6l") pod "2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4" (UID: "2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4"). InnerVolumeSpecName "kube-api-access-v7h6l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:19:27.732938 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:27.732905 2569 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-cert\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:19:27.732938 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:27.732932 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v7h6l\" (UniqueName: \"kubernetes.io/projected/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4-kube-api-access-v7h6l\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:19:28.350894 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:28.350862 2569 generic.go:358] "Generic (PLEG): container finished" podID="2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4" containerID="35ce0f7fed4599c45a69f32701f7247033ced41e23016d483740067fae8664a7" exitCode=0 Apr 16 20:19:28.351338 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:28.350921 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" Apr 16 20:19:28.351338 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:28.350952 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" event={"ID":"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4","Type":"ContainerDied","Data":"35ce0f7fed4599c45a69f32701f7247033ced41e23016d483740067fae8664a7"} Apr 16 20:19:28.351338 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:28.350993 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-64f548bc46-qnnrb" event={"ID":"2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4","Type":"ContainerDied","Data":"77fc53b80c13c04b127c40193f7cd1f912bbe024f45b3f0036b2285581930c30"} Apr 16 20:19:28.351338 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:28.351022 2569 scope.go:117] "RemoveContainer" containerID="35ce0f7fed4599c45a69f32701f7247033ced41e23016d483740067fae8664a7" Apr 16 20:19:28.360471 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:28.360454 2569 scope.go:117] "RemoveContainer" containerID="35ce0f7fed4599c45a69f32701f7247033ced41e23016d483740067fae8664a7" Apr 16 20:19:28.360716 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:19:28.360696 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35ce0f7fed4599c45a69f32701f7247033ced41e23016d483740067fae8664a7\": container with ID starting with 35ce0f7fed4599c45a69f32701f7247033ced41e23016d483740067fae8664a7 not found: ID does not exist" containerID="35ce0f7fed4599c45a69f32701f7247033ced41e23016d483740067fae8664a7" Apr 16 20:19:28.360767 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:28.360723 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ce0f7fed4599c45a69f32701f7247033ced41e23016d483740067fae8664a7"} err="failed to get container status 
\"35ce0f7fed4599c45a69f32701f7247033ced41e23016d483740067fae8664a7\": rpc error: code = NotFound desc = could not find container \"35ce0f7fed4599c45a69f32701f7247033ced41e23016d483740067fae8664a7\": container with ID starting with 35ce0f7fed4599c45a69f32701f7247033ced41e23016d483740067fae8664a7 not found: ID does not exist" Apr 16 20:19:28.372491 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:28.372461 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-64f548bc46-qnnrb"] Apr 16 20:19:28.375468 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:28.375445 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-64f548bc46-qnnrb"] Apr 16 20:19:30.371754 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:19:30.371721 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4" path="/var/lib/kubelet/pods/2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4/volumes" Apr 16 20:20:17.938246 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.938208 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4"] Apr 16 20:20:17.938695 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.938601 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4" containerName="manager" Apr 16 20:20:17.938695 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.938613 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4" containerName="manager" Apr 16 20:20:17.938695 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.938637 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" containerName="storage-initializer" Apr 16 20:20:17.938695 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.938643 2569 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" containerName="storage-initializer" Apr 16 20:20:17.938695 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.938651 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" containerName="tokenizer" Apr 16 20:20:17.938695 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.938656 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" containerName="tokenizer" Apr 16 20:20:17.938695 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.938664 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" containerName="main" Apr 16 20:20:17.938695 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.938669 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" containerName="main" Apr 16 20:20:17.938947 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.938731 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2cb60e7d-f6b6-49d5-b394-d8c5d9420bf4" containerName="manager" Apr 16 20:20:17.938947 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.938741 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" containerName="tokenizer" Apr 16 20:20:17.938947 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.938749 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5372ab0-f169-422f-a13a-43e9ff90ccc0" containerName="main" Apr 16 20:20:17.942329 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.942303 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:17.948326 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.948301 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 20:20:17.948453 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.948383 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-j6bmp\"" Apr 16 20:20:17.957991 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:17.957967 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4"] Apr 16 20:20:18.085900 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.085867 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.085900 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.085907 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.086108 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.085936 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.086108 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.086039 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97b1b693-df29-4d29-9033-6367f39f8952-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.086108 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.086098 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84px2\" (UniqueName: \"kubernetes.io/projected/97b1b693-df29-4d29-9033-6367f39f8952-kube-api-access-84px2\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.086214 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.086132 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.187215 ip-10-0-138-142 kubenswrapper[2569]: I0416 
20:20:18.187179 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.187402 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.187234 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.187402 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.187284 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.187402 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.187327 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97b1b693-df29-4d29-9033-6367f39f8952-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.187402 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.187380 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-84px2\" (UniqueName: \"kubernetes.io/projected/97b1b693-df29-4d29-9033-6367f39f8952-kube-api-access-84px2\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.187608 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.187427 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.187663 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.187605 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.187716 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.187658 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.187716 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.187681 2569 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.187811 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.187718 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.189949 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.189890 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97b1b693-df29-4d29-9033-6367f39f8952-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.194728 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.194701 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84px2\" (UniqueName: \"kubernetes.io/projected/97b1b693-df29-4d29-9033-6367f39f8952-kube-api-access-84px2\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.251961 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.251937 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:18.373226 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.373181 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4"] Apr 16 20:20:18.542846 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.542766 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" event={"ID":"97b1b693-df29-4d29-9033-6367f39f8952","Type":"ContainerStarted","Data":"aedf70a3e01b52944c423e31f0ba29c6ef0fc25590943862cc73dfc0c444e942"} Apr 16 20:20:18.542846 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:18.542801 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" event={"ID":"97b1b693-df29-4d29-9033-6367f39f8952","Type":"ContainerStarted","Data":"49d0c8da8ad8d2fa73d5a2fb9b08ee2a336a0eef42fbc1f02d7da9dfee89e764"} Apr 16 20:20:19.554876 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:19.554785 2569 generic.go:358] "Generic (PLEG): container finished" podID="97b1b693-df29-4d29-9033-6367f39f8952" containerID="aedf70a3e01b52944c423e31f0ba29c6ef0fc25590943862cc73dfc0c444e942" exitCode=0 Apr 16 20:20:19.554876 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:19.554831 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" event={"ID":"97b1b693-df29-4d29-9033-6367f39f8952","Type":"ContainerDied","Data":"aedf70a3e01b52944c423e31f0ba29c6ef0fc25590943862cc73dfc0c444e942"} Apr 16 20:20:20.560827 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:20.560792 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" 
event={"ID":"97b1b693-df29-4d29-9033-6367f39f8952","Type":"ContainerStarted","Data":"3a560c09193aa4e8c7ba13d312d37e51284639eb7dc28ba4a8814fdc1d121b7d"} Apr 16 20:20:20.560827 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:20.560829 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" event={"ID":"97b1b693-df29-4d29-9033-6367f39f8952","Type":"ContainerStarted","Data":"0b3e48b3a5726bfca260dc846b96a9623c1755b23236cf4d578a481f1152583e"} Apr 16 20:20:20.561284 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:20.560941 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:20.585478 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:20.585421 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" podStartSLOduration=3.585407248 podStartE2EDuration="3.585407248s" podCreationTimestamp="2026-04-16 20:20:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:20:20.582608323 +0000 UTC m=+1584.783318427" watchObservedRunningTime="2026-04-16 20:20:20.585407248 +0000 UTC m=+1584.786117382" Apr 16 20:20:28.252573 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:28.252536 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:28.253095 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:28.252586 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:28.255439 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:28.255415 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:28.595949 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:28.595860 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:33.369849 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:33.369816 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp"] Apr 16 20:20:33.370253 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:33.370151 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" podUID="131772a0-17e1-452b-a2e6-f3c19ae9a8f0" containerName="main" containerID="cri-o://3ebf45669ab782556c7dcd5086330d2328ef6e777c7a07b5e973835e9795c186" gracePeriod=30 Apr 16 20:20:33.370253 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:33.370218 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" podUID="131772a0-17e1-452b-a2e6-f3c19ae9a8f0" containerName="tokenizer" containerID="cri-o://19a1c40fb9ff9281b7f8aa2c0c901c571ed403232ea22c4658608e07ef5ef2d6" gracePeriod=30 Apr 16 20:20:33.615873 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:33.615837 2569 generic.go:358] "Generic (PLEG): container finished" podID="131772a0-17e1-452b-a2e6-f3c19ae9a8f0" containerID="3ebf45669ab782556c7dcd5086330d2328ef6e777c7a07b5e973835e9795c186" exitCode=0 Apr 16 20:20:33.616096 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:33.615904 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" 
event={"ID":"131772a0-17e1-452b-a2e6-f3c19ae9a8f0","Type":"ContainerDied","Data":"3ebf45669ab782556c7dcd5086330d2328ef6e777c7a07b5e973835e9795c186"} Apr 16 20:20:34.623169 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.623127 2569 generic.go:358] "Generic (PLEG): container finished" podID="131772a0-17e1-452b-a2e6-f3c19ae9a8f0" containerID="19a1c40fb9ff9281b7f8aa2c0c901c571ed403232ea22c4658608e07ef5ef2d6" exitCode=0 Apr 16 20:20:34.623169 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.623153 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" event={"ID":"131772a0-17e1-452b-a2e6-f3c19ae9a8f0","Type":"ContainerDied","Data":"19a1c40fb9ff9281b7f8aa2c0c901c571ed403232ea22c4658608e07ef5ef2d6"} Apr 16 20:20:34.642923 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.642892 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:20:34.839857 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.839755 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-kserve-provision-location\") pod \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " Apr 16 20:20:34.839857 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.839815 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tls-certs\") pod \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " Apr 16 20:20:34.839857 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.839838 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-cache\") pod \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " Apr 16 20:20:34.839857 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.839857 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-uds\") pod \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " Apr 16 20:20:34.840247 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.839901 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-tmp\") pod \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " Apr 16 20:20:34.840247 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.840067 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sgj2\" (UniqueName: \"kubernetes.io/projected/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-kube-api-access-2sgj2\") pod \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\" (UID: \"131772a0-17e1-452b-a2e6-f3c19ae9a8f0\") " Apr 16 20:20:34.840247 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.840200 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "131772a0-17e1-452b-a2e6-f3c19ae9a8f0" (UID: "131772a0-17e1-452b-a2e6-f3c19ae9a8f0"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:20:34.840247 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.840215 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "131772a0-17e1-452b-a2e6-f3c19ae9a8f0" (UID: "131772a0-17e1-452b-a2e6-f3c19ae9a8f0"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:20:34.840461 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.840446 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:20:34.840516 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.840470 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-uds\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:20:34.840625 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.840607 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "131772a0-17e1-452b-a2e6-f3c19ae9a8f0" (UID: "131772a0-17e1-452b-a2e6-f3c19ae9a8f0"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:20:34.840687 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.840624 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "131772a0-17e1-452b-a2e6-f3c19ae9a8f0" (UID: "131772a0-17e1-452b-a2e6-f3c19ae9a8f0"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:20:34.842179 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.842151 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "131772a0-17e1-452b-a2e6-f3c19ae9a8f0" (UID: "131772a0-17e1-452b-a2e6-f3c19ae9a8f0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:20:34.842322 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.842294 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-kube-api-access-2sgj2" (OuterVolumeSpecName: "kube-api-access-2sgj2") pod "131772a0-17e1-452b-a2e6-f3c19ae9a8f0" (UID: "131772a0-17e1-452b-a2e6-f3c19ae9a8f0"). InnerVolumeSpecName "kube-api-access-2sgj2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:20:34.941838 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.941799 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tokenizer-tmp\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:20:34.941838 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.941831 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2sgj2\" (UniqueName: \"kubernetes.io/projected/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-kube-api-access-2sgj2\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:20:34.941838 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.941843 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" 
Apr 16 20:20:34.942189 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:34.941853 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/131772a0-17e1-452b-a2e6-f3c19ae9a8f0-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:20:35.628892 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:35.628862 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" Apr 16 20:20:35.628892 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:35.628875 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp" event={"ID":"131772a0-17e1-452b-a2e6-f3c19ae9a8f0","Type":"ContainerDied","Data":"95598c565a7f4ed54cb0ff2134dbbb1b3693b1678cffef95784a72a636b72b74"} Apr 16 20:20:35.629428 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:35.628925 2569 scope.go:117] "RemoveContainer" containerID="19a1c40fb9ff9281b7f8aa2c0c901c571ed403232ea22c4658608e07ef5ef2d6" Apr 16 20:20:35.638576 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:35.638561 2569 scope.go:117] "RemoveContainer" containerID="3ebf45669ab782556c7dcd5086330d2328ef6e777c7a07b5e973835e9795c186" Apr 16 20:20:35.648669 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:35.648650 2569 scope.go:117] "RemoveContainer" containerID="f22bfd9767bd095f3901750e9e1727c91792052adc73f220ca3574506fe1aadf" Apr 16 20:20:35.652511 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:35.652486 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp"] Apr 16 20:20:35.657667 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:35.657643 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7bbd4snpgp"] Apr 16 20:20:36.373534 
ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:36.373500 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="131772a0-17e1-452b-a2e6-f3c19ae9a8f0" path="/var/lib/kubelet/pods/131772a0-17e1-452b-a2e6-f3c19ae9a8f0/volumes" Apr 16 20:20:49.025133 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.025040 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg"] Apr 16 20:20:49.025621 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.025600 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="131772a0-17e1-452b-a2e6-f3c19ae9a8f0" containerName="main" Apr 16 20:20:49.025697 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.025624 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="131772a0-17e1-452b-a2e6-f3c19ae9a8f0" containerName="main" Apr 16 20:20:49.025697 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.025646 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="131772a0-17e1-452b-a2e6-f3c19ae9a8f0" containerName="storage-initializer" Apr 16 20:20:49.025697 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.025655 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="131772a0-17e1-452b-a2e6-f3c19ae9a8f0" containerName="storage-initializer" Apr 16 20:20:49.025697 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.025665 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="131772a0-17e1-452b-a2e6-f3c19ae9a8f0" containerName="tokenizer" Apr 16 20:20:49.025697 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.025674 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="131772a0-17e1-452b-a2e6-f3c19ae9a8f0" containerName="tokenizer" Apr 16 20:20:49.025950 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.025791 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="131772a0-17e1-452b-a2e6-f3c19ae9a8f0" 
containerName="tokenizer" Apr 16 20:20:49.025950 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.025806 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="131772a0-17e1-452b-a2e6-f3c19ae9a8f0" containerName="main" Apr 16 20:20:49.031047 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.031023 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.033832 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.033804 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 20:20:49.033947 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.033840 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-848c2\"" Apr 16 20:20:49.038799 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.038769 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg"] Apr 16 20:20:49.053824 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.053799 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.053937 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.053848 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-uds\") pod 
\"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.053982 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.053929 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.053982 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.053967 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.054080 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.053996 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67lmn\" (UniqueName: \"kubernetes.io/projected/15787bc1-6b93-4752-84df-0e2ac94166f2-kube-api-access-67lmn\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.054080 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.054065 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/15787bc1-6b93-4752-84df-0e2ac94166f2-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.155230 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.155200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.155380 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.155249 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.155380 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.155289 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.155380 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.155313 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.155380 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.155338 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67lmn\" (UniqueName: \"kubernetes.io/projected/15787bc1-6b93-4752-84df-0e2ac94166f2-kube-api-access-67lmn\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.155380 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.155369 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15787bc1-6b93-4752-84df-0e2ac94166f2-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.155705 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.155667 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.155705 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.155683 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.155874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.155741 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.155874 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.155789 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.157844 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.157823 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15787bc1-6b93-4752-84df-0e2ac94166f2-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.163960 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.163906 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67lmn\" (UniqueName: 
\"kubernetes.io/projected/15787bc1-6b93-4752-84df-0e2ac94166f2-kube-api-access-67lmn\") pod \"router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.341675 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.341561 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:49.467057 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.467004 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg"] Apr 16 20:20:49.467493 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:20:49.467459 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15787bc1_6b93_4752_84df_0e2ac94166f2.slice/crio-4c003302d186d194a423160e36c8cd5ee0d3a237e62fc8de9f99af35c334c602 WatchSource:0}: Error finding container 4c003302d186d194a423160e36c8cd5ee0d3a237e62fc8de9f99af35c334c602: Status 404 returned error can't find the container with id 4c003302d186d194a423160e36c8cd5ee0d3a237e62fc8de9f99af35c334c602 Apr 16 20:20:49.600559 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.600487 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:20:49.687025 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:49.686984 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" event={"ID":"15787bc1-6b93-4752-84df-0e2ac94166f2","Type":"ContainerStarted","Data":"c35d002f9a50ca62e20caa1366d0c90ff45363b032c85e06ec0307a58cce6c79"} Apr 16 20:20:49.687194 ip-10-0-138-142 
kubenswrapper[2569]: I0416 20:20:49.687044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" event={"ID":"15787bc1-6b93-4752-84df-0e2ac94166f2","Type":"ContainerStarted","Data":"4c003302d186d194a423160e36c8cd5ee0d3a237e62fc8de9f99af35c334c602"} Apr 16 20:20:50.692304 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:50.692269 2569 generic.go:358] "Generic (PLEG): container finished" podID="15787bc1-6b93-4752-84df-0e2ac94166f2" containerID="c35d002f9a50ca62e20caa1366d0c90ff45363b032c85e06ec0307a58cce6c79" exitCode=0 Apr 16 20:20:50.692702 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:50.692319 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" event={"ID":"15787bc1-6b93-4752-84df-0e2ac94166f2","Type":"ContainerDied","Data":"c35d002f9a50ca62e20caa1366d0c90ff45363b032c85e06ec0307a58cce6c79"} Apr 16 20:20:51.699303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:51.699263 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" event={"ID":"15787bc1-6b93-4752-84df-0e2ac94166f2","Type":"ContainerStarted","Data":"552a5ba4af0b36232232158005b80a6b274e2639670f17666bff9b4a19387702"} Apr 16 20:20:51.699303 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:51.699307 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" event={"ID":"15787bc1-6b93-4752-84df-0e2ac94166f2","Type":"ContainerStarted","Data":"60b11329a7086c16e353b7f3ad6309b4464787e18995201a9df426ba325d44f4"} Apr 16 20:20:51.699733 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:51.699385 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 
16 20:20:51.719524 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:51.719461 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" podStartSLOduration=2.71944315 podStartE2EDuration="2.71944315s" podCreationTimestamp="2026-04-16 20:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:20:51.717264877 +0000 UTC m=+1615.917974956" watchObservedRunningTime="2026-04-16 20:20:51.71944315 +0000 UTC m=+1615.920153228" Apr 16 20:20:59.342189 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:59.342146 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:59.342670 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:59.342201 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:59.345033 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:59.344988 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:20:59.739495 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:20:59.739462 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:21:20.744663 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:21:20.744635 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:23:53.854908 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:53.854865 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4"] Apr 16 20:23:53.855432 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:53.855302 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" podUID="97b1b693-df29-4d29-9033-6367f39f8952" containerName="main" containerID="cri-o://0b3e48b3a5726bfca260dc846b96a9623c1755b23236cf4d578a481f1152583e" gracePeriod=30 Apr 16 20:23:53.855505 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:53.855469 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" podUID="97b1b693-df29-4d29-9033-6367f39f8952" containerName="tokenizer" containerID="cri-o://3a560c09193aa4e8c7ba13d312d37e51284639eb7dc28ba4a8814fdc1d121b7d" gracePeriod=30 Apr 16 20:23:54.426803 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:54.426766 2569 generic.go:358] "Generic (PLEG): container finished" podID="97b1b693-df29-4d29-9033-6367f39f8952" containerID="0b3e48b3a5726bfca260dc846b96a9623c1755b23236cf4d578a481f1152583e" exitCode=0 Apr 16 20:23:54.426982 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:54.426836 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" event={"ID":"97b1b693-df29-4d29-9033-6367f39f8952","Type":"ContainerDied","Data":"0b3e48b3a5726bfca260dc846b96a9623c1755b23236cf4d578a481f1152583e"} Apr 16 20:23:55.094209 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.094186 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:23:55.152387 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.152320 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-cache\") pod \"97b1b693-df29-4d29-9033-6367f39f8952\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " Apr 16 20:23:55.152387 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.152352 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-kserve-provision-location\") pod \"97b1b693-df29-4d29-9033-6367f39f8952\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " Apr 16 20:23:55.152387 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.152387 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97b1b693-df29-4d29-9033-6367f39f8952-tls-certs\") pod \"97b1b693-df29-4d29-9033-6367f39f8952\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " Apr 16 20:23:55.152655 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.152421 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-uds\") pod \"97b1b693-df29-4d29-9033-6367f39f8952\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " Apr 16 20:23:55.152655 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.152446 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-tmp\") pod \"97b1b693-df29-4d29-9033-6367f39f8952\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " Apr 16 
20:23:55.152655 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.152477 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84px2\" (UniqueName: \"kubernetes.io/projected/97b1b693-df29-4d29-9033-6367f39f8952-kube-api-access-84px2\") pod \"97b1b693-df29-4d29-9033-6367f39f8952\" (UID: \"97b1b693-df29-4d29-9033-6367f39f8952\") " Apr 16 20:23:55.152655 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.152623 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "97b1b693-df29-4d29-9033-6367f39f8952" (UID: "97b1b693-df29-4d29-9033-6367f39f8952"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:23:55.152861 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.152739 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "97b1b693-df29-4d29-9033-6367f39f8952" (UID: "97b1b693-df29-4d29-9033-6367f39f8952"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:23:55.152861 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.152772 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:23:55.152861 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.152845 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "97b1b693-df29-4d29-9033-6367f39f8952" (UID: "97b1b693-df29-4d29-9033-6367f39f8952"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:23:55.153214 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.153189 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "97b1b693-df29-4d29-9033-6367f39f8952" (UID: "97b1b693-df29-4d29-9033-6367f39f8952"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:23:55.154432 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.154416 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b1b693-df29-4d29-9033-6367f39f8952-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "97b1b693-df29-4d29-9033-6367f39f8952" (UID: "97b1b693-df29-4d29-9033-6367f39f8952"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:23:55.154609 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.154593 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b1b693-df29-4d29-9033-6367f39f8952-kube-api-access-84px2" (OuterVolumeSpecName: "kube-api-access-84px2") pod "97b1b693-df29-4d29-9033-6367f39f8952" (UID: "97b1b693-df29-4d29-9033-6367f39f8952"). InnerVolumeSpecName "kube-api-access-84px2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:23:55.254196 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.254171 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-uds\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:23:55.254196 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.254197 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-tokenizer-tmp\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:23:55.254355 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.254206 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-84px2\" (UniqueName: \"kubernetes.io/projected/97b1b693-df29-4d29-9033-6367f39f8952-kube-api-access-84px2\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:23:55.254355 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.254216 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97b1b693-df29-4d29-9033-6367f39f8952-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:23:55.254355 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.254226 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97b1b693-df29-4d29-9033-6367f39f8952-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:23:55.432454 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.432419 2569 generic.go:358] "Generic (PLEG): container finished" podID="97b1b693-df29-4d29-9033-6367f39f8952" containerID="3a560c09193aa4e8c7ba13d312d37e51284639eb7dc28ba4a8814fdc1d121b7d" exitCode=0 Apr 16 20:23:55.432619 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.432478 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" event={"ID":"97b1b693-df29-4d29-9033-6367f39f8952","Type":"ContainerDied","Data":"3a560c09193aa4e8c7ba13d312d37e51284639eb7dc28ba4a8814fdc1d121b7d"} Apr 16 20:23:55.432619 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.432494 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" Apr 16 20:23:55.432619 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.432502 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4" event={"ID":"97b1b693-df29-4d29-9033-6367f39f8952","Type":"ContainerDied","Data":"49d0c8da8ad8d2fa73d5a2fb9b08ee2a336a0eef42fbc1f02d7da9dfee89e764"} Apr 16 20:23:55.432619 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.432517 2569 scope.go:117] "RemoveContainer" containerID="3a560c09193aa4e8c7ba13d312d37e51284639eb7dc28ba4a8814fdc1d121b7d" Apr 16 20:23:55.443267 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.443245 2569 scope.go:117] "RemoveContainer" containerID="0b3e48b3a5726bfca260dc846b96a9623c1755b23236cf4d578a481f1152583e" Apr 16 20:23:55.451453 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.451435 2569 scope.go:117] "RemoveContainer" containerID="aedf70a3e01b52944c423e31f0ba29c6ef0fc25590943862cc73dfc0c444e942" Apr 16 20:23:55.457461 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.457438 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4"] Apr 16 20:23:55.460751 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.460732 2569 scope.go:117] "RemoveContainer" containerID="3a560c09193aa4e8c7ba13d312d37e51284639eb7dc28ba4a8814fdc1d121b7d" Apr 16 20:23:55.460807 ip-10-0-138-142 kubenswrapper[2569]: 
I0416 20:23:55.460773 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebb2p4"] Apr 16 20:23:55.460991 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:23:55.460970 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a560c09193aa4e8c7ba13d312d37e51284639eb7dc28ba4a8814fdc1d121b7d\": container with ID starting with 3a560c09193aa4e8c7ba13d312d37e51284639eb7dc28ba4a8814fdc1d121b7d not found: ID does not exist" containerID="3a560c09193aa4e8c7ba13d312d37e51284639eb7dc28ba4a8814fdc1d121b7d" Apr 16 20:23:55.461055 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.460999 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a560c09193aa4e8c7ba13d312d37e51284639eb7dc28ba4a8814fdc1d121b7d"} err="failed to get container status \"3a560c09193aa4e8c7ba13d312d37e51284639eb7dc28ba4a8814fdc1d121b7d\": rpc error: code = NotFound desc = could not find container \"3a560c09193aa4e8c7ba13d312d37e51284639eb7dc28ba4a8814fdc1d121b7d\": container with ID starting with 3a560c09193aa4e8c7ba13d312d37e51284639eb7dc28ba4a8814fdc1d121b7d not found: ID does not exist" Apr 16 20:23:55.461055 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.461040 2569 scope.go:117] "RemoveContainer" containerID="0b3e48b3a5726bfca260dc846b96a9623c1755b23236cf4d578a481f1152583e" Apr 16 20:23:55.461253 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:23:55.461233 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b3e48b3a5726bfca260dc846b96a9623c1755b23236cf4d578a481f1152583e\": container with ID starting with 0b3e48b3a5726bfca260dc846b96a9623c1755b23236cf4d578a481f1152583e not found: ID does not exist" containerID="0b3e48b3a5726bfca260dc846b96a9623c1755b23236cf4d578a481f1152583e" Apr 16 20:23:55.461295 ip-10-0-138-142 
kubenswrapper[2569]: I0416 20:23:55.461259 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b3e48b3a5726bfca260dc846b96a9623c1755b23236cf4d578a481f1152583e"} err="failed to get container status \"0b3e48b3a5726bfca260dc846b96a9623c1755b23236cf4d578a481f1152583e\": rpc error: code = NotFound desc = could not find container \"0b3e48b3a5726bfca260dc846b96a9623c1755b23236cf4d578a481f1152583e\": container with ID starting with 0b3e48b3a5726bfca260dc846b96a9623c1755b23236cf4d578a481f1152583e not found: ID does not exist" Apr 16 20:23:55.461295 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.461274 2569 scope.go:117] "RemoveContainer" containerID="aedf70a3e01b52944c423e31f0ba29c6ef0fc25590943862cc73dfc0c444e942" Apr 16 20:23:55.461510 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:23:55.461487 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aedf70a3e01b52944c423e31f0ba29c6ef0fc25590943862cc73dfc0c444e942\": container with ID starting with aedf70a3e01b52944c423e31f0ba29c6ef0fc25590943862cc73dfc0c444e942 not found: ID does not exist" containerID="aedf70a3e01b52944c423e31f0ba29c6ef0fc25590943862cc73dfc0c444e942" Apr 16 20:23:55.461589 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:55.461514 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aedf70a3e01b52944c423e31f0ba29c6ef0fc25590943862cc73dfc0c444e942"} err="failed to get container status \"aedf70a3e01b52944c423e31f0ba29c6ef0fc25590943862cc73dfc0c444e942\": rpc error: code = NotFound desc = could not find container \"aedf70a3e01b52944c423e31f0ba29c6ef0fc25590943862cc73dfc0c444e942\": container with ID starting with aedf70a3e01b52944c423e31f0ba29c6ef0fc25590943862cc73dfc0c444e942 not found: ID does not exist" Apr 16 20:23:56.373177 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:56.373146 2569 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="97b1b693-df29-4d29-9033-6367f39f8952" path="/var/lib/kubelet/pods/97b1b693-df29-4d29-9033-6367f39f8952/volumes" Apr 16 20:23:56.420183 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:56.420154 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 20:23:56.427749 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:23:56.427730 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 20:24:06.653662 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.653627 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7"] Apr 16 20:24:06.654177 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.654001 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97b1b693-df29-4d29-9033-6367f39f8952" containerName="tokenizer" Apr 16 20:24:06.654177 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.654026 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b1b693-df29-4d29-9033-6367f39f8952" containerName="tokenizer" Apr 16 20:24:06.654177 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.654054 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97b1b693-df29-4d29-9033-6367f39f8952" containerName="storage-initializer" Apr 16 20:24:06.654177 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.654060 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b1b693-df29-4d29-9033-6367f39f8952" containerName="storage-initializer" Apr 16 20:24:06.654177 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.654066 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97b1b693-df29-4d29-9033-6367f39f8952" containerName="main" Apr 16 20:24:06.654177 
ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.654071 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b1b693-df29-4d29-9033-6367f39f8952" containerName="main" Apr 16 20:24:06.654177 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.654133 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="97b1b693-df29-4d29-9033-6367f39f8952" containerName="main" Apr 16 20:24:06.654177 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.654142 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="97b1b693-df29-4d29-9033-6367f39f8952" containerName="tokenizer" Apr 16 20:24:06.658797 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.658782 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.661119 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.661097 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 16 20:24:06.668829 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.668804 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7"] Apr 16 20:24:06.759275 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.759233 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-model-cache\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.759275 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.759277 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pg25\" (UniqueName: 
\"kubernetes.io/projected/27739d4c-9802-413b-9ccc-4bf41af3df60-kube-api-access-8pg25\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.759525 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.759362 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-dshm\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.759525 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.759426 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/27739d4c-9802-413b-9ccc-4bf41af3df60-tls-certs\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.759525 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.759463 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-home\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.759525 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.759493 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-kserve-provision-location\") pod 
\"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.860324 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.860291 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-dshm\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.860512 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.860345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/27739d4c-9802-413b-9ccc-4bf41af3df60-tls-certs\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.860512 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.860382 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-home\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.860512 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.860402 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.860512 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.860443 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-model-cache\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.860512 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.860470 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pg25\" (UniqueName: \"kubernetes.io/projected/27739d4c-9802-413b-9ccc-4bf41af3df60-kube-api-access-8pg25\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.860814 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.860787 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-home\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.860880 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.860829 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-model-cache\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.860880 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.860847 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.862746 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.862716 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-dshm\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.862846 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.862803 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/27739d4c-9802-413b-9ccc-4bf41af3df60-tls-certs\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.871436 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.871389 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pg25\" (UniqueName: \"kubernetes.io/projected/27739d4c-9802-413b-9ccc-4bf41af3df60-kube-api-access-8pg25\") pod \"scheduler-inline-config-test-kserve-577b779467-mmwx7\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:06.969471 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:06.969418 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:07.298666 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:07.298641 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7"] Apr 16 20:24:07.300154 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:24:07.300123 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27739d4c_9802_413b_9ccc_4bf41af3df60.slice/crio-002926368dbaa5b4779126d1a9c655473d53936a1b1d61aa3fd378f62f319ecb WatchSource:0}: Error finding container 002926368dbaa5b4779126d1a9c655473d53936a1b1d61aa3fd378f62f319ecb: Status 404 returned error can't find the container with id 002926368dbaa5b4779126d1a9c655473d53936a1b1d61aa3fd378f62f319ecb Apr 16 20:24:07.302131 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:07.302112 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:24:07.486959 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:07.486920 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" event={"ID":"27739d4c-9802-413b-9ccc-4bf41af3df60","Type":"ContainerStarted","Data":"b6b0aeb5d6e676bfa24de9262fcb3e974de94368ff6acb02be855839d245bf79"} Apr 16 20:24:07.486959 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:07.486963 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" event={"ID":"27739d4c-9802-413b-9ccc-4bf41af3df60","Type":"ContainerStarted","Data":"002926368dbaa5b4779126d1a9c655473d53936a1b1d61aa3fd378f62f319ecb"} Apr 16 20:24:11.504779 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:11.504740 2569 generic.go:358] "Generic (PLEG): container finished" podID="27739d4c-9802-413b-9ccc-4bf41af3df60" 
containerID="b6b0aeb5d6e676bfa24de9262fcb3e974de94368ff6acb02be855839d245bf79" exitCode=0 Apr 16 20:24:11.505152 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:11.504815 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" event={"ID":"27739d4c-9802-413b-9ccc-4bf41af3df60","Type":"ContainerDied","Data":"b6b0aeb5d6e676bfa24de9262fcb3e974de94368ff6acb02be855839d245bf79"} Apr 16 20:24:12.510084 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:12.510048 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" event={"ID":"27739d4c-9802-413b-9ccc-4bf41af3df60","Type":"ContainerStarted","Data":"b047fc09498cbcd77d8d65db90ca5ed70a3b50b42e8bf60503472fbb65c7b765"} Apr 16 20:24:12.528487 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:12.528432 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" podStartSLOduration=6.52841497 podStartE2EDuration="6.52841497s" podCreationTimestamp="2026-04-16 20:24:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:24:12.527376237 +0000 UTC m=+1816.728086316" watchObservedRunningTime="2026-04-16 20:24:12.52841497 +0000 UTC m=+1816.729125061" Apr 16 20:24:16.969939 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:16.969904 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:16.969939 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:16.969946 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:16.982698 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:16.982676 
2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:17.541192 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:17.541164 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:22.468510 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:22.468471 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg"] Apr 16 20:24:22.468972 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:22.468815 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" podUID="15787bc1-6b93-4752-84df-0e2ac94166f2" containerName="main" containerID="cri-o://60b11329a7086c16e353b7f3ad6309b4464787e18995201a9df426ba325d44f4" gracePeriod=30 Apr 16 20:24:22.468972 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:22.468892 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" podUID="15787bc1-6b93-4752-84df-0e2ac94166f2" containerName="tokenizer" containerID="cri-o://552a5ba4af0b36232232158005b80a6b274e2639670f17666bff9b4a19387702" gracePeriod=30 Apr 16 20:24:23.559222 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.559182 2569 generic.go:358] "Generic (PLEG): container finished" podID="15787bc1-6b93-4752-84df-0e2ac94166f2" containerID="552a5ba4af0b36232232158005b80a6b274e2639670f17666bff9b4a19387702" exitCode=0 Apr 16 20:24:23.559222 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.559215 2569 generic.go:358] "Generic (PLEG): container finished" podID="15787bc1-6b93-4752-84df-0e2ac94166f2" containerID="60b11329a7086c16e353b7f3ad6309b4464787e18995201a9df426ba325d44f4" exitCode=0 
Apr 16 20:24:23.559715 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.559267 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" event={"ID":"15787bc1-6b93-4752-84df-0e2ac94166f2","Type":"ContainerDied","Data":"552a5ba4af0b36232232158005b80a6b274e2639670f17666bff9b4a19387702"} Apr 16 20:24:23.559715 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.559311 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" event={"ID":"15787bc1-6b93-4752-84df-0e2ac94166f2","Type":"ContainerDied","Data":"60b11329a7086c16e353b7f3ad6309b4464787e18995201a9df426ba325d44f4"} Apr 16 20:24:23.681005 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.680983 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:24:23.705268 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.705232 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15787bc1-6b93-4752-84df-0e2ac94166f2-tls-certs\") pod \"15787bc1-6b93-4752-84df-0e2ac94166f2\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " Apr 16 20:24:23.705431 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.705285 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67lmn\" (UniqueName: \"kubernetes.io/projected/15787bc1-6b93-4752-84df-0e2ac94166f2-kube-api-access-67lmn\") pod \"15787bc1-6b93-4752-84df-0e2ac94166f2\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " Apr 16 20:24:23.705431 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.705316 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-kserve-provision-location\") pod \"15787bc1-6b93-4752-84df-0e2ac94166f2\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " Apr 16 20:24:23.705431 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.705358 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-tmp\") pod \"15787bc1-6b93-4752-84df-0e2ac94166f2\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " Apr 16 20:24:23.705608 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.705431 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-cache\") pod \"15787bc1-6b93-4752-84df-0e2ac94166f2\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " Apr 16 20:24:23.705608 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.705460 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-uds\") pod \"15787bc1-6b93-4752-84df-0e2ac94166f2\" (UID: \"15787bc1-6b93-4752-84df-0e2ac94166f2\") " Apr 16 20:24:23.705875 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.705842 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "15787bc1-6b93-4752-84df-0e2ac94166f2" (UID: "15787bc1-6b93-4752-84df-0e2ac94166f2"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:23.705999 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.705871 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "15787bc1-6b93-4752-84df-0e2ac94166f2" (UID: "15787bc1-6b93-4752-84df-0e2ac94166f2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:23.705999 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.705938 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "15787bc1-6b93-4752-84df-0e2ac94166f2" (UID: "15787bc1-6b93-4752-84df-0e2ac94166f2"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:23.706352 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.706323 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "15787bc1-6b93-4752-84df-0e2ac94166f2" (UID: "15787bc1-6b93-4752-84df-0e2ac94166f2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:23.707467 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.707441 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15787bc1-6b93-4752-84df-0e2ac94166f2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "15787bc1-6b93-4752-84df-0e2ac94166f2" (UID: "15787bc1-6b93-4752-84df-0e2ac94166f2"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:24:23.707559 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.707472 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15787bc1-6b93-4752-84df-0e2ac94166f2-kube-api-access-67lmn" (OuterVolumeSpecName: "kube-api-access-67lmn") pod "15787bc1-6b93-4752-84df-0e2ac94166f2" (UID: "15787bc1-6b93-4752-84df-0e2ac94166f2"). InnerVolumeSpecName "kube-api-access-67lmn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:24:23.806652 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.806619 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:24:23.806652 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.806647 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-uds\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:24:23.806652 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.806657 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15787bc1-6b93-4752-84df-0e2ac94166f2-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:24:23.806900 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.806666 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-67lmn\" (UniqueName: \"kubernetes.io/projected/15787bc1-6b93-4752-84df-0e2ac94166f2-kube-api-access-67lmn\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:24:23.806900 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.806676 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:24:23.806900 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:23.806685 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/15787bc1-6b93-4752-84df-0e2ac94166f2-tokenizer-tmp\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:24:24.565192 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:24.565166 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" Apr 16 20:24:24.565624 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:24.565165 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg" event={"ID":"15787bc1-6b93-4752-84df-0e2ac94166f2","Type":"ContainerDied","Data":"4c003302d186d194a423160e36c8cd5ee0d3a237e62fc8de9f99af35c334c602"} Apr 16 20:24:24.565624 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:24.565287 2569 scope.go:117] "RemoveContainer" containerID="552a5ba4af0b36232232158005b80a6b274e2639670f17666bff9b4a19387702" Apr 16 20:24:24.573983 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:24.573966 2569 scope.go:117] "RemoveContainer" containerID="60b11329a7086c16e353b7f3ad6309b4464787e18995201a9df426ba325d44f4" Apr 16 20:24:24.581594 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:24.581556 2569 scope.go:117] "RemoveContainer" containerID="c35d002f9a50ca62e20caa1366d0c90ff45363b032c85e06ec0307a58cce6c79" Apr 16 20:24:24.585129 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:24.585088 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg"] Apr 16 20:24:24.588409 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:24.588389 2569 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-77859c4d66ksqg"] Apr 16 20:24:26.372681 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:26.372650 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15787bc1-6b93-4752-84df-0e2ac94166f2" path="/var/lib/kubelet/pods/15787bc1-6b93-4752-84df-0e2ac94166f2/volumes" Apr 16 20:24:48.733375 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:48.733344 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7"] Apr 16 20:24:48.733873 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:48.733613 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" podUID="27739d4c-9802-413b-9ccc-4bf41af3df60" containerName="main" containerID="cri-o://b047fc09498cbcd77d8d65db90ca5ed70a3b50b42e8bf60503472fbb65c7b765" gracePeriod=30 Apr 16 20:24:48.979965 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:48.979943 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:49.126060 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.125957 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-model-cache\") pod \"27739d4c-9802-413b-9ccc-4bf41af3df60\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " Apr 16 20:24:49.126060 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.125995 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-dshm\") pod \"27739d4c-9802-413b-9ccc-4bf41af3df60\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " Apr 16 20:24:49.126060 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.126053 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/27739d4c-9802-413b-9ccc-4bf41af3df60-tls-certs\") pod \"27739d4c-9802-413b-9ccc-4bf41af3df60\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " Apr 16 20:24:49.126346 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.126073 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pg25\" (UniqueName: \"kubernetes.io/projected/27739d4c-9802-413b-9ccc-4bf41af3df60-kube-api-access-8pg25\") pod \"27739d4c-9802-413b-9ccc-4bf41af3df60\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " Apr 16 20:24:49.126346 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.126109 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-home\") pod \"27739d4c-9802-413b-9ccc-4bf41af3df60\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " Apr 16 20:24:49.126346 ip-10-0-138-142 kubenswrapper[2569]: I0416 
20:24:49.126139 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-kserve-provision-location\") pod \"27739d4c-9802-413b-9ccc-4bf41af3df60\" (UID: \"27739d4c-9802-413b-9ccc-4bf41af3df60\") " Apr 16 20:24:49.126346 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.126273 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-model-cache" (OuterVolumeSpecName: "model-cache") pod "27739d4c-9802-413b-9ccc-4bf41af3df60" (UID: "27739d4c-9802-413b-9ccc-4bf41af3df60"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:49.126533 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.126445 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-home" (OuterVolumeSpecName: "home") pod "27739d4c-9802-413b-9ccc-4bf41af3df60" (UID: "27739d4c-9802-413b-9ccc-4bf41af3df60"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:49.126533 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.126470 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-model-cache\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:24:49.128202 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.128177 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27739d4c-9802-413b-9ccc-4bf41af3df60-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "27739d4c-9802-413b-9ccc-4bf41af3df60" (UID: "27739d4c-9802-413b-9ccc-4bf41af3df60"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:24:49.128311 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.128274 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27739d4c-9802-413b-9ccc-4bf41af3df60-kube-api-access-8pg25" (OuterVolumeSpecName: "kube-api-access-8pg25") pod "27739d4c-9802-413b-9ccc-4bf41af3df60" (UID: "27739d4c-9802-413b-9ccc-4bf41af3df60"). InnerVolumeSpecName "kube-api-access-8pg25". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:24:49.128311 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.128284 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-dshm" (OuterVolumeSpecName: "dshm") pod "27739d4c-9802-413b-9ccc-4bf41af3df60" (UID: "27739d4c-9802-413b-9ccc-4bf41af3df60"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:49.182176 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.182133 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "27739d4c-9802-413b-9ccc-4bf41af3df60" (UID: "27739d4c-9802-413b-9ccc-4bf41af3df60"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:49.227285 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.227245 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-dshm\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:24:49.227285 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.227283 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/27739d4c-9802-413b-9ccc-4bf41af3df60-tls-certs\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:24:49.227486 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.227297 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8pg25\" (UniqueName: \"kubernetes.io/projected/27739d4c-9802-413b-9ccc-4bf41af3df60-kube-api-access-8pg25\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:24:49.227486 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.227306 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-home\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:24:49.227486 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.227316 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27739d4c-9802-413b-9ccc-4bf41af3df60-kserve-provision-location\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:24:49.663300 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.663258 2569 generic.go:358] "Generic (PLEG): container finished" podID="27739d4c-9802-413b-9ccc-4bf41af3df60" containerID="b047fc09498cbcd77d8d65db90ca5ed70a3b50b42e8bf60503472fbb65c7b765" exitCode=0 Apr 16 20:24:49.663465 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.663335 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" event={"ID":"27739d4c-9802-413b-9ccc-4bf41af3df60","Type":"ContainerDied","Data":"b047fc09498cbcd77d8d65db90ca5ed70a3b50b42e8bf60503472fbb65c7b765"} Apr 16 20:24:49.663465 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.663362 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" Apr 16 20:24:49.663465 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.663374 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7" event={"ID":"27739d4c-9802-413b-9ccc-4bf41af3df60","Type":"ContainerDied","Data":"002926368dbaa5b4779126d1a9c655473d53936a1b1d61aa3fd378f62f319ecb"} Apr 16 20:24:49.663465 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.663390 2569 scope.go:117] "RemoveContainer" containerID="b047fc09498cbcd77d8d65db90ca5ed70a3b50b42e8bf60503472fbb65c7b765" Apr 16 20:24:49.674169 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.674153 2569 scope.go:117] "RemoveContainer" containerID="b6b0aeb5d6e676bfa24de9262fcb3e974de94368ff6acb02be855839d245bf79" Apr 16 20:24:49.689994 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.689971 2569 scope.go:117] "RemoveContainer" containerID="b047fc09498cbcd77d8d65db90ca5ed70a3b50b42e8bf60503472fbb65c7b765" Apr 16 20:24:49.690318 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:24:49.690292 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b047fc09498cbcd77d8d65db90ca5ed70a3b50b42e8bf60503472fbb65c7b765\": container with ID starting with b047fc09498cbcd77d8d65db90ca5ed70a3b50b42e8bf60503472fbb65c7b765 not found: ID does not exist" containerID="b047fc09498cbcd77d8d65db90ca5ed70a3b50b42e8bf60503472fbb65c7b765" Apr 16 20:24:49.690424 ip-10-0-138-142 kubenswrapper[2569]: I0416 
20:24:49.690330 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b047fc09498cbcd77d8d65db90ca5ed70a3b50b42e8bf60503472fbb65c7b765"} err="failed to get container status \"b047fc09498cbcd77d8d65db90ca5ed70a3b50b42e8bf60503472fbb65c7b765\": rpc error: code = NotFound desc = could not find container \"b047fc09498cbcd77d8d65db90ca5ed70a3b50b42e8bf60503472fbb65c7b765\": container with ID starting with b047fc09498cbcd77d8d65db90ca5ed70a3b50b42e8bf60503472fbb65c7b765 not found: ID does not exist" Apr 16 20:24:49.690424 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.690356 2569 scope.go:117] "RemoveContainer" containerID="b6b0aeb5d6e676bfa24de9262fcb3e974de94368ff6acb02be855839d245bf79" Apr 16 20:24:49.690649 ip-10-0-138-142 kubenswrapper[2569]: E0416 20:24:49.690631 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6b0aeb5d6e676bfa24de9262fcb3e974de94368ff6acb02be855839d245bf79\": container with ID starting with b6b0aeb5d6e676bfa24de9262fcb3e974de94368ff6acb02be855839d245bf79 not found: ID does not exist" containerID="b6b0aeb5d6e676bfa24de9262fcb3e974de94368ff6acb02be855839d245bf79" Apr 16 20:24:49.690723 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.690655 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b0aeb5d6e676bfa24de9262fcb3e974de94368ff6acb02be855839d245bf79"} err="failed to get container status \"b6b0aeb5d6e676bfa24de9262fcb3e974de94368ff6acb02be855839d245bf79\": rpc error: code = NotFound desc = could not find container \"b6b0aeb5d6e676bfa24de9262fcb3e974de94368ff6acb02be855839d245bf79\": container with ID starting with b6b0aeb5d6e676bfa24de9262fcb3e974de94368ff6acb02be855839d245bf79 not found: ID does not exist" Apr 16 20:24:49.691999 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.691974 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7"] Apr 16 20:24:49.697593 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:49.697573 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-577b779467-mmwx7"] Apr 16 20:24:50.372232 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:24:50.372196 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27739d4c-9802-413b-9ccc-4bf41af3df60" path="/var/lib/kubelet/pods/27739d4c-9802-413b-9ccc-4bf41af3df60/volumes" Apr 16 20:26:28.645543 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.645511 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-crr8h/must-gather-vpkpd"] Apr 16 20:26:28.646057 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.645858 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15787bc1-6b93-4752-84df-0e2ac94166f2" containerName="tokenizer" Apr 16 20:26:28.646057 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.645869 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="15787bc1-6b93-4752-84df-0e2ac94166f2" containerName="tokenizer" Apr 16 20:26:28.646057 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.645881 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27739d4c-9802-413b-9ccc-4bf41af3df60" containerName="main" Apr 16 20:26:28.646057 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.645887 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="27739d4c-9802-413b-9ccc-4bf41af3df60" containerName="main" Apr 16 20:26:28.646057 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.645896 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15787bc1-6b93-4752-84df-0e2ac94166f2" containerName="storage-initializer" Apr 16 20:26:28.646057 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.645901 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="15787bc1-6b93-4752-84df-0e2ac94166f2" containerName="storage-initializer" Apr 16 20:26:28.646057 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.645911 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27739d4c-9802-413b-9ccc-4bf41af3df60" containerName="storage-initializer" Apr 16 20:26:28.646057 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.645916 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="27739d4c-9802-413b-9ccc-4bf41af3df60" containerName="storage-initializer" Apr 16 20:26:28.646057 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.645924 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15787bc1-6b93-4752-84df-0e2ac94166f2" containerName="main" Apr 16 20:26:28.646057 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.645930 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="15787bc1-6b93-4752-84df-0e2ac94166f2" containerName="main" Apr 16 20:26:28.646057 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.645985 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="15787bc1-6b93-4752-84df-0e2ac94166f2" containerName="main" Apr 16 20:26:28.646057 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.645996 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="15787bc1-6b93-4752-84df-0e2ac94166f2" containerName="tokenizer" Apr 16 20:26:28.646057 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.646002 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="27739d4c-9802-413b-9ccc-4bf41af3df60" containerName="main" Apr 16 20:26:28.649095 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.649075 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-crr8h/must-gather-vpkpd" Apr 16 20:26:28.651376 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.651351 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-crr8h\"/\"openshift-service-ca.crt\"" Apr 16 20:26:28.651509 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.651423 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-crr8h\"/\"default-dockercfg-k9nc8\"" Apr 16 20:26:28.651509 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.651433 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-crr8h\"/\"kube-root-ca.crt\"" Apr 16 20:26:28.656350 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.656054 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-crr8h/must-gather-vpkpd"] Apr 16 20:26:28.767245 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.767212 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71b8ec58-132b-4c37-b994-052b42e7bb35-must-gather-output\") pod \"must-gather-vpkpd\" (UID: \"71b8ec58-132b-4c37-b994-052b42e7bb35\") " pod="openshift-must-gather-crr8h/must-gather-vpkpd" Apr 16 20:26:28.767429 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.767289 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlm7z\" (UniqueName: \"kubernetes.io/projected/71b8ec58-132b-4c37-b994-052b42e7bb35-kube-api-access-mlm7z\") pod \"must-gather-vpkpd\" (UID: \"71b8ec58-132b-4c37-b994-052b42e7bb35\") " pod="openshift-must-gather-crr8h/must-gather-vpkpd" Apr 16 20:26:28.868578 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.868539 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlm7z\" (UniqueName: 
\"kubernetes.io/projected/71b8ec58-132b-4c37-b994-052b42e7bb35-kube-api-access-mlm7z\") pod \"must-gather-vpkpd\" (UID: \"71b8ec58-132b-4c37-b994-052b42e7bb35\") " pod="openshift-must-gather-crr8h/must-gather-vpkpd" Apr 16 20:26:28.868746 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.868641 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71b8ec58-132b-4c37-b994-052b42e7bb35-must-gather-output\") pod \"must-gather-vpkpd\" (UID: \"71b8ec58-132b-4c37-b994-052b42e7bb35\") " pod="openshift-must-gather-crr8h/must-gather-vpkpd" Apr 16 20:26:28.869045 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.869000 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71b8ec58-132b-4c37-b994-052b42e7bb35-must-gather-output\") pod \"must-gather-vpkpd\" (UID: \"71b8ec58-132b-4c37-b994-052b42e7bb35\") " pod="openshift-must-gather-crr8h/must-gather-vpkpd" Apr 16 20:26:28.876288 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.876266 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlm7z\" (UniqueName: \"kubernetes.io/projected/71b8ec58-132b-4c37-b994-052b42e7bb35-kube-api-access-mlm7z\") pod \"must-gather-vpkpd\" (UID: \"71b8ec58-132b-4c37-b994-052b42e7bb35\") " pod="openshift-must-gather-crr8h/must-gather-vpkpd" Apr 16 20:26:28.958561 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:28.958531 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-crr8h/must-gather-vpkpd" Apr 16 20:26:29.085584 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:29.085558 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-crr8h/must-gather-vpkpd"] Apr 16 20:26:29.086637 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:26:29.086605 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71b8ec58_132b_4c37_b994_052b42e7bb35.slice/crio-c7722c41bc97a4e827cf6da86ce8912cbc78d6c3d4b20b946886a1878152f18f WatchSource:0}: Error finding container c7722c41bc97a4e827cf6da86ce8912cbc78d6c3d4b20b946886a1878152f18f: Status 404 returned error can't find the container with id c7722c41bc97a4e827cf6da86ce8912cbc78d6c3d4b20b946886a1878152f18f Apr 16 20:26:30.040225 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:30.040173 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crr8h/must-gather-vpkpd" event={"ID":"71b8ec58-132b-4c37-b994-052b42e7bb35","Type":"ContainerStarted","Data":"c7722c41bc97a4e827cf6da86ce8912cbc78d6c3d4b20b946886a1878152f18f"} Apr 16 20:26:35.065323 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:35.065277 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crr8h/must-gather-vpkpd" event={"ID":"71b8ec58-132b-4c37-b994-052b42e7bb35","Type":"ContainerStarted","Data":"a8867cc3dc19c89876170b18fc2c20c4add2248f87592ff066be01e3ec7b5b70"} Apr 16 20:26:35.065861 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:35.065331 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crr8h/must-gather-vpkpd" event={"ID":"71b8ec58-132b-4c37-b994-052b42e7bb35","Type":"ContainerStarted","Data":"e3ce3b4756bff712f9224847ef5082dd7f40ac617bf00a9aceadb1e9c2ec43ef"} Apr 16 20:26:35.083714 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:35.083664 2569 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-crr8h/must-gather-vpkpd" podStartSLOduration=2.1860657310000002 podStartE2EDuration="7.083650298s" podCreationTimestamp="2026-04-16 20:26:28 +0000 UTC" firstStartedPulling="2026-04-16 20:26:29.088330269 +0000 UTC m=+1953.289040325" lastFinishedPulling="2026-04-16 20:26:33.985914835 +0000 UTC m=+1958.186624892" observedRunningTime="2026-04-16 20:26:35.081957501 +0000 UTC m=+1959.282667580" watchObservedRunningTime="2026-04-16 20:26:35.083650298 +0000 UTC m=+1959.284360448" Apr 16 20:26:43.331648 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:43.331618 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-sxm99_a29a5ff4-0cb8-4250-83fc-e9c21ac602db/istio-proxy/0.log" Apr 16 20:26:44.337180 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:44.337147 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-sxm99_a29a5ff4-0cb8-4250-83fc-e9c21ac602db/istio-proxy/0.log" Apr 16 20:26:45.327240 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:45.327156 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-sxm99_a29a5ff4-0cb8-4250-83fc-e9c21ac602db/istio-proxy/0.log" Apr 16 20:26:46.288345 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:46.288317 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-sxm99_a29a5ff4-0cb8-4250-83fc-e9c21ac602db/istio-proxy/0.log" Apr 16 20:26:47.244054 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:47.244020 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-sxm99_a29a5ff4-0cb8-4250-83fc-e9c21ac602db/istio-proxy/0.log" Apr 16 20:26:48.189061 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:48.189035 2569 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-sxm99_a29a5ff4-0cb8-4250-83fc-e9c21ac602db/istio-proxy/0.log" Apr 16 20:26:49.140566 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:49.140537 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-sxm99_a29a5ff4-0cb8-4250-83fc-e9c21ac602db/istio-proxy/0.log" Apr 16 20:26:50.095923 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:50.095895 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-sxm99_a29a5ff4-0cb8-4250-83fc-e9c21ac602db/istio-proxy/0.log" Apr 16 20:26:51.064810 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:51.064780 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-sxm99_a29a5ff4-0cb8-4250-83fc-e9c21ac602db/istio-proxy/0.log" Apr 16 20:26:52.004685 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:52.004638 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-sxm99_a29a5ff4-0cb8-4250-83fc-e9c21ac602db/istio-proxy/0.log" Apr 16 20:26:52.926636 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:52.926605 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-sxm99_a29a5ff4-0cb8-4250-83fc-e9c21ac602db/istio-proxy/0.log" Apr 16 20:26:53.855285 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:53.855248 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-sxm99_a29a5ff4-0cb8-4250-83fc-e9c21ac602db/istio-proxy/0.log" Apr 16 20:26:54.813870 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:54.813838 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-sxm99_a29a5ff4-0cb8-4250-83fc-e9c21ac602db/istio-proxy/0.log" Apr 16 20:26:55.803814 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:55.803782 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-sxm99_a29a5ff4-0cb8-4250-83fc-e9c21ac602db/istio-proxy/0.log" Apr 16 20:26:58.463250 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:58.463219 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-x9tfj_7367c59c-39ce-4fba-aca1-183d27d4e066/manager/0.log" Apr 16 20:26:58.541859 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:58.541829 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-b4bhk_db90142b-95ef-4184-821c-f3378d1c51cb/limitador/0.log" Apr 16 20:26:58.557556 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:26:58.557511 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-l27sq_91b14a1d-981f-4f41-a141-c3f4f8da74e8/manager/0.log" Apr 16 20:27:00.166956 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:00.166922 2569 generic.go:358] "Generic (PLEG): container finished" podID="71b8ec58-132b-4c37-b994-052b42e7bb35" containerID="e3ce3b4756bff712f9224847ef5082dd7f40ac617bf00a9aceadb1e9c2ec43ef" exitCode=0 Apr 16 20:27:00.167414 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:00.166987 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crr8h/must-gather-vpkpd" event={"ID":"71b8ec58-132b-4c37-b994-052b42e7bb35","Type":"ContainerDied","Data":"e3ce3b4756bff712f9224847ef5082dd7f40ac617bf00a9aceadb1e9c2ec43ef"} Apr 16 20:27:00.167414 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:00.167313 2569 scope.go:117] "RemoveContainer" 
containerID="e3ce3b4756bff712f9224847ef5082dd7f40ac617bf00a9aceadb1e9c2ec43ef" Apr 16 20:27:00.246172 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:00.246138 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-crr8h_must-gather-vpkpd_71b8ec58-132b-4c37-b994-052b42e7bb35/gather/0.log" Apr 16 20:27:00.907002 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:00.906920 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-plvk7/must-gather-j7lb7"] Apr 16 20:27:00.911861 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:00.911840 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-plvk7/must-gather-j7lb7" Apr 16 20:27:00.914497 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:00.914478 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-plvk7\"/\"kube-root-ca.crt\"" Apr 16 20:27:00.915467 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:00.915436 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-plvk7\"/\"openshift-service-ca.crt\"" Apr 16 20:27:00.915714 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:00.915694 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-plvk7\"/\"default-dockercfg-57lmf\"" Apr 16 20:27:00.918685 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:00.918661 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-plvk7/must-gather-j7lb7"] Apr 16 20:27:00.962083 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:00.962057 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2fbd9f88-6e15-4a26-ba69-ef6010b31402-must-gather-output\") pod \"must-gather-j7lb7\" (UID: \"2fbd9f88-6e15-4a26-ba69-ef6010b31402\") " 
pod="openshift-must-gather-plvk7/must-gather-j7lb7" Apr 16 20:27:00.962249 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:00.962114 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcmc2\" (UniqueName: \"kubernetes.io/projected/2fbd9f88-6e15-4a26-ba69-ef6010b31402-kube-api-access-xcmc2\") pod \"must-gather-j7lb7\" (UID: \"2fbd9f88-6e15-4a26-ba69-ef6010b31402\") " pod="openshift-must-gather-plvk7/must-gather-j7lb7" Apr 16 20:27:01.063036 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:01.062974 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2fbd9f88-6e15-4a26-ba69-ef6010b31402-must-gather-output\") pod \"must-gather-j7lb7\" (UID: \"2fbd9f88-6e15-4a26-ba69-ef6010b31402\") " pod="openshift-must-gather-plvk7/must-gather-j7lb7" Apr 16 20:27:01.063216 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:01.063049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcmc2\" (UniqueName: \"kubernetes.io/projected/2fbd9f88-6e15-4a26-ba69-ef6010b31402-kube-api-access-xcmc2\") pod \"must-gather-j7lb7\" (UID: \"2fbd9f88-6e15-4a26-ba69-ef6010b31402\") " pod="openshift-must-gather-plvk7/must-gather-j7lb7" Apr 16 20:27:01.063327 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:01.063307 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2fbd9f88-6e15-4a26-ba69-ef6010b31402-must-gather-output\") pod \"must-gather-j7lb7\" (UID: \"2fbd9f88-6e15-4a26-ba69-ef6010b31402\") " pod="openshift-must-gather-plvk7/must-gather-j7lb7" Apr 16 20:27:01.091790 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:01.091759 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcmc2\" (UniqueName: 
\"kubernetes.io/projected/2fbd9f88-6e15-4a26-ba69-ef6010b31402-kube-api-access-xcmc2\") pod \"must-gather-j7lb7\" (UID: \"2fbd9f88-6e15-4a26-ba69-ef6010b31402\") " pod="openshift-must-gather-plvk7/must-gather-j7lb7" Apr 16 20:27:01.221851 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:01.221767 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-plvk7/must-gather-j7lb7" Apr 16 20:27:01.353331 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:01.353301 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-plvk7/must-gather-j7lb7"] Apr 16 20:27:01.354305 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:27:01.354280 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fbd9f88_6e15_4a26_ba69_ef6010b31402.slice/crio-70a86eb2e675b91e86f61f36ec792a655996b7b14049fdb9da609fb069cfa4bb WatchSource:0}: Error finding container 70a86eb2e675b91e86f61f36ec792a655996b7b14049fdb9da609fb069cfa4bb: Status 404 returned error can't find the container with id 70a86eb2e675b91e86f61f36ec792a655996b7b14049fdb9da609fb069cfa4bb Apr 16 20:27:02.176721 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:02.176679 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-plvk7/must-gather-j7lb7" event={"ID":"2fbd9f88-6e15-4a26-ba69-ef6010b31402","Type":"ContainerStarted","Data":"70a86eb2e675b91e86f61f36ec792a655996b7b14049fdb9da609fb069cfa4bb"} Apr 16 20:27:03.183889 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:03.183854 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-plvk7/must-gather-j7lb7" event={"ID":"2fbd9f88-6e15-4a26-ba69-ef6010b31402","Type":"ContainerStarted","Data":"7fcd461a9ed1cd41c7b2606b9933ec8791f983c2a99edd4b071978671c60046c"} Apr 16 20:27:03.183889 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:03.183896 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-plvk7/must-gather-j7lb7" event={"ID":"2fbd9f88-6e15-4a26-ba69-ef6010b31402","Type":"ContainerStarted","Data":"e94c3f5979aa1d9bc53fc757d25b9777f2af6649eff1b65bea2aebec4bcde24e"} Apr 16 20:27:03.205172 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:03.205110 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-plvk7/must-gather-j7lb7" podStartSLOduration=2.28968074 podStartE2EDuration="3.205090001s" podCreationTimestamp="2026-04-16 20:27:00 +0000 UTC" firstStartedPulling="2026-04-16 20:27:01.356383587 +0000 UTC m=+1985.557093642" lastFinishedPulling="2026-04-16 20:27:02.271792832 +0000 UTC m=+1986.472502903" observedRunningTime="2026-04-16 20:27:03.203352723 +0000 UTC m=+1987.404062802" watchObservedRunningTime="2026-04-16 20:27:03.205090001 +0000 UTC m=+1987.405800080" Apr 16 20:27:03.856118 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:03.856086 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-zfltp_8513c710-eca6-4743-9c4a-9f4603f59a26/global-pull-secret-syncer/0.log" Apr 16 20:27:03.956517 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:03.956489 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-lb4zl_267f5c95-39db-40d0-a78a-839da0347dfc/konnectivity-agent/0.log" Apr 16 20:27:04.023881 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:04.023853 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-142.ec2.internal_027943e939a2d76cdb600f777d89968b/haproxy/0.log" Apr 16 20:27:05.830780 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:05.830743 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-crr8h/must-gather-vpkpd"] Apr 16 20:27:05.831672 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:05.831636 2569 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-must-gather-crr8h/must-gather-vpkpd" podUID="71b8ec58-132b-4c37-b994-052b42e7bb35" containerName="copy" containerID="cri-o://a8867cc3dc19c89876170b18fc2c20c4add2248f87592ff066be01e3ec7b5b70" gracePeriod=2 Apr 16 20:27:05.834230 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:05.834202 2569 status_manager.go:895] "Failed to get status for pod" podUID="71b8ec58-132b-4c37-b994-052b42e7bb35" pod="openshift-must-gather-crr8h/must-gather-vpkpd" err="pods \"must-gather-vpkpd\" is forbidden: User \"system:node:ip-10-0-138-142.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-crr8h\": no relationship found between node 'ip-10-0-138-142.ec2.internal' and this object" Apr 16 20:27:05.834588 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:05.834562 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-crr8h/must-gather-vpkpd"] Apr 16 20:27:06.208328 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:06.204414 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-crr8h_must-gather-vpkpd_71b8ec58-132b-4c37-b994-052b42e7bb35/copy/0.log" Apr 16 20:27:06.208328 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:06.204804 2569 generic.go:358] "Generic (PLEG): container finished" podID="71b8ec58-132b-4c37-b994-052b42e7bb35" containerID="a8867cc3dc19c89876170b18fc2c20c4add2248f87592ff066be01e3ec7b5b70" exitCode=143 Apr 16 20:27:06.234869 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:06.234623 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-crr8h_must-gather-vpkpd_71b8ec58-132b-4c37-b994-052b42e7bb35/copy/0.log" Apr 16 20:27:06.235561 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:06.235234 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-crr8h/must-gather-vpkpd" Apr 16 20:27:06.237649 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:06.237608 2569 status_manager.go:895] "Failed to get status for pod" podUID="71b8ec58-132b-4c37-b994-052b42e7bb35" pod="openshift-must-gather-crr8h/must-gather-vpkpd" err="pods \"must-gather-vpkpd\" is forbidden: User \"system:node:ip-10-0-138-142.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-crr8h\": no relationship found between node 'ip-10-0-138-142.ec2.internal' and this object" Apr 16 20:27:06.324729 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:06.324030 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71b8ec58-132b-4c37-b994-052b42e7bb35-must-gather-output\") pod \"71b8ec58-132b-4c37-b994-052b42e7bb35\" (UID: \"71b8ec58-132b-4c37-b994-052b42e7bb35\") " Apr 16 20:27:06.324729 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:06.324188 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlm7z\" (UniqueName: \"kubernetes.io/projected/71b8ec58-132b-4c37-b994-052b42e7bb35-kube-api-access-mlm7z\") pod \"71b8ec58-132b-4c37-b994-052b42e7bb35\" (UID: \"71b8ec58-132b-4c37-b994-052b42e7bb35\") " Apr 16 20:27:06.331581 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:06.331394 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b8ec58-132b-4c37-b994-052b42e7bb35-kube-api-access-mlm7z" (OuterVolumeSpecName: "kube-api-access-mlm7z") pod "71b8ec58-132b-4c37-b994-052b42e7bb35" (UID: "71b8ec58-132b-4c37-b994-052b42e7bb35"). InnerVolumeSpecName "kube-api-access-mlm7z". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:27:06.335332 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:06.333735 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71b8ec58-132b-4c37-b994-052b42e7bb35-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "71b8ec58-132b-4c37-b994-052b42e7bb35" (UID: "71b8ec58-132b-4c37-b994-052b42e7bb35"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:27:06.376151 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:06.375990 2569 status_manager.go:895] "Failed to get status for pod" podUID="71b8ec58-132b-4c37-b994-052b42e7bb35" pod="openshift-must-gather-crr8h/must-gather-vpkpd" err="pods \"must-gather-vpkpd\" is forbidden: User \"system:node:ip-10-0-138-142.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-crr8h\": no relationship found between node 'ip-10-0-138-142.ec2.internal' and this object" Apr 16 20:27:06.377928 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:06.377894 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b8ec58-132b-4c37-b994-052b42e7bb35" path="/var/lib/kubelet/pods/71b8ec58-132b-4c37-b994-052b42e7bb35/volumes" Apr 16 20:27:06.425577 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:06.425512 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mlm7z\" (UniqueName: \"kubernetes.io/projected/71b8ec58-132b-4c37-b994-052b42e7bb35-kube-api-access-mlm7z\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:27:06.425577 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:06.425549 2569 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71b8ec58-132b-4c37-b994-052b42e7bb35-must-gather-output\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 16 20:27:07.211361 ip-10-0-138-142 
kubenswrapper[2569]: I0416 20:27:07.211329 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-crr8h_must-gather-vpkpd_71b8ec58-132b-4c37-b994-052b42e7bb35/copy/0.log" Apr 16 20:27:07.211945 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:07.211870 2569 scope.go:117] "RemoveContainer" containerID="a8867cc3dc19c89876170b18fc2c20c4add2248f87592ff066be01e3ec7b5b70" Apr 16 20:27:07.212066 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:07.212047 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-crr8h/must-gather-vpkpd" Apr 16 20:27:07.224022 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:07.223981 2569 scope.go:117] "RemoveContainer" containerID="e3ce3b4756bff712f9224847ef5082dd7f40ac617bf00a9aceadb1e9c2ec43ef" Apr 16 20:27:08.167085 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:08.166997 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-x9tfj_7367c59c-39ce-4fba-aca1-183d27d4e066/manager/0.log" Apr 16 20:27:08.283765 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:08.283588 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-b4bhk_db90142b-95ef-4184-821c-f3378d1c51cb/limitador/0.log" Apr 16 20:27:08.314791 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:08.314698 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-l27sq_91b14a1d-981f-4f41-a141-c3f4f8da74e8/manager/0.log" Apr 16 20:27:09.519848 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:09.519819 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5648b74476-mzrcm_663570b5-b814-4a5e-96a2-36a946159c78/metrics-server/0.log" Apr 16 20:27:09.578142 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:09.578111 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-cx9gk_b518de3d-357a-4cb8-987f-4bea66391003/node-exporter/0.log" Apr 16 20:27:09.603232 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:09.603197 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cx9gk_b518de3d-357a-4cb8-987f-4bea66391003/kube-rbac-proxy/0.log" Apr 16 20:27:09.627690 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:09.627658 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cx9gk_b518de3d-357a-4cb8-987f-4bea66391003/init-textfile/0.log" Apr 16 20:27:10.195917 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:10.195886 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-ddd77cfb9-dq2tw_61d996e3-12d0-435c-8089-44bb6cb2698a/telemeter-client/0.log" Apr 16 20:27:10.221032 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:10.220979 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-ddd77cfb9-dq2tw_61d996e3-12d0-435c-8089-44bb6cb2698a/reload/0.log" Apr 16 20:27:10.247118 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:10.247089 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-ddd77cfb9-dq2tw_61d996e3-12d0-435c-8089-44bb6cb2698a/kube-rbac-proxy/0.log" Apr 16 20:27:12.496007 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.495979 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b67ccf97f-82jcv_90b10d0d-c518-439e-8261-0358666add06/console/0.log" Apr 16 20:27:12.796544 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.796457 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx"] Apr 16 20:27:12.796844 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.796829 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="71b8ec58-132b-4c37-b994-052b42e7bb35" containerName="copy" Apr 16 20:27:12.796844 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.796843 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b8ec58-132b-4c37-b994-052b42e7bb35" containerName="copy" Apr 16 20:27:12.796986 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.796857 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71b8ec58-132b-4c37-b994-052b42e7bb35" containerName="gather" Apr 16 20:27:12.796986 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.796862 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b8ec58-132b-4c37-b994-052b42e7bb35" containerName="gather" Apr 16 20:27:12.796986 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.796913 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="71b8ec58-132b-4c37-b994-052b42e7bb35" containerName="gather" Apr 16 20:27:12.796986 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.796926 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="71b8ec58-132b-4c37-b994-052b42e7bb35" containerName="copy" Apr 16 20:27:12.802867 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.802845 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:12.810310 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.810281 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx"] Apr 16 20:27:12.898138 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.898103 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-proc\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:12.898315 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.898156 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-podres\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:12.898315 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.898296 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-sys\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:12.898432 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.898327 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-lib-modules\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " 
pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:12.898432 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.898369 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zkk8\" (UniqueName: \"kubernetes.io/projected/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-kube-api-access-9zkk8\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:12.998965 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.998933 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-proc\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:12.999155 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.998973 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-podres\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:12.999155 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.999046 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-sys\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:12.999155 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.999083 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-lib-modules\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:12.999155 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.999085 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-proc\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:12.999155 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.999145 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zkk8\" (UniqueName: \"kubernetes.io/projected/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-kube-api-access-9zkk8\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:12.999155 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.999155 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-sys\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:12.999392 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.999185 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-lib-modules\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:12.999392 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:12.999193 
2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-podres\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:13.007232 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:13.007211 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zkk8\" (UniqueName: \"kubernetes.io/projected/dfb00dc3-c708-4239-a8c6-4587fbfc21f2-kube-api-access-9zkk8\") pod \"perf-node-gather-daemonset-zsrpx\" (UID: \"dfb00dc3-c708-4239-a8c6-4587fbfc21f2\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:13.117096 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:13.117007 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:13.251624 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:13.251601 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx"] Apr 16 20:27:13.252720 ip-10-0-138-142 kubenswrapper[2569]: W0416 20:27:13.252689 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddfb00dc3_c708_4239_a8c6_4587fbfc21f2.slice/crio-081367a14a40dd05dda2d4b25442f65cbc4293ee4767aeef4de651ab428da74b WatchSource:0}: Error finding container 081367a14a40dd05dda2d4b25442f65cbc4293ee4767aeef4de651ab428da74b: Status 404 returned error can't find the container with id 081367a14a40dd05dda2d4b25442f65cbc4293ee4767aeef4de651ab428da74b Apr 16 20:27:13.782614 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:13.782583 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vslwr_731a4df4-ce0e-4549-8b0a-37f41083e8a3/dns/0.log" Apr 16 20:27:13.802379 ip-10-0-138-142 
kubenswrapper[2569]: I0416 20:27:13.802353 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vslwr_731a4df4-ce0e-4549-8b0a-37f41083e8a3/kube-rbac-proxy/0.log" Apr 16 20:27:13.847596 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:13.847566 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cz4rh_3ac8b4fc-9197-47f1-9c7f-794db0590349/dns-node-resolver/0.log" Apr 16 20:27:14.247327 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:14.247283 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" event={"ID":"dfb00dc3-c708-4239-a8c6-4587fbfc21f2","Type":"ContainerStarted","Data":"124469806de4d461ca7ad27650bf03c7e382b903b2a3d16e1882ad4636e9b03b"} Apr 16 20:27:14.247521 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:14.247333 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" event={"ID":"dfb00dc3-c708-4239-a8c6-4587fbfc21f2","Type":"ContainerStarted","Data":"081367a14a40dd05dda2d4b25442f65cbc4293ee4767aeef4de651ab428da74b"} Apr 16 20:27:14.247521 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:14.247472 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:14.264340 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:14.264293 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" podStartSLOduration=2.264276512 podStartE2EDuration="2.264276512s" podCreationTimestamp="2026-04-16 20:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:27:14.262947547 +0000 UTC m=+1998.463657625" watchObservedRunningTime="2026-04-16 20:27:14.264276512 +0000 UTC m=+1998.464986590" Apr 16 
20:27:14.338376 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:14.338344 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-59d85686d4-p9r27_fe910aa8-0571-400f-abf4-cac8e9f4db52/registry/0.log" Apr 16 20:27:14.380210 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:14.380189 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-b88gw_fe1490d8-1e7e-4156-9765-d2fec8a38446/node-ca/0.log" Apr 16 20:27:15.680153 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:15.680119 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9662b_e4e2129f-d37a-4277-bfd3-5be4dbf86524/serve-healthcheck-canary/0.log" Apr 16 20:27:16.165122 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:16.165094 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ck74z_36e8e17e-1c2e-409c-b778-fb3de7ae8bb2/kube-rbac-proxy/0.log" Apr 16 20:27:16.186665 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:16.186636 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ck74z_36e8e17e-1c2e-409c-b778-fb3de7ae8bb2/exporter/0.log" Apr 16 20:27:16.209165 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:16.209135 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ck74z_36e8e17e-1c2e-409c-b778-fb3de7ae8bb2/extractor/0.log" Apr 16 20:27:19.522762 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:19.522736 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-659c8cbdc-z5xll_88c60637-16c9-442b-badf-a5f053bc0cdf/manager/0.log" Apr 16 20:27:19.603029 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:19.602993 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-46vnv_f12ecef8-38fa-40e9-8bc9-14c24a45629e/server/0.log" Apr 
16 20:27:19.809453 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:19.809369 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-66fmq_9cdece77-36f4-4852-8e16-243f458ee895/manager/0.log" Apr 16 20:27:19.835635 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:19.835612 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-jd2c5_e074c26e-4516-4e8d-81a2-476b027fe153/s3-init/0.log" Apr 16 20:27:19.868558 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:19.868528 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-j2rp7_071e9ff3-ae9d-4c24-8f40-eb1290ff97ae/seaweedfs/0.log" Apr 16 20:27:20.262964 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:20.262934 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-zsrpx" Apr 16 20:27:24.591451 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:24.591416 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-8gcb2_37649370-90f5-4614-8ee1-1d712cdc28ee/migrator/0.log" Apr 16 20:27:24.611545 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:24.611509 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-8gcb2_37649370-90f5-4614-8ee1-1d712cdc28ee/graceful-termination/0.log" Apr 16 20:27:25.963806 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:25.963778 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ftsjl_0053bd34-312c-4064-8485-b10a2b3b16d7/kube-multus-additional-cni-plugins/0.log" Apr 16 20:27:25.989297 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:25.989270 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ftsjl_0053bd34-312c-4064-8485-b10a2b3b16d7/egress-router-binary-copy/0.log" Apr 16 20:27:26.010236 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:26.010216 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ftsjl_0053bd34-312c-4064-8485-b10a2b3b16d7/cni-plugins/0.log" Apr 16 20:27:26.034000 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:26.033980 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ftsjl_0053bd34-312c-4064-8485-b10a2b3b16d7/bond-cni-plugin/0.log" Apr 16 20:27:26.056155 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:26.056134 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ftsjl_0053bd34-312c-4064-8485-b10a2b3b16d7/routeoverride-cni/0.log" Apr 16 20:27:26.078384 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:26.078362 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ftsjl_0053bd34-312c-4064-8485-b10a2b3b16d7/whereabouts-cni-bincopy/0.log" Apr 16 20:27:26.100418 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:26.100391 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ftsjl_0053bd34-312c-4064-8485-b10a2b3b16d7/whereabouts-cni/0.log" Apr 16 20:27:26.445304 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:26.445268 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-l22xw_3609ef37-ae6e-4910-8c8f-420611d9ef42/kube-multus/0.log" Apr 16 20:27:26.531629 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:26.531589 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bkkfh_ae8d7aa1-7c00-44df-9570-4435defaddc2/network-metrics-daemon/0.log" Apr 16 20:27:26.551070 ip-10-0-138-142 kubenswrapper[2569]: 
I0416 20:27:26.551044 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bkkfh_ae8d7aa1-7c00-44df-9570-4435defaddc2/kube-rbac-proxy/0.log" Apr 16 20:27:27.912133 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:27.912103 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-controller/0.log" Apr 16 20:27:27.928729 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:27.928697 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/0.log" Apr 16 20:27:27.937188 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:27.937158 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovn-acl-logging/1.log" Apr 16 20:27:27.955562 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:27.955535 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/kube-rbac-proxy-node/0.log" Apr 16 20:27:27.975474 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:27.975415 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 20:27:27.994431 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:27.994410 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/northd/0.log" Apr 16 20:27:28.017808 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:28.017778 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/nbdb/0.log" Apr 16 20:27:28.038584 ip-10-0-138-142 kubenswrapper[2569]: 
I0416 20:27:28.038554 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/sbdb/0.log" Apr 16 20:27:28.163957 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:28.163853 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt9lg_05bff05d-2c89-41be-b7ea-03dd408b9294/ovnkube-controller/0.log" Apr 16 20:27:29.258187 ip-10-0-138-142 kubenswrapper[2569]: I0416 20:27:29.258144 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-dk2hk_33559bf7-25ac-4de7-a712-253f87279cbf/network-check-target-container/0.log"